Presented by
Professor Teresa Iacono
La Trobe Rural Health School, La Trobe University
- Ensure measurement is performed in a consistent fashion.
- Intra-rater reliability
  - Degree of agreement among multiple ratings performed by a single rater.
  - Often involves coding and then re-coding the same video.
- Inter-rater reliability (note that in the video this was mistakenly described as ‘intra-rater’ reliability)
  - Degree of agreement across two or more raters.
In the case of Jonathan, we could ask his parents to each code a video of Jonathan during mealtime, and then calculate the extent to which they agree as a percentage (‘agreements’ divided by ‘agreements + disagreements’).
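The short sketch below illustrates that calculation. The coded values are hypothetical (they are not taken from the course material): each list represents one parent's coding of the same ten mealtime intervals, and the percentage agreement is simply agreements divided by agreements plus disagreements, multiplied by 100.

```python
# Hypothetical example: each parent codes the same ten mealtime intervals
# from a video of Jonathan as "accept" or "refuse".
parent_1 = ["accept", "refuse", "accept", "accept", "refuse",
            "accept", "accept", "refuse", "accept", "accept"]
parent_2 = ["accept", "refuse", "accept", "refuse", "refuse",
            "accept", "accept", "refuse", "accept", "refuse"]

# Count the intervals on which the two coders gave the same code.
agreements = sum(a == b for a, b in zip(parent_1, parent_2))
disagreements = len(parent_1) - agreements

# Percentage agreement = agreements / (agreements + disagreements) x 100
percent_agreement = agreements / (agreements + disagreements) * 100
print(f"{agreements} agreements, {disagreements} disagreements "
      f"= {percent_agreement:.0f}% agreement")  # 8, 2 -> 80% for these data
```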
Activity
Think about, and write down, your plan for measuring reliability. The challenge is to devise a way of doing it that adds minimal workload while still allowing you to demonstrate that the data you are recording are reliable. You may like to consider:
- Using your phone or a camera to record, and then review, part of the session (intra-rater reliability)
- Asking a colleague, parent, or student to record data at the same time as you (inter-rater reliability)
In research, it is common to conduct reliability checks on approximately 20% of all behaviour coding.