
Inter-rater reliability

In statistics, inter-rater reliability (also called by various similar names, such as inter-rater agreement, inter-rater concordance, inter-observer reliability, or inter-coder reliability) is the degree of agreement among independent observers who rate, code, or assess the same phenomenon. Assessment tools that rely on ratings must exhibit good inter-rater reliability; otherwise, they are not valid tests. A number of statistics can be used to determine inter-rater reliability, and different statistics are appropriate for different types of measurement. Some options are the joint probability of agreement; chance-corrected coefficients such as Cohen's kappa, Scott's pi, Fleiss' kappa, and Krippendorff's alpha; and correlation-based measures such as the inter-rater correlation, the concordance correlation coefficient, and the intra-class correlation. (Wikipedia).
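
The simplest of these statistics, the joint probability of agreement, is just the fraction of items on which the raters give the same label. A minimal Python sketch, using invented ratings from two hypothetical raters, looks like this:

    # Joint probability of agreement: the fraction of items on which
    # two raters assign the same label (no correction for chance).
    rater_a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
    rater_b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no"]

    agreements = sum(a == b for a, b in zip(rater_a, rater_b))
    percent_agreement = agreements / len(rater_a)
    print(f"Joint probability of agreement: {percent_agreement:.2f}")  # 0.75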

Reliability 1: External reliability and rater reliability and agreement

In this video, I discuss external reliability, inter- and intra-rater reliability, and rater agreement.

From playlist Reliability analysis

The Range & Interquartile Range – Two Simple Measures of Variability (6-2)

We will begin with two simple measures of variability: the range and the interquartile range. The range is the variability between the two most extreme scores in the distribution; it is the difference between the largest and smallest scores. Outliers can affect the range, but not the interquartile range.

From playlist WK6 Measures of Variability - Online Statistics for the Flipped Classroom
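
As a quick numerical companion to the description above, here is a small Python sketch (with made-up scores) that computes both the range and the interquartile range using NumPy; the exact quartile values depend on NumPy's default interpolation method.

    import numpy as np

    # Hypothetical scores; the extreme value 98 is included to show
    # how the range reacts to a single outlier.
    scores = np.array([4, 7, 9, 10, 12, 13, 15, 16, 18, 98])

    data_range = scores.max() - scores.min()      # largest minus smallest
    q1, q3 = np.percentile(scores, [25, 75])      # first and third quartiles
    iqr = q3 - q1                                 # spread of the middle half

    print("Range:", data_range)
    print("IQR:  ", iqr)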

SPSS - Reliability Analysis Example

Lecturer: Katherine Miller, Missouri State University, Fall 2015. This video covers how to run Cronbach's alpha in SPSS for reliability. Lecture materials and assignments are available at statisticsofdoom.com. https://statisticsofdoom.com/page/basic-statistics/

From playlist Intermediate Statistics Videos
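
For readers working outside SPSS, Cronbach's alpha can be computed directly from its definition, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below uses invented item scores and is only an illustration of the formula, not a reproduction of the video's SPSS output.

    import numpy as np

    # Rows = respondents, columns = items on a hypothetical 4-item scale.
    items = np.array([
        [3, 4, 3, 4],
        [2, 2, 3, 2],
        [4, 5, 4, 4],
        [3, 3, 2, 3],
        [5, 4, 5, 5],
    ])

    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the sum score

    alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
    print(f"Cronbach's alpha: {alpha:.3f}")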

Statistics - Compute the interquartile range

This video shows how to compute the interquartile range for a set of data. Remember to reorder the data so that you can find the median values more easily. For more videos visit http://mysecretmathtutor.com

From playlist Statistics

Finding Outliers using Interquartile Range | Statistics, IQR, Quartiles

How do we find outliers of a data set using the interquartile range? This is done using a simple rule: any value less than Q1 - 1.5*IQR is an outlier, and any value greater than Q3 + 1.5*IQR is an outlier. We'll go through the step-by-step process of finding outliers using the IQR in today's video.

From playlist Statistics
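
The 1.5*IQR rule quoted above translates directly into code; the following sketch uses made-up data with one deliberately large value.

    import numpy as np

    # Made-up data containing one unusually large value.
    data = np.array([12, 14, 14, 15, 16, 17, 18, 19, 20, 45])

    q1, q3 = np.percentile(data, [25, 75])
    iqr = q3 - q1
    lower_fence = q1 - 1.5 * iqr
    upper_fence = q3 + 1.5 * iqr

    outliers = data[(data < lower_fence) | (data > upper_fence)]
    print("Fences:", lower_fence, upper_fence)
    print("Outliers:", outliers)   # 45 falls above the upper fence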

Many-Facet Rasch Measurement Using Facets Software

In this video, I demonstrate how to carry out many-facet Rasch measurement using the Facets software. If you have not watched the previous videos on Rasch measurement, they can be found at the links below: 1. https://www.youtube.com/watch?v=FDUYm7ZhXkw&t=49s 2. https://www.youtube.com/watc

From playlist Rasch Measurement

JASP 0.16 Tutorial: Inter-Item Correlation Reliability Module (Episode 40)

In this JASP tutorial, I briefly explore the new options and calculations of the inter-item correlation coefficient (ICC) for reliability analysis! The data in this video can be found in the base JASP Data Library. JASP: https://jasp-stats.org NOTE: This tutorial uses the new preview/beta release.

From playlist JASP Tutorials
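
Outside JASP, an average inter-item correlation can be approximated in a few lines of Python. The sketch below (with fabricated item scores) simply averages the off-diagonal entries of the item-by-item correlation matrix; it is not a reproduction of JASP's Reliability Module output.

    import numpy as np

    # Rows = respondents, columns = items (fabricated scores).
    items = np.array([
        [3, 4, 4, 3],
        [2, 2, 3, 2],
        [4, 5, 4, 5],
        [3, 3, 2, 3],
        [5, 4, 5, 4],
    ])

    corr = np.corrcoef(items, rowvar=False)  # item-by-item correlation matrix
    off_diagonal = corr[~np.eye(corr.shape[0], dtype=bool)]
    print(f"Average inter-item correlation: {off_diagonal.mean():.3f}")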

Reliability 4: Cohen's Kappa and inter-rater agreement

In this video, I discuss Cohen's kappa and inter-rater agreement. I will demonstrate how to compute these in SPSS and Excel and make sense of the output. If you are interested in seeing how Cohen's kappa compares with McNemar's test, please watch the following video: https://youtu.be/va-Tjv

From playlist Cohen's Kappa
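
Cohen's kappa can also be computed by hand from its definition, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance from the raters' marginal frequencies. The Python sketch below uses invented ratings; the scikit-learn cross-check at the end is optional.

    from collections import Counter

    # Invented binary ratings from two raters on the same 10 items.
    rater_a = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "fail"]
    rater_b = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass", "pass", "fail"]

    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement

    # Expected agreement by chance, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)

    kappa = (p_o - p_e) / (1 - p_e)
    print(f"Cohen's kappa: {kappa:.3f}")

    # Optional cross-check (requires scikit-learn):
    # from sklearn.metrics import cohen_kappa_score
    # print(cohen_kappa_score(rater_a, rater_b))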

Writing & Assessment (6)

In this series of videos, I will discuss some of the fundamental principles of writing and writing assessment.

From playlist What is Writing?

Statistics - How to find outliers

This video covers how to find outliers in your data. Remember that an outlier is an extremely high or extremely low value. We consider a value extreme if it lies more than 1.5 times the interquartile range above Q3 or below Q1. For more videos visit http://www.mysecretmathtutor.com

From playlist Statistics

IP Security: Part 1

Fundamental concepts of IPSec are discussed. Authentication Header is explained. ESP & IKE are analyzed.

From playlist Network Security

MAE915_Week 8_Assessing Writing (Practice)_30/09/2021

To support the channel, I would like to invite you to join this channel to get access to perks: https://www.youtube.com/channel/UCfu2GCdjq50W-kL-cv3rcLw/join

From playlist Language Assessment & Technology

Inter-rater reliability analysis using McNemar test and Cohen's Kappa in SPSS

The McNemar test and Cohen's kappa are used to measure the disagreement and agreement between raters. In this video, I demonstrate how to run each of them in SPSS and how their results compare. I also discuss the limitations of the McNemar test.

From playlist Cohen's Kappa
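
As a rough code companion to the SPSS demonstration, the sketch below computes the (uncorrected) McNemar chi-square statistic from the discordant cells of a made-up 2x2 table of paired ratings.

    from scipy.stats import chi2

    # Hypothetical 2x2 table of paired binary ratings:
    #                 rater B: yes   rater B: no
    # rater A: yes        a=30           b=12
    # rater A: no         c=5            d=23
    b, c = 12, 5                        # the discordant cells drive the test

    statistic = (b - c) ** 2 / (b + c)  # McNemar chi-square, no continuity correction
    p_value = chi2.sf(statistic, df=1)
    print(f"McNemar chi2 = {statistic:.3f}, p = {p_value:.4f}")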

JASP 0.17 Tutorial: Intraclass Correlation [ICC] Analysis (Episode 47)

In this JASP video, I show you how to perform an intraclass correlation (ICC) analysis, another reliability analysis that can help when trying to figure out whether judges or measurements are in agreement. It is found under the Reliability Module! JASP: https://jasp-stats.org NOTE: This tutorial uses the new preview/beta release.

From playlist JASP Tutorials
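
For readers who prefer code to a GUI, one common ICC variant, ICC(2,1) (two-way random effects, absolute agreement, single rater), can be computed from the mean squares of a subjects-by-judges ratings table. The sketch below uses invented ratings and shows only this one variant; JASP reports several others.

    import numpy as np

    # Rows = subjects, columns = judges (invented ratings).
    ratings = np.array([
        [9, 2, 5],
        [6, 1, 3],
        [8, 4, 6],
        [7, 1, 2],
        [10, 5, 6],
        [6, 2, 4],
    ])
    n, k = ratings.shape

    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)

    # Mean squares from the two-way ANOVA decomposition.
    ms_rows = k * ((row_means - grand_mean) ** 2).sum() / (n - 1)
    ms_cols = n * ((col_means - grand_mean) ** 2).sum() / (k - 1)
    ss_error = ((ratings - row_means[:, None] - col_means[None, :] + grand_mean) ** 2).sum()
    ms_error = ss_error / ((n - 1) * (k - 1))

    # ICC(2,1): two-way random effects, absolute agreement, single rater.
    icc_2_1 = (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )
    print(f"ICC(2,1): {icc_2_1:.3f}")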

Evaluation 11: interpolated recall-precision plot

Recall-precision graphs are the standard way to compare search algorithms. To construct a standard recall-precision graph, we interpolate precision values and average them over a large set of queries. The standard interpolation strategy is based on the assumption that precision always decreases as recall increases.

From playlist IR13 Evaluating Search Engines
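
The interpolation rule can be stated compactly: interpolated precision at recall level r is the maximum precision observed at any recall of at least r. A minimal NumPy sketch with made-up (recall, precision) points follows.

    import numpy as np

    # Made-up (recall, precision) points for one ranked result list,
    # already sorted by increasing recall.
    recall    = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
    precision = np.array([1.0, 0.5, 0.67, 0.5, 0.56, 0.43, 0.38, 0.36, 0.33, 0.3])

    # Interpolated precision at recall r = max precision at any recall >= r,
    # computed with a reverse running maximum.
    interp = np.maximum.accumulate(precision[::-1])[::-1]

    for r, p in zip(recall, interp):
        print(f"recall {r:.1f}: interpolated precision {p:.2f}")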

Transport Layer Security: Part 1

Fundamental concepts of TLS are discussed. SSL is analyzed. HTTPS & SSH are presented.

From playlist Network Security

JASP 0.17 Tutorial: Rater Agreement Analysis (Episode 46)

In this JASP video, I show you how to perform a rater agreement (inter-rater reliability) analysis in JASP. The analysis is found under the Reliability Module and can be used to quickly determine agreement among a set of judges using Cohen's kappa, Fleiss' kappa, and Krippendorff's alpha. JASP: http

From playlist JASP Tutorials
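
With more than two raters, Fleiss' kappa generalises chance-corrected agreement to an item-by-category count matrix. The sketch below uses invented counts (4 raters classifying 6 items into 3 categories) and follows the standard formula; it is not a reproduction of JASP's output.

    import numpy as np

    # Invented counts: rows = items, columns = categories; each row sums to the
    # number of raters.
    counts = np.array([
        [4, 0, 0],
        [2, 2, 0],
        [0, 3, 1],
        [1, 1, 2],
        [0, 0, 4],
        [3, 1, 0],
    ])
    n_items, _ = counts.shape
    n_raters = counts[0].sum()

    # Per-item agreement: proportion of agreeing rater pairs on that item.
    p_i = np.sum(counts * (counts - 1), axis=1) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()                                # mean observed agreement

    p_j = counts.sum(axis=0) / (n_items * n_raters)   # overall category proportions
    p_e = np.sum(p_j ** 2)                            # expected agreement by chance

    fleiss_kappa = (p_bar - p_e) / (1 - p_e)
    print(f"Fleiss' kappa: {fleiss_kappa:.3f}")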

Jamovi 1.8/2.0 Tutorial: SeolMatrix Add-on Module (Episode 39)

In this Jamovi tutorial, I discuss a new add-on module for Jamovi called SeolMatrix (https://github.com/hyunsooseol/seolmatrix), created by Hyunsoo Seol! This package contains several specific types of correlations, including polychoric, Spearman, and partial correlations. I compare its par

From playlist Jamovi Tutorials

IQR vs Range (interquartile range vs range) | Statistics

We compare the interquartile range of a set of data and the range of a set of data. We'll see how the range is much more sensitive to extreme values and outliers, since it is defined as the max minus the min. The IQR, on the other hand, since it is the range of the middle half of the data, is much less affected by extreme values.

From playlist Statistics

Related pages

Cohen's kappa | Inter-rater reliability | Krippendorff's alpha | Generalizability theory | Kendall rank correlation coefficient | Bland–Altman plot | Cronbach's alpha | Concordance correlation coefficient | Rasch model | Spearman's rank correlation coefficient | Test validity | Standard deviation | Fleiss' kappa