Cohen's kappa

statistic measuring inter-rater agreement for categorical items

Cohen's kappa coefficient (κ, lowercase Greek kappa) is a statistic used to measure inter-rater reliability (and also intra-rater reliability) for qualitative (categorical) items. It is generally considered a more robust measure than a simple percent agreement calculation, because κ takes into account the possibility of agreement occurring by chance. There is controversy surrounding Cohen's kappa due to the difficulty of interpreting indices of agreement; some researchers have suggested that it is conceptually simpler to evaluate disagreement between items.
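To make the chance correction concrete: with p_o the observed proportion of agreement and p_e the agreement expected from each rater's marginal label frequencies, the standard definition is κ = (p_o − p_e) / (1 − p_e). The Python sketch below is an illustrative implementation, not part of the original page; the function name and the example rater data are hypothetical.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of items given the same label by both raters.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement: sum over categories of the product of
    # each rater's marginal proportion for that category.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters labelling ten items "yes" or "no".
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "no"]
print(cohens_kappa(a, b))  # 0.4, lower than the 0.7 raw agreement once chance is subtracted

In this example the raters agree on 7 of 10 items (p_o = 0.7), but their marginal label frequencies alone would produce agreement p_e = 0.5 by chance, so κ = (0.7 − 0.5) / (1 − 0.5) = 0.4.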
