Inter-Rater Reliability Essentials: Practical Guide in R


This ebook presents the essentials of inter-rater reliability analyses in R.

Key features:

  • Covers the most common statistical measures for inter-rater reliability analyses, including Cohen's kappa, weighted kappa, Light's kappa, Fleiss' kappa, the intraclass correlation coefficient, and the agreement chart.
  • Key assumptions of each method are presented.
  • Short, self-contained chapters with practical examples.

This is an ebook. Buy now and you will receive a link to download a PDF copy.

Description

This book provides a solid step-by-step practical guide to inter-rater reliability analyses using R software. Inter-rater reliability is a set of statistical measures that quantify the extent of agreement among two or more raters (i.e., "judges" or "observers"). Other synonyms are inter-rater agreement, inter-observer agreement, and inter-rater concordance.

This book is designed to get you doing the analyses as quickly as possible. It focuses on the implementation and understanding of the methods, without having to struggle through pages of mathematical proofs.

You will be guided through each step: a brief explanation of the test formula and its assumptions, performing the analysis in R, and interpreting and reporting the results.
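As a taste of that workflow, here is a minimal sketch of Cohen's kappa for two raters in base R. The rating vectors are invented for illustration; in practice the dedicated packages covered in the book (e.g., `irr::kappa2()`) do this for you:

```r
# Hypothetical ratings of 10 subjects by two raters (illustrative data only)
rater1 <- c("yes", "yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes")
rater2 <- c("yes", "no",  "no", "no", "yes", "no", "yes", "yes", "yes", "yes")

# Observed agreement: proportion of subjects on which the raters agree
po <- mean(rater1 == rater2)

# Chance-expected agreement, from each rater's marginal proportions
categories <- union(rater1, rater2)
pe <- sum(sapply(categories, function(k) mean(rater1 == k) * mean(rater2 == k)))

# Cohen's kappa: agreement corrected for chance
kappa <- (po - pe) / (1 - pe)
round(kappa, 3)  # 0.565

# The same result with the irr package:
# install.packages("irr"); library(irr)
# kappa2(data.frame(rater1, rater2))
```

A kappa of 0 indicates agreement no better than chance, while 1 indicates perfect agreement; how to interpret intermediate values is discussed in the book.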


