Title: Inter-rater Reliability Kappa Score for Non-Aligned Time Windows
Version: 0.0.0.9000
Authors@R:
person("Eliot", "Bethke", , "bethke2@illinois.edu", role = c("aut", "cre"),
comment = c(ORCID = "0000-0002-3998-5199"))
Description: Kappa statistics for multiple-rater coding where the unit of analysis is not pre-defined and codes can overlap. Each rater is taken in turn as the reference, and every other rater is compared against that reference for agreement. Options are included to weight individual raters or to weight types of agreement.
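
A minimal sketch of the idea described above, not the package's actual API: each rater is treated in turn as the reference, every other rater is compared against it with Cohen's kappa, and the pairwise values are averaged with optional per-rater weights. The helper names (cohen_kappa, pairwise_kappa) and the example ratings matrix are hypothetical, and codes are assumed to have already been mapped onto a common set of time windows.

    # Cohen's kappa for two raters over the same set of windows.
    cohen_kappa <- function(ref, other) {
      # Observed agreement: proportion of windows where the two raters agree.
      po <- mean(ref == other)
      # Expected agreement under independence, from each rater's marginal code frequencies.
      levels_all <- union(unique(ref), unique(other))
      p_ref   <- table(factor(ref,   levels = levels_all)) / length(ref)
      p_other <- table(factor(other, levels = levels_all)) / length(other)
      pe <- sum(p_ref * p_other)
      (po - pe) / (1 - pe)
    }

    # ratings: one column per rater, one row per (already aligned) time window.
    ratings <- cbind(r1 = c("A", "A", "B", "C", "B"),
                     r2 = c("A", "B", "B", "C", "B"),
                     r3 = c("A", "A", "B", "B", "B"))

    # Treat each rater as the reference, compare every other rater against it,
    # then average the pairwise kappas (optionally weighted by reference rater).
    pairwise_kappa <- function(ratings, weights = rep(1, ncol(ratings))) {
      n <- ncol(ratings)
      kappas <- c()
      w <- c()
      for (ref in seq_len(n)) {
        for (other in setdiff(seq_len(n), ref)) {
          kappas <- c(kappas, cohen_kappa(ratings[, ref], ratings[, other]))
          w <- c(w, weights[ref])
        }
      }
      weighted.mean(kappas, w)
    }

    pairwise_kappa(ratings)

The package itself addresses the harder case where time windows are not aligned across raters; the sketch only illustrates the reference-and-compare averaging step under the simplifying assumption of a shared unit of analysis.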