Kappa — A Critical Review. Xier, Li, January 2010.
The Kappa coefficient is widely used for assessing categorical agreement between two raters or two methods, and it can be extended to more than two raters (methods). When using Kappa, its shortcomings should not be neglected: bias and prevalence effects give rise to the well-known paradoxes of Kappa. These problems can be addressed by reporting additional indexes alongside Kappa, but the available solutions are not satisfactory. This paper gives a critical survey of the Kappa coefficient, illustrated with a real-life example. A useful alternative statistical approach, the rank-invariant method, is also introduced and applied to analyze the disagreement between two raters.
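As a brief illustration of the prevalence effect mentioned in the abstract, the following sketch (not part of the thesis; the contingency tables are hypothetical) computes Cohen's kappa for two 2x2 tables with identical observed agreement but different prevalence.

```python
def cohen_kappa(table):
    """Cohen's kappa for a square contingency table (list of lists).

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion
    of agreement (the diagonal) and p_e is the agreement expected by
    chance from the marginal totals.
    """
    n = sum(sum(row) for row in table)
    k = len(table)
    p_o = sum(table[i][i] for i in range(k)) / n
    row_marg = [sum(table[i]) / n for i in range(k)]
    col_marg = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    p_e = sum(row_marg[i] * col_marg[i] for i in range(k))
    return (p_o - p_e) / (1 - p_e)

# Both tables show 90% observed agreement between the two raters,
# but the skewed prevalence in the second table drags kappa down.
balanced = [[45, 5], [5, 45]]   # prevalence ~50%  -> kappa = 0.80
skewed   = [[85, 5], [5,  5]]   # prevalence ~90%  -> kappa ~ 0.44

print(round(cohen_kappa(balanced), 2))  # 0.8
print(round(cohen_kappa(skewed), 2))    # 0.44
```

Despite identical observed agreement, kappa differs sharply between the two tables; this is the prevalence paradox the thesis reviews.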