NSA: Validity and Reliability Criteria (2018)
Criteria Used to Evaluate Validity, Reliability and Agreement of Adult Nutrition Screening Tools
Individual Validity Results
As cited in Neelemaat et al. (2011), cut-offs for interpretation of individual validity results [sensitivity (Se), specificity (Sp), and positive and negative predictive value (PPV; NPV)] were as follows:
|Se, Sp, PPV, NPV Value|Classification|
|---|---|
Reference: Neelemaat F, Meijers J, Kruizenga H, van Ballegooijen H, van Bokhorst-de van der Schueren M. Comparison of five malnutrition screening tools in one hospital inpatient sample. J Clin Nurs. 2011 Aug;20(15-16):2144-2152. Epub 2011 Apr 28. PMID: 21535274.
Overall Validity Results
To interpret the degree of each individual validity measure, based on the aggregate of all study results for each tool, the following cut-offs were established:
|Se, Sp, PPV, NPV Cut-Offs|Classification|
|---|---|
|90% to 100%|HIGH|
|80% to 89%|MODERATE|
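Applied programmatically, the overall cut-offs above amount to a simple banding rule. The sketch below is a minimal illustration, not part of the NSA methodology; note that the source table lists only the HIGH and MODERATE bands, so the LOW band for values below 80% is an assumption:

```python
def classify_validity(value_pct: float) -> str:
    """Classify an aggregate Se, Sp, PPV, or NPV value (in percent)
    into an overall validity band per the cut-offs above."""
    if not 0 <= value_pct <= 100:
        raise ValueError("value must be a percentage between 0 and 100")
    if value_pct >= 90:
        return "HIGH"
    if value_pct >= 80:
        return "MODERATE"
    return "LOW"  # assumed band: values below 80% are not listed in the source

# Usage:
classify_validity(92.5)  # HIGH
classify_validity(84.0)  # MODERATE
```

Values falling between the listed bands (e.g., 89.5%) are not addressed by the source; the sketch resolves them by treating 90% as the lower bound of HIGH.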
OVERALL Degree of Validity
The sensitivity, specificity, PPV and NPV classifications were entered into the Validity Algorithm (below) to determine the OVERALL degree of validity of each tool. Sensitivity and NPV carried more weight than specificity and PPV in determining the degree of validity.
RELIABILITY AND AGREEMENT
Individual Kappa Results
Cut-offs for interpretation of individual reliability and agreement results (kappa) were as follows:
|Kappa Value|Level of Agreement|
|---|---|
|0 to 0.20|None|
|0.21 to 0.39|Minimal|
|0.40 to 0.59|Weak|
|0.60 to 0.79|Moderate|
|0.80 to 0.90|Strong|
|Above 0.90|Almost Perfect|
Reference: Adapted from Table 3, Interpretation of Cohen's kappa, in McHugh ML. Interrater reliability: the kappa statistic. Biochem Med (Zagreb). 2012;22(3):276-282. PMID: 23092060.
OVERALL Kappa Results
To interpret the degree of reliability and agreement, based on the aggregate of all study results for each tool, overall kappa cut-offs were established:
|Kappa Cut-Offs|Classification|
|---|---|
|0.80 to 1.0|HIGH|
|0.60 to 0.79|MODERATE|
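The overall kappa cut-offs can be sketched the same way. As with the validity bands, this is an illustrative helper only; the source lists just HIGH and MODERATE, so the LOW band below 0.60 is an assumption:

```python
def classify_kappa(kappa: float) -> str:
    """Classify an aggregate kappa into an overall reliability/agreement band
    per the cut-offs above."""
    if not -1.0 <= kappa <= 1.0:
        raise ValueError("kappa must lie in [-1, 1]")
    if kappa >= 0.80:
        return "HIGH"
    if kappa >= 0.60:
        return "MODERATE"
    return "LOW"  # assumed band: values below 0.60 are not listed in the source

# Usage:
classify_kappa(0.85)  # HIGH
classify_kappa(0.65)  # MODERATE
```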