September 6, 1985

Improved Detection of Early Iron Deficiency

JAMA. 1985;254(9):1174. doi:10.1001/jama.1985.03360090064015

To the Editor.—  The article by McClure et al1 presents provocative data regarding the use of red blood cell distribution width as a screening tool for early iron deficiency. In their work, the authors report values for the specificity and sensitivity of their screening test for early iron deficiency.

Specificity is defined as the number of nondiseased individuals with a negative test result divided by the total number of nondiseased individuals. Applying this definition to McClure and co-workers' data, the true ratio is 90 to 115, or 78%. The ratio 48 to 73, which they incorrectly label as the specificity, is actually the positive predictive value of their screening tool. The positive predictive value is the likelihood that disease is present in a patient with a positive test result.

In an article entitled "The Assessment of Diagnostic Tests" by Sheps and Schechter2 published in JAMA, Nov 2, 1984,
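The distinction drawn in the letter can be verified with a short calculation. The 2×2 counts below are inferred from the two ratios quoted (90/115 for specificity, 48/73 for the mislabeled value) and are an assumption for illustration; notably, the two ratios share the same false-positive count (25), so they are mutually consistent.

```python
# Illustrative check of the letter's arithmetic (counts inferred, not from the article):
#   specificity = TN / (TN + FP) = 90/115  ->  TN = 90, FP = 25
#   PPV         = TP / (TP + FP) = 48/73   ->  TP = 48, same FP = 25
TN, FP, TP = 90, 25, 48

specificity = TN / (TN + FP)   # fraction of nondiseased patients with a negative test
ppv = TP / (TP + FP)           # fraction of test-positive patients who are diseased

print(f"specificity = {TN}/{TN + FP} = {specificity:.0%}")  # -> 78%, as the letter states
print(f"PPV         = {TP}/{TP + FP} = {ppv:.0%}")          # -> 66%, the mislabeled ratio
```

The calculation confirms that 48/73 cannot be a specificity: its denominator (73) is the number of positive tests, not the number of nondiseased individuals (115).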