June 16, 1999

Choosing the Best Strategy to Prevent Childhood Iron Deficiency

Author Affiliation: Division of Hematology, The Children's Hospital of Philadelphia, and the Department of Pediatrics, University of Pennsylvania School of Medicine, Philadelphia.

JAMA. 1999;281(23):2247-2248. doi:10.1001/jama.281.23.2247

Twelve years ago, in the pages of THE JOURNAL, Stockman1 celebrated a series of victorious battles against iron deficiency but warned that the war was far from over. He had good reason for both joy and caution. Following the 1972 introduction of the Special Supplemental Food Program for Women, Infants, and Children (WIC), iron deficiency anemia virtually disappeared in some populations of high-risk children. For example, the prevalence of moderate or severe anemia (hemoglobin levels <98 g/L) among young children in inner-city New Haven decreased from 23% in 1971 to 1% in 1984.2 From 1975 to 1985, the prevalence of anemia among young children enrolled in state public health programs monitored by the Centers for Disease Control Pediatric Nutrition Surveillance System declined from 7.8% to 2.9%.3 Interventions beyond WIC, such as an increased emphasis on breast-feeding, delayed introduction of cow's milk, and improved iron fortification of infant foods, contributed to the better iron status of children in low-income families and produced similar gains among middle-class children. In a private pediatric practice in Minneapolis, the rate of anemia decreased from 6.2% in 1969-1973 to 2.7% in 1982-1986.4 Government agencies, policymakers, physicians, and the food industry could point with pride to the success of their efforts.
