Iron deficiency, with its consequences for anemia, immune function, cognitive development, and physical capacity, is estimated to be one of the most prevalent nutritional problems worldwide.1 Yet iron is the second most abundant metal in the earth’s crust. How can this be? The answer is an intriguing one that has important implications for the understanding of iron biology and may point toward safer ways of administering iron. In brief, the useful oxidoreductive properties of iron have made it the element of choice in many biochemical pathways for both the human host and its legions of pathogens, leading to a highly evolved metabolic competition for this element.2