Morris et al1 found greater cognitive decline in older persons with higher folic acid intake and speculated about its mechanism. An earlier article by the same group, drawing on the same database,2 indicated reduced sensitivity to cognitive change among participants with low baseline cognitive scores and fewer than 12 years of education. The authors should provide a sufficiently clear explanation of the standardized units of the cognitive scores; an explicit formula would suffice. As outlined in Table 1,1 the reference group (the lowest quintile of folic acid intake) had a mean of fewer than 12 years of formal education and a mean baseline cognitive score that was only 20% and 39% of those of the 2 highest quintiles of folic acid intake. It is not surprising that further decline in an already low-scoring, nondemented group would be hard to detect, given the measure's low sensitivity to slight cognitive changes; conversely, the highest quintile groups, with much higher baseline cognitive scores, would be expected to be comparatively far more sensitive to small cognitive changes. We are therefore convinced that the conclusion of Morris and colleagues regarding a putative folic acid risk for enhanced cognitive decline results from a severe bias introduced by the relative insensitivity to cognitive change of their reference group, which led the authors to wrongly attribute the greater decline observed in the 2 highest quintile groups to folic acid intake itself. Findings such as those of Wang et al3 constitute such a strong empirical background in this field that this inconsistency should not be undervalued.
Fridman S. High Folic Acid Intake Is Not a Risk Factor for Cognitive Decline: Misinterpretation of Results. Arch Neurol. 2005;62(11):1786–1787. doi:10.1001/archneur.62.11.1786-a