eTable 1. Factor Loadings for Western and Prudent Diet Pattern Components According to Principal Components Analysis
eTable 2. Estimated 20-Year Change in Digit Symbol Substitution (DSS) by Tertile of Diet Pattern
eTable 3. Estimated 20-Year Change in Delayed Word Recall (DWR) by Tertile of Diet Pattern
eTable 4. 21-Year Change in Word Fluency Test (WF) by Tertile of Diet Pattern
eTable 5. Estimated 20-Year Change in Cognitive Function by Tertile of Diet Pattern Using Non-imputed Data
eTable 6. Participant Characteristics by Tertile of Prudent Diet Score
Dearborn-Tomazos JL, Wu A, Steffen LM, et al. Association of Dietary Patterns in Midlife and Cognitive Function in Later Life in US Adults Without Dementia. JAMA Netw Open. 2019;2(12):e1916641. doi:10.1001/jamanetworkopen.2019.16641
What is the association between the Western dietary pattern in adults in midlife and cognitive decline in later life?
In this cohort study of 13 588 adults without dementia at baseline, midlife dietary pattern was not associated with cognitive decline 20 years later.
The Western dietary pattern may not contribute to cognitive decline in later life.
The association of dietary patterns, or the combinations of different foods that people eat, with cognitive change and dementia is unclear.
To examine the association of dietary patterns in midlife with cognitive function in later life in a US population without dementia.
Design, Setting, and Participants
Observational cohort study with analysis of data collected from 1987 to 2017. Analysis was completed in January to February 2019. Community-dwelling black and white men and women from Washington County, Maryland; Forsyth County, North Carolina; Jackson, Mississippi; and suburban Minneapolis, Minnesota, participating in the Atherosclerosis Risk in Communities (ARIC) study were included.
Exposures
Two dietary pattern scores were derived from a 66-item food frequency questionnaire using principal components analysis. A Western, or unhealthy, dietary pattern was characterized by higher consumption of meats and fried foods. A so-called prudent, or healthier, dietary pattern was characterized by higher amounts of fruits and vegetables.
Main Outcomes and Measures
Results of 3 cognitive tests (Digit Symbol Substitution Test, Word Fluency Test, and Delayed Word Recall) performed at 3 points (1990-1992, 1996-1998, and 2011-2013) were standardized and combined to represent global cognitive function. The 20-year change in cognitive function was determined by tertile of diet pattern score using mixed-effect models. The risk of incident dementia was also determined by tertile of the diet pattern score.
A total of 13 588 participants (7588 [55.8%] women) with a mean (SD) age of 54.6 (5.7) years at baseline were included; participants in the top third of Western and prudent diet pattern scores were considered adherent to the respective diet. Cognitive scores at baseline were lower in participants with a Western diet (z score for tertile 3 [T3], −0.17 [95% CI, −0.20 to −0.14] vs T1, 0.17 [95% CI, 0.14-0.20]) and higher in participants with a prudent diet (z score for T3, 0.09 [95% CI, 0.06-0.12] vs T1, −0.09 [95% CI, −0.12 to −0.06]). Estimated 20-year change in global cognitive function did not differ by dietary pattern (difference of change in z score for Western diet, T3 vs T1: −0.01 [95% CI, −0.05 to 0.04]; and difference of change in z score for prudent diet, T3 vs T1: 0.02 [95% CI, −0.02 to 0.06]). The risk of incident dementia did not differ by dietary pattern (Western hazard ratio for T3 vs T1, 1.06 [95% CI, 0.92-1.22]; prudent hazard ratio for T3 vs T1, 0.99 [95% CI, 0.88-1.12]).
Conclusions and Relevance
This study found that the dietary pattern of US adults at midlife was not associated with processing speed, word fluency, memory, or incident dementia in later life.
Healthy dietary patterns may protect against dementia and mild cognitive impairment.1,2 Prior studies demonstrate that healthy dietary patterns are associated with increased brain volumes and reduced atrophy compared with less healthy dietary patterns.3,4 Although the mechanisms linking a healthy diet to improved brain health are not well understood, 2 plausible mechanisms are reduced vascular injury and a reduction in Alzheimer pathology.5 First, a healthy dietary pattern reduces hypertension, dysglycemia, hyperlipidemia, and chronic inflammation, which may reduce brain vascular injury.2,6 Second, a healthy diet may, through reduced oxidative stress, reduce the accumulation of proteins involved in Alzheimer disease.5,7,8
Midlife dietary pattern, compared with dietary pattern in later life, may have a stronger association with cognitive decline and dementia because chronic disease or the concern for chronic disease in later life may motivate individuals to improve their diet,9 making it appear that a healthy diet is associated with poor health outcomes. At least 10 prior studies examined associations of dietary patterns later in life with cognitive decline, but far fewer prospectively investigated associations for midlife dietary patterns.9,10 In this study, we examine the association between midlife dietary patterns and cognitive change and incident dementia over 20 years. We hypothesized that a healthy diet at midlife would be associated with less cognitive decline and a lower risk of dementia.
The Atherosclerosis Risk in Communities (ARIC) study is an observational cohort study that began in 1987, enrolling randomly selected individuals aged 45 to 64 years who were representative of the selected communities. Participants enrolled in the ARIC study were from 4 US communities (Jackson, Mississippi; Forsyth County, North Carolina; Washington County, Maryland; and suburban Minneapolis, Minnesota). A total of 15 792 participants received an initial evaluation, and these participants were re-evaluated in person every 3 years. The study is ongoing, with 6 in-person visits completed to date.11 The current analysis includes information collected at visit 1 (1987-1989), visit 2 (1990-1992), visit 4 (1996-1998), and visit 5 (2011-2013). All participants provided written informed consent. The study was approved by the institutional review boards at all participating institutions. The analysis presented is compliant with the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline.
Participants completed a 66-item food frequency questionnaire at baseline (1987-1989).12 We condensed baseline questionnaire items into 20 food and beverage groups and derived 2 dietary pattern scores using principal components analysis with orthogonal rotation.13 Principal components analysis transforms possibly correlated variables (in this case, each of the 20 food groups) into a set of linearly uncorrelated variables (in this case, the 2 dietary patterns). Two distinct dietary patterns emerged with an eigenvalue greater than 2 (eTable 1 in the Supplement). The Western diet pattern explained 12% of the total variance and included higher consumption of meat, refined grains, and processed and fried foods. The so-called prudent diet pattern explained 10% of the total variance and included higher consumption of fruits and vegetables, fish, chicken, whole grains, dairy, nuts, and alcohol. Study participants received a score for each dietary pattern. The score established how closely they adhered to a Western diet pattern or a prudent diet pattern. Scores ranged from −3.96 to 14.26 (interquartile range [IQR], −1.34 to 0.77) for the Western diet pattern and −3.56 to 10.55 (IQR, −0.98 to 0.79) for the prudent diet pattern. A higher score indicated greater adherence to each particular diet pattern.
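As an illustrative sketch (not the authors' code), the scoring step can be expressed in numpy with simulated data: food-group intakes are standardized, principal components are extracted from their correlation matrix, and each participant's diet score is the projection onto a retained component. The intake matrix here is hypothetical, and the rotation step is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical intake matrix: rows = participants, columns = food groups
# (the actual analysis used 20 food groups from the 66-item questionnaire).
X = rng.normal(size=(500, 20))

# Standardize each food group, then take principal components of the
# correlation matrix; each retained component is a "dietary pattern".
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)      # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The paper retained components with eigenvalue > 2; with simulated noise
# we simply take the first 2 components as the 2 "patterns".
loadings = eigvecs[:, :2]
scores = Z @ loadings                        # one score per pattern per person
explained = eigvals / eigvals.sum()          # share of total variance
```

With real data, the loadings on each component would be inspected (as in eTable 1) to label the patterns, and participants would then be grouped into tertiles of each score.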
Cognitive testing was performed at visit 2 (1990-1992), visit 4 (1996-1998), and visit 5 (2011-2013). Three tests were used in the cognitive battery: the Delayed Word Recall (DWR)14 test, the Digit Symbol Substitution (DSS) test, and the Word Fluency (WF) test.15 A test-specific z score representing cognitive function at visits 4 and 5 was calculated by subtracting the baseline population mean from the participant’s raw score and dividing the difference by the baseline population standard deviation. A global z score representing cognitive function at visits 4 and 5 was created as the mean of the 3 test-specific z scores.
Cognitive change was defined as the difference in test-specific and global z scores at each point using random-effects linear regression models to account for the intra-individual correlation of cognitive scores.16
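The standardization described above reduces to a few lines; this minimal sketch uses invented raw scores and treats the sample itself as the baseline population, which is an assumption for illustration only.

```python
import numpy as np

# Hypothetical raw test scores for one visit: columns = DWR, DSS, WF
raw = np.array([[8.0, 45.0, 34.0],
                [6.0, 38.0, 28.0],
                [9.0, 50.0, 40.0]])

# Baseline population mean and SD per test (here, illustratively, the
# sample's own mean and SD stand in for the visit 2 population values)
base_mean = raw.mean(axis=0)
base_sd = raw.std(axis=0, ddof=1)

# Test-specific z scores: (raw - baseline mean) / baseline SD
z = (raw - base_mean) / base_sd

# Global z score: mean of the 3 test-specific z scores per participant
global_z = z.mean(axis=1)
```

Because visits 4 and 5 are standardized against the visit 2 mean and SD, a participant's z score at a later visit is directly comparable to baseline, and differences across visits estimate cognitive change.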
Dementia was adjudicated according to an established protocol that included assessments involving 3 levels of ascertainment consisting of in-person assessments, telephone interviews of participants or informants, or surveillance based on hospital discharge codes and death certificates.17 Dementia was adjudicated in 2011 to 2013 and 2016 to 2017. Level 1 included in-person assessment of dementia using an algorithm that incorporated information from the Clinical Dementia Rating Interview; the Mini-Mental State examination; longitudinal cognitive testing at visits 2, 4, 5, and 6; a complete neuropsychological battery at visits 5 and 6; and the Functional Activities Questionnaire.18 Level 2 included participants from level 1 and, in addition, 3 other categorizations: (1) participants who met predefined criteria based on completion of the Telephone Interview for Cognitive Status–modified or the Six-Item Screener, (2) deceased persons classified as having dementia, and (3) informant interviews using the AD8 dementia screening interview, as described elsewhere.18 Level 3 included participants in level 2 in addition to individuals with dementia identified by surveillance using prior hospital discharge codes (International Classification of Diseases, Ninth Revision [ICD-9] or International Statistical Classification of Diseases and Related Health Problems, Tenth Revision [ICD-10]) or death certificate codes for dementia.17 Level 3 was used for this analysis. Level 3 ascertained a dementia status (yes or no) for all participants regardless of study visit completion.
Covariates included demographic and lifestyle factors, clinical factors, and apolipoprotein E (APOE) ε4 status. Demographic factors included age, sex, race–study center, and education. Lifestyle factors included activity level, current smoking, current alcohol use, and total energy intake. Clinical factors included body mass index (calculated as weight in kilograms divided by height in meters squared), history of hypertension (yes or no, defined as use of hypertension medication, systolic blood pressure >140 mm Hg, or diastolic blood pressure >90 mm Hg at the baseline visit), diabetes (yes or no, defined as self-reported diabetes diagnosis by physicians, use of diabetes medication, or having fasting glucose level of 126 mg/dL or higher or a nonfasting glucose level of 200 mg/dL or higher at the baseline visit [to convert to millimoles per liter, multiply by 0.0555]), total cholesterol (fasting, mmol/L), history of coronary artery disease, and prevalent stroke through visit 2 defined based on self-report of stroke prior to visit 1 and adjudicated cases between visit 1 and visit 2.
There was a 15-year gap between visit 4 and visit 5, leading to attrition, largely due to death or disability. At visits 4 and 5, respectively, 73.8% and 41.4% of the original cohort remained. This dropout is likely to be informative,19 as we found that diet scores were associated with loss to follow-up (Table 1). To account for population attrition, we imputed the missing cognitive test results at visits 4 and 5 and missing baseline covariates using multiple imputation by chained equations.20 The imputation models incorporated the diet scores, all covariates, prior cognitive function measurements, and ancillary information about cognitive status collected prospectively for participants who did not attend visit 5. The ancillary cognitive information was collected from the Clinical Dementia Rating scale from informants of both living and deceased participants, the Telephone Interview for Cognitive Status for living participants, and hospitalization discharge codes and death certificates (ICD-9 codes). For participants who died before visit 5, cognitive function was imputed at a point 6 months prior to the date of death.21 We conducted the primary analyses with 25 sets of imputations.
In the primary analysis, we evaluated the association of the 2 dietary pattern scores by tertile with cognitive function as measured by the change in global z scores at visits 2, 4, and 5. Mixed-effect models were used to account for the correlation between repeated cognitive test measures over time. We defined a study time metric from visit 2 to the cognitive measurement. We used a linear spline for the time variable with a knot at visit 4 to address potential nonlinearity of cognitive change. We incorporated 2 random slopes, which corresponded to the 2 time-spline terms, and a random intercept, assuming an independent correlation structure. To measure the association between diet scores and cognitive change, we examined the interactions between exposure strata and the time-spline terms. We also included terms for age, sex, education, race–field center, and total energy intake in model 1 and, additionally, for APOE ε4 status, alcohol use history, smoking history, activity level, body mass index, total cholesterol, prevalent coronary heart disease, history of hypertension, diabetes, and stroke in model 2. For covariates that contributed to the slope of cognitive change, we also included interaction terms between the time splines and those covariates. We estimated the mean cognitive change over 20 years by dietary score tertile, using the coefficients of the 2 time-spline terms and their interactions with diet score tertile. A linear trend was tested across the dietary tertiles using the median score of each tertile modeled as a continuous variable.
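The time parameterization above can be made concrete with a small sketch. Assuming (for illustration only) that visit 4 falls about 6 years and visit 5 about 21 years after visit 2, a linear spline with a knot at visit 4 yields two slope terms: one for the overall rate of change and one for the additional change in slope after the knot.

```python
import numpy as np

# Years since visit 2 for the 3 cognitive assessments (illustrative:
# visit 2 = 0, visit 4 ≈ 6, visit 5 ≈ 21 years after visit 2)
t = np.array([0.0, 6.0, 21.0])
knot = 6.0  # knot placed at visit 4

# Linear spline basis: slope before the knot, plus extra slope after it
s1 = t                          # overall time term
s2 = np.maximum(t - knot, 0.0)  # change in slope after visit 4

# Fixed-effect time design for one participant (intercept + 2 spline
# terms); the mixed model adds a random intercept and random slopes on
# s1 and s2, plus diet-tertile-by-spline interactions.
X = np.column_stack([np.ones_like(t), s1, s2])
```

In the fitted model, the coefficients on s1 and s2 (and their interactions with diet tertile) are combined to estimate the 20-year change in the global z score for each tertile.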
We performed 2 secondary analyses. In the first, we further evaluated the association of diet scores with the z score from each of the 3 individual cognitive test results (DSS, DWR, and WF). In the second, we replicated the analyses using nonimputed data. In both analyses, we applied the same methods as in the primary analysis.
We next evaluated the association of the 2 dietary pattern scores with incident dementia using Cox proportional hazard models. We adjusted for the baseline covariates age, sex, education, race–field center, and total energy intake for model 1 and APOE ε4 status, alcohol use history, smoking history, activity level, body mass index, total cholesterol, prevalent coronary heart disease, history of hypertension, diabetes, and stroke for model 2.
The analysis for this study was completed in January to February 2019. Baseline characteristics of participants were compared by tertiles of diet score using χ2 tests or analysis of variance. Two-sided P < .05 was considered statistically significant. Analyses were conducted using Stata statistical software version 14.2 (StataCorp).
A total of 15 792 adults enrolled at study baseline (1987-1989), when they were aged 45 to 64 years. Of these 15 792 participants, 6538 attended the neurocognitive visit from 2011 to 2013. Because of small numbers, and in accordance with usual ARIC practice, we excluded participants who were neither white nor black (48 individuals) and black participants in the Minnesota (22 participants) and Washington County (33 participants) cohorts. We further excluded 121 participants with missing dietary data and 387 participants with implausible caloric intake, defined as less than 500 or more than 3500 total calories per day for women and less than 700 or more than 4500 total calories per day for men. We also excluded 1550 participants with incomplete cognitive data at visit 2, among whom 1348 missed visit 2, and 43 participants who were missing key covariates. The analytic population included 13 588 participants.
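The caloric-plausibility exclusion is a simple sex-specific range check; this sketch assumes a hypothetical record format and applies the thresholds stated above.

```python
# Sketch of the caloric-plausibility exclusion. Records are hypothetical
# (sex, kcal_per_day) pairs; thresholds follow the paper's definition.
def plausible_intake(sex: str, kcal: float) -> bool:
    """Return True if daily caloric intake is within the allowed range."""
    low, high = (500, 3500) if sex == "F" else (700, 4500)
    return low <= kcal <= high

records = [("F", 1600), ("F", 400), ("M", 4800), ("M", 2200)]
kept = [r for r in records if plausible_intake(*r)]
# kept retains only the 2 records with plausible intake
```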
At the baseline visit, participants in our study had a mean (SD) age of 54.6 (5.7) years, 55.8% were women, and 37.2% had at least some college education (Table 1). The average participant was overweight (mean [SD] body mass index, 27.6 [5.3]) and consumed a mean (SD) of 1629 (605) kcal/d. In all, 57.9% of participants did not complete cognitive testing at visit 5, and 28.6% of participants died between study baseline and visit 5.
Adherence to the Western diet pattern was defined as participants reaching the third tertile of the Western diet pattern score. Participants with a Western diet pattern had a higher rate of study attrition (Table 1) and were less likely to be women. Participants with a Western diet pattern were more likely to be from Washington County, Maryland, or Jackson, Mississippi, compared with the other 2 sites, more likely to have less than high school education, more likely to be current smokers, and less likely to engage in physical activity. Participants with a Western diet pattern were also more likely to consume a greater number of calories but were not more likely to have hypertension or diabetes.
Cognitive scores at first measurement were lower in participants with a Western diet pattern compared with participants in the first tertile of Western diet pattern score (z score for tertile 3 [T3], −0.17 [95% CI, −0.20 to −0.14] vs T1, 0.17 [95% CI, 0.14-0.20]) (Table 2). The finding of lower cognitive scores in participants with a Western diet pattern was consistent after adjustments for demographic factors and caloric intake (model 1), but was not statistically significant after full adjustments for lifestyle and clinical factors (model 2) (Table 2). Twenty-year change in cognitive scores was less in participants with a Western diet pattern compared with participants in the first tertile of Western diet pattern score; however, this association did not remain after full adjustments (difference of change in z score for Western diet, T3 vs T1: −0.01 [95% CI, −0.05 to 0.04]) (Table 3). When examined independently, only 20-year change in the DSS z score was less in participants with a Western diet pattern compared with participants in the first tertile of Western diet pattern score (meaning less decline), but this association was not significant after adjustments (eTable 2 in the Supplement). Twenty-year change in the DWR or WF was not different in participants with a Western pattern compared with participants in the first tertile of Western diet pattern score (eTable 3 and eTable 4 in the Supplement).
Our secondary analysis using nonimputed data demonstrated the same findings for the Western diet pattern compared with the imputed data set (eTable 5 in the Supplement).
Participants with a Western diet pattern were no more likely to develop dementia 20 years later compared with participants in the first tertile of Western diet pattern score (adjusted hazard ratio for T3 vs T1, 1.06; 95% CI, 0.92-1.22) (Table 4).
Adherence to a prudent diet pattern was defined as participants reaching the third tertile of the prudent diet pattern score. Participants with a prudent diet pattern had no difference in study attrition (eTable 6 in the Supplement) and were more likely to be women. Participants with a prudent diet pattern were less likely to be from Jackson, Mississippi, and more likely to be from the other 3 study locations. Participants with a prudent diet pattern were more likely to have a college education or greater and were more likely to be never smokers and engage in physical activity. Participants with a prudent diet pattern were also more likely to consume more calories and to have diabetes but not hypertension.
Cognitive scores at first measurement were higher in participants with a prudent diet pattern compared with participants in the first tertile of prudent diet pattern score (Table 2), but this association did not remain after full adjustments. Twenty-year change in cognitive scores was greater in participants with a prudent diet pattern compared with participants in the first tertile of prudent diet pattern score; however, this association did not remain after full adjustments (difference of change in z score for prudent diet T3 vs T1: 0.02 [95% CI, −0.02 to 0.06]) (Table 3). When examined independently, only 20-year change in the DSS was greater in participants with a prudent diet pattern compared with participants in the first tertile of prudent diet pattern score, but this association was not significant after full adjustments (eTable 2 in the Supplement). Twenty-year change in the DWR or WF was not different in participants with a prudent diet pattern compared with participants in the first tertile of prudent diet pattern score (eTable 3 and eTable 4 in the Supplement).
Our secondary analysis using nonimputed data demonstrated the same findings for the prudent pattern compared with the imputed data set (eTable 5 in the Supplement).
Participants with a prudent diet pattern were not more likely to develop dementia 20 years later compared with participants in the first tertile of prudent diet pattern score (adjusted hazard ratio, 0.99; 95% CI, 0.88-1.12 for T3 vs T1) (Table 4).
We did not find an association between dietary patterns and cognitive decline measured over 20 years. A dietary pattern high in meat and fried food intake was associated with lower cognitive test scores at baseline, but differences in demographic characteristics and health behaviors explained this finding. Similarly, a dietary pattern high in fruit and vegetable intake was associated with higher cognitive test scores at baseline, but differences in demographic characteristics and health behaviors explained this finding.
Our results stand in contrast to short-term observational studies. Several observational studies,22-26 ranging in duration from 5 to 7 years, showed modest associations between dietary patterns and cognitive health. One study24 followed 1410 participants over 5 years and found that adherence to a Mediterranean-type dietary pattern was associated with less decline in the Mini-Mental State examination. Another study23 followed more than 2200 participants for 6 years and found that the Western diet was associated with greater cognitive decline and the prudent diet was associated with less cognitive decline as measured by the Mini-Mental State examination.
A recent long-term observational study27 aligns with our results. The Whitehall II study27 measured diet in 1991 to 1993, and dementia surveillance occurred through 2017. The authors found that diet quality at midlife was not associated with incident dementia in long-term follow-up. Our results confirm the findings of this study in a US population.
We suggest 3 explanations for the reported differences between short-term studies and studies with long-term follow-up.27 First, it may be that over time, other chronic diseases such as diabetes have a greater impact on cognition than diet does. Our study only partially accounts for this confounding by adjusting for comorbidities at baseline. Second, participants with an unhealthy diet often engage in multiple unhealthy behaviors (eg, smoking and lack of physical activity). It may be difficult to elucidate the independent outcomes associated with diet when multiple lifestyle behaviors contribute to cognitive function. Third, our study does not account for change in dietary intake or the food supply over 20 years.
Two clinical trials28 build on the promising observational science to examine whether dietary changes can protect against cognitive decline and dementia. One intervention28 tested a Mediterranean diet with olive oil or nuts as supplementation in 334 participants at high cardiovascular risk and found improved composite cognitive function compared with the control diet. A second clinical trial, the Mediterranean-Dietary Approaches to Stop Hypertension (MIND) clinical trial, is currently under way. While our study did not find an association of diet with cognitive decline, this should not undermine the potential of dietary change to affect brain health.
Our study has strengths, one of which is the long duration of follow-up. Another is our ability to account for study dropout due to death or loss to follow-up using criterion-standard imputation methods.20
There are also several limitations of our study. First, our definition of adherence to a Western or prudent diet is based on our tertile cutoffs and may not reflect individual participant identification with the specified dietary pattern. Second, diet was measured 3 years before the first cognitive measurement. The nonconcurrent measurements are unlikely to affect the results because dietary patterns remain relatively stable for up to 7 years.29 However, dietary intake likely changes over 20 years owing to changes in the food supply and food habits; the ARIC study did not capture diet over the 20 years to test this possibility. In addition, because participants with an unhealthy diet had lower cognition at the time of first assessment, it is possible that diet exerted its influence before our time of measurement; because diet was not associated with either cognitive trajectories or incident dementia, this is less likely to be the case. We should also note that although study dropout was accounted for, a large proportion of participants did not follow up after 20 years. Finally, as in all observational studies, we cannot attribute causality to our observations. The mechanisms linking diet and brain health are complex, and the only way to definitively measure the relationship between dietary practices and cognition is an experimental design in which diet is manipulated; such a trial with long-term follow-up, however, would be expensive.
The results of this cohort study do not support the hypothesis that midlife diet significantly contributes to cognitive decline independent of demographic and behavioral factors. Our finding that participants with an unhealthy diet have lower cognitive function could be attributed to cigarette smoking, eating excess calories, or engaging in less physical activity. Our results suggest that it may be important to address all modifiable risk factors in dietary interventions, supporting the emerging body of multimodal lifestyle and behavioral research.30 A multimodal approach may provide greater risk reduction for cognitive aging.
Accepted for Publication: October 13, 2019.
Published: December 4, 2019. doi:10.1001/jamanetworkopen.2019.16641
Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2019 Dearborn-Tomazos JL et al. JAMA Network Open.
Corresponding Author: Jennifer L. Dearborn-Tomazos, MD, Department of Neurology, Beth Israel Deaconess Medical Center, Harvard Medical School, 330 Brookline Ave, Boston, MA 02215 (firstname.lastname@example.org).
Author Contributions: Dr Wu had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Concept and design: Dearborn-Tomazos, Wu, Anderson.
Acquisition, analysis, or interpretation of data: Dearborn-Tomazos, Wu, Steffen, Hu, Knopman, Mosley, Gottesman.
Drafting of the manuscript: Dearborn-Tomazos, Wu.
Critical revision of the manuscript for important intellectual content: Wu, Steffen, Anderson, Hu, Knopman, Mosley, Gottesman.
Statistical analysis: Wu.
Obtained funding: Mosley.
Conflict of Interest Disclosures: Dr Dearborn-Tomazos reported receiving grants from Jason Pharmaceuticals outside the submitted work. Dr Knopman reported receiving personal fees from the DIAN TU study and serving on the study’s data safety monitoring board; receiving grants from Biogen and Eli Lilly; serving as an investigator in clinical trials sponsored by Biogen, Eli Lilly, and the University of Southern California; and receiving research support from the National Institutes of Health (NIH) National Institute on Aging outside the submitted work. Dr Mosley reported receiving grants from the NIH during the conduct of the study; and grants from the NIH outside the submitted work. Dr Gottesman reported serving as an associate editor of the journal Neurology outside the submitted work. No other disclosures were reported.
Funding/Support: The Atherosclerosis Risk in Communities study is funded by the National Heart, Lung, and Blood Institute; the NIH; and the Department of Health and Human Services (contracts HHSN268201700001I, HHSN268201700002I, HHSN268201700003I, HHSN268201700005I, and HHSN268201700004I).
Role of the Funder/Sponsor: The funders had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.
Additional Contributions: We thank the staff and participants of the Atherosclerosis Risk in Communities Study for their important contributions.