Curhan GC, Willett WC, Knight EL, Stampfer MJ. Dietary Factors and the Risk of Incident Kidney Stones in Younger Women: Nurses' Health Study II. Arch Intern Med. 2004;164(8):885-891. doi:10.1001/archinte.164.8.885
Copyright 2004 American Medical Association. All Rights Reserved. Applicable FARS/DFARS Restrictions Apply to Government Use.
In older women and men, greater intakes of dietary calcium, potassium, and total fluid reduce the risk of kidney stone formation, while supplemental calcium, sodium, animal protein, and sucrose may increase the risk. Recently, phytate has been suggested to play a role in stone formation. To our knowledge, no prospective data on the relation between dietary factors and the risk of kidney stone formation are available in younger women.
We prospectively examined, during an 8-year period, the association between dietary factors and the risk of incident symptomatic kidney stones among 96 245 female participants in the Nurses' Health Study II; the participants were aged 27 to 44 years and had no history of kidney stones. Self-administered food frequency questionnaires were used to assess diet in 1991 and 1995. The main outcome measure was an incident symptomatic kidney stone. Cox proportional hazards regression models were used to adjust simultaneously for various risk factors.
We documented 1223 incident symptomatic kidney stones during 685 973 person-years of follow-up. After adjusting for relevant risk factors, a higher dietary calcium intake was associated with a reduced risk of kidney stones (P = .007 for trend). The multivariate relative risk among women in the highest quintile of intake of dietary calcium compared with women in the lowest quintile was 0.73 (95% confidence interval, 0.59-0.90). Supplemental calcium intake was not associated with risk of stone formation. Phytate intake was associated with a reduced risk of stone formation. Compared with women in the lowest quintile of phytate intake, the relative risk for those in the highest quintile was 0.63 (95% confidence interval, 0.51-0.78). Other dietary factors showed the following relative risks (95% confidence intervals) among women in the highest quintile of intake compared with those in the lowest quintile: animal protein, 0.84 (0.68-1.04); fluid, 0.68 (0.56-0.83); and sucrose, 1.31 (1.07-1.60). The intakes of sodium, potassium, and magnesium were not independently associated with risk after adjusting for other dietary factors.
A higher intake of dietary calcium decreases the risk of kidney stone formation in younger women, but supplemental calcium is not associated with risk. This study also suggests that some dietary risk factors may differ by age and sex. Finally, dietary phytate may be a new, important, and safe addition to our options for stone prevention.
Dietary factors play an important role in kidney stone formation.1-3 In older women and men, greater intakes of dietary calcium, potassium, alcohol, and total fluid are associated with a reduced risk of stone formation, while supplemental calcium, sodium, animal protein, and sucrose may be associated with an increased risk.1,2 To our knowledge, no prospective information has been published on the relation between dietary factors and the risk of kidney stone formation in younger women.
Recently, dietary phytate has been suggested to play a role in stone formation.4,5 Phytate (myoinositol hexaphosphate) binds tightly to doubly charged cations such as calcium. Binding of calcium in the gastrointestinal tract may increase the absorption of dietary oxalate and thereby increase the risk of calcium oxalate stone formation.6 However, phytate also is a strong inhibitor of calcium oxalate crystal formation in vitro.5 Recent evidence4,7 suggests that ingested dietary phytate is absorbed and excreted in the urine. Thus, a higher intake of dietary phytate could reduce the risk of stone formation.
Because the effects of dietary factors may vary with age, results of studies from older women may not be generalizable to younger women. To examine associations between dietary factors and the risk of incident kidney stone formation among younger women, we conducted an 8-year prospective analysis among 96 245 female participants in the Nurses' Health Study (NHS) II who had no history of kidney stones.
In 1989, 116 671 female registered nurses from 15 states, aged 25 to 42 years, completed and returned the initial questionnaire. These women constitute the NHS II. The cohort is followed up using biennial mailed questionnaires that inquire about lifestyle practices, other exposures of interest, and newly diagnosed disease. The average follow-up for the cohort exceeds 90%.
Dietary information was first collected in 1991; hence, the start time for the present study was 1991. The analysis was limited to those women who had completed at least 1 dietary questionnaire. We excluded women for whom the date of diagnosis of a reported stone could not be confirmed or for whom the diagnosis occurred before 1991. In addition, we excluded women with asymptomatic stones that were detected during the evaluation of another condition.
In 1991 and 1995, participants completed semiquantitative food frequency questionnaires that ascertained the average intake of specified foods and beverages during the past year. Nutrient intakes were determined based on the reported frequency of consumption of each specified unit of food or beverage and from published data on nutrient content of the specified portions.8 Information was also collected on the amount of supplemental calcium (such as calcium carbonate) ingested, either as separate supplements or as part of multivitamin preparations. The reproducibility and validity of the questionnaires completed by women in a similar cohort (NHS I) have been documented,8,9 and a similar questionnaire has been shown to be valid and reproducible in men.10
Nutrient values were adjusted for total caloric (energy) intake by taking the residuals of a linear regression model with total caloric intake as the independent variable and absolute nutrient intake as the dependent variable.8,11 Calorie-adjusted values reflect the nutrient composition of the diet independent of the quantity of food consumed. In addition, adjustment for calories reduces variation introduced by questionnaire responses that underreported or overreported intake, thereby improving the accuracy of nutrient measurements.8,11
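The residual method of energy adjustment described above can be sketched as follows. This is an illustrative implementation, not the study's analysis code, and all intake values in the example are hypothetical:

```python
import numpy as np

def calorie_adjust(nutrient, calories):
    """Residual-method energy adjustment: regress absolute nutrient
    intake on total caloric intake, keep the residuals, and re-center
    them at the intake predicted for the cohort's mean energy intake
    so the adjusted values stay on the original scale (mg/d, g/d, etc.)."""
    slope, intercept = np.polyfit(calories, nutrient, 1)
    residuals = nutrient - (intercept + slope * calories)
    return residuals + (intercept + slope * calories.mean())

# Hypothetical example: daily calcium intake (mg) and energy (kcal)
calories = np.array([1500.0, 1800, 2100, 2400, 2700])
calcium = np.array([600.0, 750, 820, 1000, 1150])
adjusted = calorie_adjust(calcium, calories)
```

Because ordinary least-squares residuals sum to zero, the adjusted values preserve the cohort's mean intake while removing the component of between-person variation explained by total energy.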
Information on age, weight, and height was obtained on the baseline questionnaire, and age and weight were updated every 2 years. Body mass index was calculated as weight in kilograms divided by the square of height in meters. Family history of kidney stones in a parent or sibling was reported on the 1997 questionnaire.
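The body mass index calculation above is a simple formula; a minimal sketch with hypothetical values:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by the square of height in meters."""
    return weight_kg / height_m ** 2

# Hypothetical participant: 65 kg, 1.65 m
print(round(bmi(65, 1.65), 1))  # → 23.9
```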
Participants who reported the diagnosis of a kidney stone in 1991 (when dietary information was first collected) or later were mailed a supplementary questionnaire to confirm the diagnosis and to ascertain the date of occurrence, the type of symptoms, other relevant medical conditions, and, if known, the stone type. A validation study of self-reported diagnosis in similar cohorts (NHS I2 and Health Professionals Follow-up Study1) found that medical records confirmed the self-report in more than 97% of the cases.
Only cases of kidney stones that were diagnosed during the 8 years between the date on which the 1991 questionnaire was returned and May 31, 1999, were considered. After the exclusion of women for whom the date of the kidney stone fell outside the study period or could not be confirmed, 96 245 women with no history of kidney stones remained in the study group.
The study design was prospective, with information on diet collected before the onset of kidney stone symptoms. For each participant, person-months of follow-up were counted from the date on which the 1991 questionnaire was returned until the date a kidney stone was diagnosed, death, or May 31, 1999, whichever occurred first. Information on exposures of interest collected on the 1991 questionnaire was updated using responses to the 1995 questionnaire. We allocated person-months of follow-up according to exposure status at the start of each follow-up period (eg, quintile of dietary phytate intake). The division of the cohort into quintiles of nutrient intake allowed us to examine a wide range of nutrient intakes while maintaining enough participants in the highest and lowest categories. If complete information on diet was missing at the start of a time period, the participant was excluded for that period.
The relative risk (the incidence among women in a particular category of intake divided by the corresponding rate in the comparison group) was used as the measure of association.12 Age-adjusted relative risks were calculated after the participants were stratified according to 5-year age categories. The Mantel extension test was used to evaluate linear trends across categories of intake.13 We used a Cox proportional hazards regression model to adjust simultaneously for several risk factors.14 Variables potentially related to stone formation that were considered in the models were age, body mass index (6 categories), alcohol intake (7 categories), vitamin B6 intake (5 categories), vitamin C intake (5 categories), intake of supplemental calcium (0, 1-100, 101-500, and >500 mg/d), and dietary intakes of calcium, animal protein, potassium, sodium, sucrose, magnesium, phosphorus, phytate, and fluid (quintile groups). We calculated 95% confidence intervals for all relative risks. All P values are 2-tailed.
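The age-stratified relative-risk calculation can be illustrated with a Mantel-Haenszel rate-ratio sketch for person-time data. All counts below are hypothetical, not the study's, and this is a simplification of the Cox models actually used:

```python
def mantel_haenszel_rate_ratio(strata):
    """Summary rate ratio across strata (e.g., 5-year age groups).
    Each stratum is (cases_exposed, persontime_exposed,
                     cases_unexposed, persontime_unexposed)."""
    num = sum(a * pt0 / (pt1 + pt0) for a, pt1, c, pt0 in strata)
    den = sum(c * pt1 / (pt1 + pt0) for a, pt1, c, pt0 in strata)
    return num / den

# Two hypothetical age strata: high vs low intake of some nutrient
strata = [
    (12, 40_000, 30, 60_000),   # younger stratum
    (18, 55_000, 25, 50_000),   # older stratum
]
rr = mantel_haenszel_rate_ratio(strata)
```

A rate ratio below 1 indicates a lower incidence in the exposed (high-intake) group after pooling over the age strata.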
We documented 1223 incident symptomatic kidney stones during 685 973 person-years of follow-up. The frequencies of self-reported characteristics from the supplementary questionnaire are shown in Table 1. Sixty-two (5.1%) of the women reported a systemic condition potentially related to stone formation. A urinary tract infection was reported present at the time of the stone event by 17.5% of the women; however, the stone was believed by the individual to be related to the infection in only 6.6% of the cases. A family history of kidney stones was reported by 36.4% of the women with stones. Pain was reported by 95.2% as the presenting symptom. Of the 439 women who reported information on stone type, 87.5% reported a calcium-containing stone.
The overall incidence of symptomatic kidney stones for the cohort was 178 cases per 100 000 person-years. The incidence was highest for those aged 27 to 34 years, was lower for those aged 35 to 44 years, and then increased again in those 45 years and older (Table 2).
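The overall incidence figure follows directly from the study's totals:

```python
# From the study: 1223 incident stones over 685,973 person-years of follow-up.
cases = 1223
person_years = 685_973
rate_per_100k = cases / person_years * 100_000
print(round(rate_per_100k))  # → 178 cases per 100,000 person-years
```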
Characteristics of participants according to dietary calcium quintiles are presented in Table 3. We used the 1991 dietary data to present representative values for boundaries and medians. For our analyses, the updated dietary values were used for the respective time periods. The mean daily intake of animal protein; sodium; potassium; magnesium; phosphorus; vitamins B6, C, and D; and fluid increased with increasing intake of dietary calcium. The mean daily intake of sucrose and alcohol decreased with increasing intake of dietary calcium. The mean daily intake of supplemental calcium and phytate was similar across the quintiles of dietary calcium intake.
A higher dietary calcium intake was strongly associated with a reduced risk of kidney stones after adjusting for age (P<.001 for trend) (Table 4). The age-adjusted relative risk among women in the highest quintile of dietary calcium intake compared with women in the lowest quintile was 0.54. After adjusting for age, body mass index, and intake of supplemental calcium, animal protein, sodium, potassium, sucrose, phytate, and total fluid, the inverse association with dietary calcium was slightly attenuated but remained highly significant (P = .007 for trend). The multivariate relative risk among women in the highest quintile of intake of dietary calcium compared with women in the lowest quintile was 0.73. The results were essentially unchanged after further adjustment for total vitamin D intake. Similar results were found in a multivariate analysis that excluded the 62 women who reported a systemic disease that predisposes to stone formation and the 81 women who reported a urinary tract infection as the cause of stone formation.
The relation between the intake of supplemental calcium and the risk of kidney stones was examined as well. In contrast to intake of dietary calcium, we found that after adjusting for age and other potential confounders, intake of supplemental calcium was not significantly associated with risk of stone formation (P = .60 for trend) (Table 5). The relative risk among women who consumed 501 mg/d or more of supplemental calcium compared with women who did not take supplements was 1.13.
The multivariate results for other dietary factors are shown in Table 4. Animal protein was marginally associated with decreased risk of stone formation (P = .05 for trend). Phytate and total fluid intakes were significantly related to reduced risk of stone formation (P<.001 for trend). Compared with women in the lowest quintile of intake, the multivariate relative risks of stone formation in the highest quintiles were 0.84 for animal protein, 0.63 for phytate, and 0.68 for total fluid.
Sucrose intake was associated with an increased risk of stone formation (P = .01 for trend) (Table 4). Compared with women in the lowest quintile of sucrose intake, the multivariate relative risk for the highest quintile was 1.31. The intakes of sodium, potassium, magnesium, and phosphorus were not independently associated with risk after adjusting for other dietary factors (data not shown).
These findings support an important influence of dietary factors on the risk of kidney stone formation in younger women. In particular, greater consumption of dietary calcium decreases the risk of incident kidney stones. These results are consistent with the findings previously reported in studies of older women2 and men.1,3 Dietary calcium may act by binding dietary oxalate in the gut, leading to reduced oxalate absorption and urinary oxalate excretion.6,15,16 Alternatively, dairy products may be a source of some other, as yet unidentified, protective factor.
Calcium from supplements was associated with a slight and nonsignificant increase in risk; a small but significant increase in risk was observed in a study2 of older women. In earlier studies,1,2 most individuals took their supplement without food or only with breakfast; thus, the supplemental calcium was not being consumed near the time of consumption of dietary oxalate. This would lead to increased calcium absorption and urinary excretion, and would have little or no impact on the absorption and excretion of oxalate. Thus, the apparent discrepancy between the effects of dietary and supplemental calcium suggests that the timing of ingestion may be important.
One other prospective observational study has examined the association between calcium intake and risk of stone formation. A study17 of 27 001 male Finnish smokers, aged 50 to 69 years, who participated in the Alpha-Tocopherol, Beta-Carotene Cancer Prevention Study found no association with dietary calcium intake. However, the median calcium intake in their referent group was 860 mg/d, which was substantially higher than the intake in the referent group in our studies. In our study, the largest reduction in risk occurred between the first and second quintiles of dietary calcium (≤626 and 627-763 mg/d, respectively). No increase in risk was observed even among men in the highest quartile of calcium intake (median, 1790 mg/d).17
Recently, Borghi and colleagues3 reported the results of a randomized controlled dietary intervention trial. They studied 120 male first-time calcium oxalate kidney stone formers with an elevated urine calcium level who were randomized to either a low-calcium (approximately 400-mg/d) diet or a "normal" calcium (approximately 1200-mg/d), low–animal protein, and low-sodium diet. During the 5-year study, men assigned to the latter diet had a 50% lower rate of first recurrence and also clinically and statistically significant reductions in urinary calcium and oxalate levels. Thus, there seems to be no justification for the recommendation of low-calcium diets for individuals with calcium-containing kidney stones.18,19
In the present study, we observed a marginally significant inverse association between animal protein intake and risk of stone formation. This result conflicts with our a priori hypothesis and with previous findings in women2 and men.1 The ranges of animal protein intake in the present study (≤51 g/d in the lowest quintile and ≥78 g/d in the highest quintile) were quite similar to those in the other studies. Interestingly, the Finnish researchers17 also found a reduced risk for animal protein; however, they did not control for all the potentially relevant dietary confounders. In a dietary intervention study20 of 99 calcium oxalate stone formers, an increased risk of stone formation was observed in the intervention group assigned to a diet that included a reduced animal protein intake. In the study by Borghi et al,3 the independent effect of protein reduction could not be assessed. Previous physiologic studies21,22 have predicted an increased risk with higher animal protein intake, based on changes in urine chemistry results (ie, increased calcium and uric acid levels and decreased citrate level), but these studies have not focused on younger women. Evidently, the role of animal protein merits further study.
We observed a strong inverse association between phytate intake and risk of stone formation; women in the highest quintile of phytate intake had a 36% lower risk. Phytate is the most abundant form of phosphate in plants. Phytate forms insoluble complexes with calcium in the gastrointestinal tract and reduces calcium absorption and urinary calcium excretion, which consequently could reduce the risk of stone formation. However, this same action could result in increased oxalate absorption and urinary oxalate excretion, which would increase the risk. In vitro, phytic acid inhibits heterogeneous nucleation of calcium oxalate crystals, which would reduce the risk of stone formation. In a rat model of ethylene glycol–induced calcium oxalate nephrolithiasis, oral phytic acid reduced the number of calcifications on the papillary tips.5 This suggests that phytate acts by inhibiting crystal formation in the urine. There are limited data on the quantity of phytic acid or its metabolites excreted in the urine. In a case-control study,4 urinary phytate levels were 40% lower in active calcium oxalate stone formers compared with healthy control subjects (P<.05). In healthy individuals, urinary phytate levels decreased by more than 50% after 36 hours of a phytate-free diet (P<.05),4 but they may be normalized with phytate supplements.23 In our female cohort, the most common foods that contributed to phytate intake were cold cereal, dark bread, and beans. Apparently, phytate is absorbed from the diet and excreted in the urine and may, thus, be a modifiable dietary factor that could decrease the likelihood of stone recurrence.
Sucrose intake was associated with an increased risk that is consistent with the findings in an older female cohort.2 A high sucrose intake increases urinary calcium excretion independent of calcium intake.24 The mechanism by which this occurs is unknown.
Sodium intake was not associated with risk of stone formation. These findings are consistent with observations in men,1 but differ from the increased risk found in women.2 It is possible that sodium intake is not important in this age group. Another possibility is that the assessment of sodium intake was not sufficiently accurate to detect an association. The independent effect of sodium could not be determined in the Italian randomized trial.3
Surprisingly, we found no association with potassium, which differs from what was previously observed in men1 and older women.2 Magnesium was also not associated with risk after controlling for other dietary factors, similar to previous findings in men and women but in contrast to findings from the Finnish study.17 However, in the study from Finland, the only other dietary factors that were adjusted for in the multivariate model were fiber and alcohol. Given the high correlation between the intake of magnesium and other dietary factors such as potassium, it is possible that magnesium would not have been significantly associated with risk if these other dietary variables were included in the multivariate model.
The incidence of nephrolithiasis varied by age, and was highest in the youngest age group (27-34 years). When we compared the incidence rates in the present study with those in corresponding age groups from 10 years earlier in the NHS I,2 the incidence of stone formation seems to be increasing among women. This is consistent with national trends.25 Although the specific reasons for this increase are unknown, dietary and other lifestyle factors are probably important contributors.
The information on dietary factors was obtained before the kidney stone was diagnosed; thus, biased recall of dietary intake was avoided. Although there may be other unmeasured lifestyle factors that relate to stone formation, we included in our multivariate models those variables that are believed to be most strongly related to stone formation. Incomplete information on the oxalate content of foods precluded us from performing analyses on the association between dietary oxalate intake and stone formation. We do not have information on 24-hour urine chemistry results from the whole cohort.
These findings are most directly generalizable to women younger than 50 years. The findings for dietary calcium intake are quite consistent with the literature; however, the results for some of the other dietary factors that we studied are quite different from what was observed in older women and men. These results underscore that care should be taken when attempting to generalize results across age and sex groups and that more studies are needed. Because the pathophysiological features of stone formation are believed to remain the same regardless of a history of nephrolithiasis, it is likely that these results apply to young women with a history of kidney stones as well.
In summary, our findings indicate that a higher intake of dietary calcium decreases the risk of kidney stone formation in younger women. The lack of an increased risk with greater intake of calcium and the potential to increase the risk with calcium restriction reinforce that the routine restriction of dietary calcium in patients who have had a kidney stone is no longer justified. Supplemental calcium was not associated with risk in this study; however, the risk of supplements in women who have had a kidney stone should be considered on an individual basis. This study also suggests that some dietary risk factors may differ by age and sex. Finally, dietary phytate may be a new, important, and safe addition to our options for stone prevention.
Corresponding author: Gary C. Curhan, MD, ScD, Channing Laboratory, Brigham and Women's Hospital, 181 Longwood Ave, Boston, MA 02115 (e-mail: Gary.Curhan@channing.harvard.edu).
Accepted for publication May 30, 2003.
This study was supported by grants CA50385 and DK59583 from the National Institutes of Health, Bethesda, Md.