Figure 1. Time from estimated seroconversion date to neuropsychological impairment by CD4 cell count.
Figure 2. Time from estimated seroconversion date to neuropsychological impairment by plasma level of human immunodeficiency virus (HIV) RNA.
Figure 3. Time from estimated seroconversion date to neuropsychological impairment by risk group. The group with high plasma levels of human immunodeficiency virus RNA and low CD4 cell counts was compared with all other risk group combinations. High and low levels are described in the "Procedures" subsection of the "Methods" section.
Figure 4. Time from estimated seroconversion date to neuropsychological impairment by cerebrospinal fluid levels of human immunodeficiency virus (HIV) RNA.
Marcotte TD, Deutsch R, McCutchan JA, Moore DJ, Letendre S, Ellis RJ, Wallace MR, Heaton RK, Grant I. Prediction of Incident Neurocognitive Impairment by Plasma HIV RNA and CD4 Levels Early After HIV Seroconversion. Arch Neurol. 2003;60(10):1406–1412. doi:10.1001/archneur.60.10.1406
Neuropsychological (NP) impairment is a relatively common sequela in human immunodeficiency virus (HIV)–infected individuals with advanced disease. Early antecedents of NP dysfunction, however, remain poorly understood.
To determine whether early markers of immunocompetence and viral replication in individuals who have undergone seroconversion would be of prognostic value in identifying subjects who would become cognitively impaired.
Seventy-four subjects with estimable seroconversion dates and normal cognition at baseline (a median of 1 year after seroconversion) received NP and laboratory evaluations, including reverse transcription–polymerase chain reaction measurements of plasma (N = 74) and cerebrospinal fluid (n = 47) levels of HIV RNA. Subjects were followed up longitudinally, and were considered to have reached the end point if they became cognitively impaired.
Using Kaplan-Meier estimates, the subgroups with the most rapid progression to NP impairment were (1) subjects with early reductions in CD4 counts (<400 cells/µL at baseline; P = .007) and (2) those with elevated plasma HIV RNA values (>4.5 log10 copies/mL; P = .03) early after seroconversion. Using proportional hazards modeling, the highest-risk subjects had both CD4 counts less than 400 cells/µL and HIV RNA levels greater than 4.5 log10 copies/mL (risk ratio, 6.0; P = .01). In most subjects (7/9 [78%]), NP impairment developed before an acquired immunodeficiency syndrome–defining illness.
Neurocognitive outcomes in HIV are strongly influenced by very early systemic virological and immunological events. Patients with high plasma levels of HIV RNA and low CD4 counts early after infection should be aggressively treated to prevent immunological decline and NP deterioration.
NEUROCOGNITIVE complications, ranging from clinically asymptomatic neurocognitive impairment to human immunodeficiency virus (HIV)–associated dementia, are well recognized in HIV disease. Although advanced HIV disease remains the strongest correlate of the onset of neuropsychological (NP) impairment,1-3 the factors that may predict the development of such complications remain poorly understood.
The importance of CD4 levels in predicting the onset of an AIDS-defining illness is well documented, and methods for HIV quantitation also make it practicable to relate viral levels to clinical outcomes such as the development of opportunistic infections and risk for death. However, the relationships between immunological and virological indicators and HIV-related NP impairment are less clear. A number of studies have failed to find a relationship between CD4 levels and cognitive functioning,4-6 whereas others have reported a modest association.7-9 Data on the relationship between plasma and cerebrospinal fluid (CSF) levels of HIV RNA and neurocognitive complications remain sparse. In cross-sectional studies, investigators found plasma RNA levels to be unrelated10 or marginally related11 to NP functioning. In both studies, CSF levels of HIV RNA were most strongly associated with cognitive performance. In contrast, 1 longitudinal study9 reported that low CD4 counts and high plasma levels of HIV RNA were robust predictors of future dementia.
Thus, CSF and plasma determinations of HIV RNA have some predictive power for neurocognitive complications, especially in individuals with advanced disease (in the case of CSF) and in the prediction of full-blown dementia. Although investigators have reached varying conclusions,12 medically asymptomatic HIV-infected persons also may manifest at least mild impairment.1,4,7 These mild impairments are much more common than dementia and are not necessarily unimportant. For example, persons with impairments short of frank dementia are more likely to be unemployed,13,14 to have reduced driving abilities,15 and to die sooner than NP-normal persons at the same stage of HIV infection.16,17
We sought to predict the onset of mild NP dysfunction in a group of HIV-infected persons who had recently undergone seroconversion and had not yet experienced an AIDS-defining opportunistic condition, ie, reached stage C as defined by the Centers for Disease Control and Prevention (CDC).18 We hypothesized that elevated HIV RNA levels or a precipitous decline in CD4 levels soon after infection would predict future neurocognitive complications, and that an algorithm incorporating both HIV RNA levels and CD4 counts would be the best predictor.
Unless otherwise indicated, data are expressed as mean ± SD.
Subjects were selected from the HIV Neurobehavioral Research Center Longitudinal Study.1 The project was approved by the University of California–San Diego institutional review board. Informed consent was obtained after all study procedures were fully explained. All subjects were HIV seropositive as determined by findings of enzyme-linked immunosorbent assay and confirmatory Western blot analysis. Subjects were excluded if they had a history of a non–HIV-related neurological disorder or a medical disorder that affected nervous system function (eg, seizure disorder or head trauma with >30 minutes loss of consciousness), current substance dependence, or a psychotic disorder.
For all participants, the dates were known for their last negative and first positive HIV antibody test results. They entered the study between June 18, 1990, and March 15, 1995, and follow-up ranged from March 11, 1991, to June 4, 1998. The time of seroconversion was estimated by the midpoint of the seroconversion interval19-21 (the period between the last HIV-negative and the earliest HIV-positive test results). Only subjects with 3 years or less between the estimated seroconversion date and baseline assessment were selected. Evidence from other studies suggests that excluding subjects with moderately wide seroconversion intervals does not significantly alter time estimates.19,20
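The midpoint estimate described above is simple date arithmetic. As an illustration only (a hypothetical helper, not the authors' code), a minimal Python sketch:

```python
from datetime import date


def estimated_seroconversion(last_negative: date, first_positive: date) -> date:
    """Midpoint of the seroconversion interval: the period between the
    last HIV-negative and the earliest HIV-positive test results.
    Hypothetical illustration of the study's estimation rule."""
    return last_negative + (first_positive - last_negative) / 2


# Example: an 11-month interval, matching the cohort's median interval
est = estimated_seroconversion(date(1990, 1, 1), date(1990, 12, 1))
```

Note that when only test months (not exact dates) are known, any day-level convention (eg, the first of the month, as above) introduces at most a few weeks of error, which is small relative to the median 11-month interval.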
Only subjects who were not NP impaired at baseline and who had at least 1 follow-up visit were included in the study.
Sixty-eight (92%) of the 74 subjects were medically asymptomatic (CDC stage A); 6 (8%) were mildly symptomatic (CDC stage B). The mean CD4 cell count was 573 ± 225 cells/µL, with mean plasma levels of HIV RNA of 4.1 ± 0.9 log10 copies/mL (median, 4.1 log10 copies/mL), or approximately 12 000 copies/mL. Sixteen of the subjects (22%) were taking antiretroviral (ARV) medication at baseline. Forty-seven subjects underwent lumbar puncture (median HIV RNA level, 2.6 log10 copies/mL; range, undetectable to 4.1 log10 copies/mL). The median seroconversion interval was 11 months (range, 1 month to 4 years), and the median time from the estimated seroconversion date to baseline assessment was 1 year (range, 3 months to 3 years).
Subjects were followed up at approximately 12-month intervals. The CD4 lymphocyte counts were quantified using a fluorescence-activated cell sorter. Levels of HIV RNA were quantified using a reverse transcription–polymerase chain reaction assay (Roche Amplicor HIV-1 Monitor Test; Roche Molecular Systems, Inc, Pleasanton, Calif). The standard assay was used for plasma quantitation (detection limit, 400 copies/mL), whereas an ultrasensitive version was used for CSF quantitation (detection limit, 50 copies/mL).
At baseline, subjects were classified as being in the high (≥400 cells/µL) or low-level (<400 cells/µL) CD4 group, such that the high-risk subgroup had experienced a fall in CD4 counts to a level at which initiation of ARV therapy is currently advised (350 cells/µL). Subjects were also dichotomized into high (>4.5 log10 copies/mL; approximately 30 000 copies) and low-level (≤4.5 log10 copies/mL) plasma HIV RNA groups. Approximately one third of the cohort was in the high-risk group. Hence, each subject was identified in 1 of the following 4 risk groups: (1) high CD4/low plasma HIV RNA levels; (2) high CD4/high plasma HIV RNA levels; (3) low CD4/low plasma HIV RNA levels; and (4) low CD4/high plasma HIV RNA levels.
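The two baseline dichotomizations combine into the 4-level classification above. A minimal sketch of that assignment (a hypothetical helper, not the study's code; thresholds taken from the text: 400 cells/µL for CD4 and 4.5 log10 copies/mL for plasma HIV RNA):

```python
def risk_group(cd4_cells_per_ul: float, plasma_rna_log10: float) -> str:
    """Assign one of the study's 4 baseline risk groups.
    High CD4 is >=400 cells/uL; high plasma HIV RNA is >4.5 log10 copies/mL
    (approximately 30 000 copies/mL). Hypothetical illustration."""
    cd4_level = "high CD4" if cd4_cells_per_ul >= 400 else "low CD4"
    rna_level = "high RNA" if plasma_rna_log10 > 4.5 else "low RNA"
    return f"{cd4_level}/{rna_level}"


# The cohort's mean baseline values fall in the lowest-risk cell:
group = risk_group(573, 4.1)  # "high CD4/low RNA"
```

The "low CD4/high RNA" cell of this grid is the highest-risk group analyzed in the proportional hazards models.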
Nine subjects underwent assessment within 6 months of estimated seroconversion. To ensure that these subjects had reached a stable HIV RNA level, we examined later visits for a subsequent drop in viral level. Eight of the 9 subjects had follow-up (12- to 18-month) plasma viral levels that were consistent with their baseline high/low plasma RNA level assignment, suggesting that they had achieved a set-point by their baseline visit. One subject had only a baseline plasma level available. Subjects who underwent measurement of CSF levels of HIV RNA were also divided into a high (>3.0 log10 copies/mL) or low-level (≤3.0 log10 copies/mL) group. Antiretroviral drug treatment was classified as ever vs never treated with ARV drugs, and undergoing vs not undergoing ARV treatment at baseline.
Using a detailed NP test battery,1 subjects were classified as NP impaired (study end point) if they evidenced impairment in at least 2 cognitive domains,22,23 as rated by a senior neuropsychologist (R.K.H.) who was masked to biological data.
Independent t tests were used for continuous variables and χ2 or Fisher exact tests for categorical variables. The distribution of time to impairment was characterized using Kaplan-Meier estimation.24 Survival distributions were compared using a log-rank test, with the time of origin being the estimated time of seroconversion. Proportional hazards regression25 was used to test for significance of risk categories and to adjust for baseline covariates that differed by group. Relative risks for NP impairment in risk groups compared with the reference group were estimated, and 95% confidence intervals (CIs) were calculated for risk ratios (RRs). We used S-Plus26 and JMP27 software for statistical analyses.
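The analyses above were run in S-Plus and JMP. Purely to illustrate the Kaplan-Meier product-limit estimator named in the text (a self-contained toy sketch, not the study's code), the survival function can be computed as:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the survival function S(t).

    times  : time to NP impairment or censoring (eg, years from seroconversion)
    events : 1 if the subject reached the end point (became impaired),
             0 if the subject was censored (still NP normal at last visit)
    Returns a list of (event_time, S(t)) pairs at each observed event time.
    Illustrative sketch only.
    """
    data = sorted(zip(times, events))  # ascending by time; ties contiguous
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, _ in data if tt == t)  # events + censored at t
        if deaths:
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve


# Toy data: impairment at years 1, 2, and 4; one subject censored at year 3
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

In the study, the time origin is the estimated seroconversion date, and curves for the risk groups are compared with a log-rank test; the proportional hazards models then adjust for baseline covariates.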
Nine (12%) of the 74 subjects became cognitively impaired, with the median time from estimated seroconversion to impairment being 3.2 years (range, 0.9-5.0 years). The NP impairment preceded an AIDS-defining illness for 7 (78%) of the 9 subjects.
The low-level CD4 group became NP impaired at a significantly more rapid rate than the high-level CD4 group (P = .007, 1-tailed; Figure 1). Elevated plasma levels of HIV RNA were also associated with earlier progression to cognitive impairment (P = .03, 1-tailed; Figure 2).
To see whether a model incorporating CD4 and HIV RNA levels would be a stronger predictor of future neurocognitive impairment than either variable alone, we compared subjects with both elevated HIV RNA levels and reduced CD4 levels with all other subjects. As predicted, this group became NP impaired significantly more rapidly than the others (P = .001, 1-tailed; Figure 3).
There were no differences in mean follow-up times for the 4 groups (2.8 ± 1.6 vs 2.2 ± 1.2 years, high vs low-level CD4 groups [P = .21]; 2.8 ± 1.6 vs 2.4 ± 1.4 years, low vs high-level plasma HIV RNA groups [P = .39]).
The comparison groups were similar across most, but not all, demographic and disease-related factors (Table 1). We therefore used proportional hazards modeling to control for variables that significantly differed between the 2 plasma HIV RNA or the 2 CD4 groups. Table 2 shows that the highest-risk group (high-level plasma HIV RNA/low-level CD4 group), when used alone (model 1), had a significant effect on time to NP impairment (P = .01; RR, 6.0; 95% CI, 1.5-22.7). Addition of sex (model 2 [P = .30]) and former (model 3 [P = .28]) or current (model 4 [P = .19]) ARV therapy had no significant independent effect, whereas the overall model remained significant (P<.03).
Although the combination of education and risk group (model 5) had an overall effect on time to impairment (P = .01), the independent influence of each variable was marginally significant (P = .06 and P = .07, respectively). The influence of education was in the unexpected direction, indicating more impairment with higher education. The nearly equal, but marginal, significance for the risk group and education variables may be due to the fact that subjects with high plasma levels of HIV RNA tended to be more highly educated (Table 1), and both variables shared in the ability to estimate NP impairment. When the partial additional effect of education was estimated after controlling for risk group, education did not provide an additional significant contribution to the estimation of time to cognitive impairment (P = .06). Therefore, education was removed from the final model. The best model, model 1, resulted in a significant independent effect of the combination of high plasma HIV RNA with low CD4 levels, overshadowing other identifiable influences.
Although the primary purpose of this study was to determine baseline predictors of future impairment, we examined whether intervening treatment accounted for NP changes. Using conditional survival analysis, subjects receiving treatment before or up to the last visit became impaired at a rate similar to those who were never treated (P = .80). However, subjects receiving more than 1 medication at the last visit became impaired at a slower rate than those receiving only 1 medication (P = .04). Five of the 15 subjects receiving more than 1 medication were receiving combination antiretroviral therapy (CART); none of these subjects became impaired. The number of medications (>1 vs 1) provided a marginal contribution (P = .12) in predicting impairment when included in the model incorporating the group with a baseline high plasma HIV RNA and a low CD4 level (P = .03; overall model, P = .01), suggesting baseline immunological/virological status was still the most robust predictor of future impairment.
At the last visit, NP-impaired subjects had lower mean CD4 counts than NP-normal subjects (241 ± 144 vs 462 ± 200 cells/µL; P = .003). Follow-up plasma samples were available for 7 impaired and 44 unimpaired participants. The difference in mean HIV RNA levels between the impaired and unimpaired subjects was not significant (4.6 ± 1.1 vs 4.0 ± 1.1 log10 copies/mL; P = .15). Since serum samples were available, we reran the analyses using serum HIV RNA levels when plasma levels were missing. This resulted in minimal change in the overall RNA levels, but increased power to detect differences (P = .07) between groups.
The group with high CSF levels of HIV RNA had lower CD4 counts, a longer time between the seroconversion date and baseline assessment, higher plasma levels of HIV RNA, and a higher prevalence of subjects with the CDC stage B classification (Table 3). This group was similar to the group with low CSF levels of HIV RNA in other respects. Eight (57%) of the 14 subjects with high CSF levels of HIV RNA had signs of pleocytosis (≥5 leukocytes/µL), compared with 11 (33%) of the 33 subjects with low CSF levels of HIV RNA (P = .13).
Three (6%) of 47 subjects became NP impaired: 2 from the group with high CSF levels, and 1 from the group with low CSF levels. The number of subjects reaching the study end point was insufficient to justify further analysis, although distributions of time to NP impairment are shown in Figure 4.
Virological and immunological events occurring in the months immediately after HIV infection are predictive of disease course and survival.28,29 In this study, early evidence of poor virological control and immunocompromise predicted earlier incidence of neurocognitive impairment. Subjects with plasma levels of HIV RNA of greater than 4.5 log10 copies/mL and CD4 counts less than 400 cells/µL in the year after seroconversion were 6 times more likely to become cognitively impaired. Inclusion of additional demographic and treatment variables did not significantly alter the finding. In most of these subjects (7/9 [78%]), cognitive impairment occurred before the development of an AIDS-defining condition.
Although increased impairment rates and greater immunocompromise are seen with advanced disease, previous studies attempting to relate neurocognitive functioning to CD4 levels have reported mixed results.1,4,5,7,30 When significant relationships between CD4 count and cognition are found (eg, Boccellari et al31), it is often when comparing the most immunocompromised subjects (CD4 count <200 cells/µL) with the most immunocompetent group (eg, CD4 count >400 or >500 cells/µL). Even these findings, however, are often tentative. Longitudinally, a 1-year decline in CD4 cell count has been associated with a decline in NP performance on memory and reaction time measures.8 Childs et al9 found that subjects with CD4 counts below 200 cells/µL had a relative hazard for HIV-associated dementia of 3.5, compared with subjects with CD4 counts greater than 500 cells/µL. Our finding that lowered CD4 levels are predictive of future neurocognitive impairment is generally in agreement with these studies. In contrast, Stern and colleagues6 reported that CD4 counts did not predict onset of HIV-associated dementia, the most severe HIV-related cognitive disorder, although subjects entered the study in an immunocompromised state and truncation of CD4 levels may have reduced the power to detect a CD4 effect.
Our finding that plasma levels of HIV RNA above 4.5 log10 copies/mL (about 30 000 copies/mL) place one at increased risk for NP impairment is consistent with the study by Childs and colleagues.9 These authors reported that, compared with individuals with fewer than 3000 copies/mL, those with HIV RNA levels of 3000 to 30 000 copies/mL had a relative hazard for dementia of 3.85. Those with greater than 30 000 copies/mL (4.5 log10 copies/mL) had a relative hazard of 8.5. We extend these findings by demonstrating that similarly elevated values approximately 1 year after seroconversion also significantly predict future impairment.
Levels of CD4 and HIV RNA provide useful information regarding the progression to AIDS and death.32 As suggested by Coffin,33 CD4 counts estimate current levels of immunocompetence, and HIV RNA levels predict the rate of future CD4 decline. Although HIV-related cognitive impairments are most common in individuals who experience an AIDS-defining illness,1 the combination of a reduced CD4 count and the "downhill velocity" indicated by an elevated RNA level appears to identify those who are at risk for impairment before these more severe opportunistic infections.
Subjects classified into the high- and low-level HIV RNA and CD4 groups at baseline were cognitively normal at the time, reinforcing the notion that the relationship between these markers and concurrent cognitive function remains tenuous, at least when values are not extreme. As expected, given the baseline predictors of NP impairment in this study, individuals who became impaired had significantly lower CD4 cell counts and higher plasma viral levels at their last visit compared with those who remained cognitively normal. This is consistent with what is known about the course of CD4 counts and the notion of a viral set-point. Nonetheless, the present findings suggest that, early in the course of the disease, CD4 cell counts and viral levels are predictive of incident NP impairment, suggesting that prolonged exposure to these conditions may predispose one to central nervous system (CNS) dysfunction.
The CSF has been hypothesized as a possible window into pathologic CNS processes. When plasma and CSF data on HIV RNA are available, CSF levels of HIV RNA have typically been the most strongly related to HIV-related cognitive disorders in cross-sectional10,11 and longitudinal studies.34 Two of the 14 subjects with high viral levels (>3.0 log10 copies/mL) in the CSF became impaired. In cases with CD4 levels below 200 cells/µL, a cut point of only 200 viral copies/mL in the CSF has been shown to predict future impairment.34 Applying this viral cut point to our relatively immunocompetent group (only 1 subject had a CD4 cell count of <200 cells/µL), 27 (57%) of our 47 subjects would be classified as having CSF levels of HIV RNA that would put them at risk for CNS dysfunction. Despite these elevations and a mean follow-up of 2.9 years, impairment developed in only 2 of the 27 subjects with elevated CSF values.
Although these findings appear somewhat surprising, they are consistent with the notion that CSF levels of HIV RNA in immunocompetent individuals may be the result of cellular trafficking (via pleocytosis) from the blood to the CSF.10 Since the RNA levels do not result from primary replication within the CSF, these levels do not correlate well with HIV-related cognitive disorders. In later stages of disease (AIDS), however, HIV virions in the CSF may arise from replication within the CSF and brain compartments, and therefore may be more closely associated with NP functioning. Since none of our subjects had advanced disease, very few became impaired, and we had insufficient power to examine this issue.
It is unclear whether lower CD4 counts or elevated plasma HIV RNA levels within a few years after seroconversion are indicative of a particularly virulent form of the virus, or whether host factors may cause accelerated disease progression. Similar CD4 levels likely have significantly different meaning in those individuals who reach them over the course of many years, since this suggests a more gradual decline. This may or may not be the case with RNA levels, since HIV-infected individuals reach a set-point after the initial peak viremia, and then RNA levels remain relatively constant during much of the disease process. Prolonged exposure to high plasma levels of HIV may lead to increased viral seeding within the CNS, ultimately resulting in HIV-related neurodegeneration. Alternatively, prolonged, elevated plasma levels of HIV may lead to a more rapid decline in CD4 cell counts, which also may predispose to cognitive impairment. Because only 9 of the 74 subjects in our cohort became impaired, the current study is not adequately powered to delineate the final pathway to cognitive dysfunction.
Clinicians should be vigilant to possible neurocognitive complications in individuals with early immunological and virological changes. This finding supports current recommendations that patients with high HIV RNA levels and low CD4 counts early after infection should be aggressively treated to prevent immunological decline and cognitive deterioration.
Although this study spanned the pre-CART and early CART eras, only 7 subjects were taking protease inhibitors during their time in the study, and we are unable to comment on the effect that CART has on CNS disease. The literature suggests that the dramatic impact of CART on survival has also translated to a reduction in the incidence of dementia35 and milder forms of neurocognitive impairment.36 Cognitive benefits appear to be greatest in those subjects who have reductions in CSF levels of HIV RNA.37 Early treatment with the newer ARV drugs may thus inhibit the development of the cognitive impairments that we found in this cohort. Nonetheless, since the protease inhibitors have poor blood-brain penetration, many investigators remain concerned that the CNS may be a sanctuary for replicating virus. Findings that dementia38 and HIV encephalitis39 rates have not declined as quickly as other opportunistic infections, and that the prevalence of mild to moderate HIV encephalopathy may have increased,40 suggest that individuals are living longer with HIV-related cognitive impairments and that CNS dysfunction remains a significant clinical concern.41
Detecting subtle brain dysfunction early in the disease may have practical significance. Recent reports indicate a relationship between viral levels in the brain and dementia severity.42 In addition, Masliah et al43 demonstrated that reduction in dendritic complexity could be found at autopsy in those with asymptomatic (clinically subsyndromic) NP impairment during life, as well as in subjects with minor cognitive motor disorder or frank HIV dementia, and Cherner et al44 showed that the presence of neurocognitive impairment in patients with AIDS predicted detection of HIV encephalitis at autopsy. Such data suggest that even milder forms of NP impairment may be an indicator of underlying neuropathological events. Early intervention with ARV therapy in those at highest risk for even mild impairment may help prevent underlying, HIV-induced brain injury.
Corresponding author and reprints: Thomas D. Marcotte, PhD, HIV Neurobehavioral Research Center, University of California–San Diego, 150 W Washington, Second Floor, San Diego, CA 92103 (e-mail: firstname.lastname@example.org).
Accepted for publication May 30, 2003.
Author contributions: Study concept and design (Drs Marcotte, Deutsch, McCutchan, Ellis, Heaton, and Grant); acquisition of data (Drs McCutchan, Letendre, Ellis, and Wallace and Mr Moore); analysis and interpretation of data (Drs Marcotte, Deutsch, McCutchan, Letendre, Ellis, Heaton, and Grant and Mr Moore); drafting of the manuscript (Drs Marcotte, Deutsch, McCutchan, Ellis, and Heaton); critical revision of the manuscript for important intellectual content (Drs Marcotte, Deutsch, McCutchan, Letendre, Ellis, Wallace, Heaton, and Grant and Mr Moore); statistical expertise (Drs Marcotte and Deutsch); obtained funding (Drs McCutchan, Ellis, Heaton, and Grant); administrative, technical, and material support (Drs Marcotte, McCutchan, Letendre, Ellis, Wallace, Heaton, and Grant and Mr Moore); study supervision (Drs Marcotte, McCutchan, and Grant).
The HIV Neurobehavioral Research Center is supported by center award MH 62512 from the National Institute of Mental Health, Rockville, Md.
The San Diego HIV Neurobehavioral Research Center group is affiliated with the University of California–San Diego, the US Naval Medical Center–San Diego, and the San Diego Veterans Affairs Healthcare System. The group includes the following participants: Igor Grant, MD (director); J. Hampton Atkinson, MD, and J. Allen McCutchan, MD (codirectors); Thomas D. Marcotte, PhD (center manager); Mark R. Wallace, MD (principal investigator [PI]; US Naval Medical Center–San Diego); J. Allen McCutchan, MD (PI), Ronald J. Ellis, MD, Scott Letendre, MD, and Rachel Schrier, PhD (neuromedical component); Robert K. Heaton, PhD (PI), Mariana Cherner, PhD, Julie Rippeth, PhD, Joseph R. Sadek, PhD, and Steven Paul Woods, PsyD (neurobehavioral component); Terry Jernigan, PhD (PI), John Hesselink, MD, and Michael J. Taylor, PhD (imaging component); Eliezer Masliah, MD (PI) and Dianne Langford, PhD (neuropathology component); J. Allen McCutchan, MD, J. Hampton Atkinson, MD, Ronald J. Ellis, MD, PhD, and Scott Letendre, MD (clinical trials component); Daniel R. Masys, MD (PI), and Michelle Frybarger, BA (data systems manager) (data management unit); and Ian Abramson, PhD (PI), Reena Deutsch, PhD, and Deborah Lazzaretto, MA (statistics unit).
We thank Deborah Durand, Stephen A. Spector, MD, and Karen Hsia, PhD, for providing viral load determinations, and Julie Nelson for assistance with data extraction.
The views expressed in this article are those of the authors and do not reflect the official policy or position of the Department of the Navy, the Department of Defense, or the US government.