Figure 1. Cumulative graft survival for recipients by donor age. Five-year graft survival for donors younger than 60 years was 72%, and for donors 60 years or older, 35% (P<.001).
Figure 2. Cumulative graft survival for recipients by graft cold ischemia time (CIT). Five-year graft survival for grafts with less than 12 hours of CIT was 71%, and for grafts with 12 hours or more of CIT, 58% (P = .004).
Figure 3. Cumulative graft survival by recipient status. Five-year graft survival for status 2B and 3 recipients was 71%, and for status 1 and 2A recipients, 60% (P = .02).
Figure 4. Comparison of cumulative graft survival for 2 hypothetical recipients with different characteristics. CIT indicates cold ischemia time.
Moore DE, Feurer ID, Speroff T, et al. Impact of Donor, Technical, and Recipient Risk Factors on Survival and Quality of Life After Liver Transplantation. Arch Surg. 2005;140(3):273–277. doi:10.1001/archsurg.140.3.273
Hypothesis
Donor, technical, and recipient risk factors cumulatively impact survival and health-related quality of life after liver transplantation.
Setting
Tertiary care center.
Patients
A total of 483 adults undergoing primary orthotopic liver transplantation between January 1, 1991, and July 31, 2003.
Main Outcome Measures
Graft and patient survival, Karnofsky functional performance scores, Medical Outcomes Study Short Form 36 Health Survey scores, and Psychosocial Adjustment to Illness Scale scores as influenced by potential risk factors including donor age, weight, warm ischemia time, cold ischemia time (CIT), sex, United Network for Organ Sharing (UNOS) status (1 or 2A vs 2B or 3), recipient age and disease, bilirubin level, and creatinine level.
Results
Five-year graft survival was 72% for recipients of donors younger than 60 years and 35% for recipients of donors 60 years and older (P<.001). A CIT of 12 hours or more was associated with lower 5-year graft survival (71% vs 58%; P = .004). Five-year graft survival for UNOS status 2B or 3 was 71% vs 60% for status 1 or 2A (P = .02). A comparable pattern was seen for patient survival in relation to donor age (P = .003), CIT (P = .005), and urgency status (P = .03). Urgent UNOS status, advanced donor age, and prolonged CIT were independently associated with shorter graft and patient survival (P<.05). Functional performance and health-related quality of life were not affected by donor, recipient, or technical characteristics.
Conclusions
The combination of advanced donor age, urgent recipient status, and prolonged CIT adversely affects graft and patient survival, and the cumulative effects of these risk factors can be modeled to predict posttransplant survival.
Almost 2000 people with end-stage liver disease die every year while awaiting a suitable donor liver. Since 1991, there has been a 10-fold increase in the number of liver transplant candidates and only a 2-fold increase in the number of donor livers. In 2000, there were almost 17 000 candidates awaiting liver transplant and only about 5000 transplanted.1 Perhaps even more striking is a 10% to 15% annual mortality for patients awaiting liver transplant. Given these grim statistics, allocation of organs must be carefully considered to maximize graft and patient survival.
In an effort to expand the donor pool, organs from marginal donors (with presumed marginal grafts) have been increasingly transplanted. Marginal livers are considered to be less optimal for transplantation for a variety of reasons, and there is no consistent definition of a marginal donor in the literature. The characteristics given in the following list are cited by many.2-6 Donors with one or more of these characteristics account for 15% to 20% of transplanted livers.7 In addition, a variety of recipient and technical risk factors relating to impaired graft and patient survival have been identified8:
Donor risk factors
Age (>50 y, >60 y)
Weight (>100 kg)
Intensive care unit stay (>4 d)
Hypotensive episodes (mean arterial pressure <60 mm Hg for >1 h)
Vasopressor drug requirement (dopamine dosage >10 μg/kg per minute)
Elevated bilirubin level (>2 mg/dL [>34.2 μmol/L])
Elevated alanine aminotransferase level (>170 U/L)
Elevated aspartate aminotransferase level (>140 U/L)
Elevated serum sodium level (>160 mEq/L)
Technical risk factors
Prolonged cold ischemia time (CIT) (>12 h)
Prolonged warm ischemia time (>60 min)
Graft-to-recipient sex mismatch (female graft to male recipient)
Recipient risk factors
Pretransplant urgency status
Total bilirubin level
Cause of liver disease
Donor, recipient, and technical characteristics have been related to recipient outcomes after transplantation, but few studies have examined the combined effects of these factors on graft and patient survival.9,10 In addition, little has been done to evaluate the influence of these factors on health-related quality of life. Our purpose was to determine the effects of donor, technical, and recipient risk factors on graft and patient survival as well as health-related quality of life after liver transplantation.
This study was approved by the institutional review board of Vanderbilt University Medical Center, Nashville, Tenn. Data were derived from our transplantation center’s registry, a database maintained for clinical outcomes evaluation and mandatory reporting purposes. All adult patients (aged ≥18 years) who underwent primary orthotopic liver transplant for end-stage liver disease at Vanderbilt University Medical Center between January 1, 1991, and July 31, 2003, were included in the analysis. Donor, recipient, technical, and survival data were collected from the registry. Functional performance data were collected by a review of medical records, and health-related quality-of-life outcome data were collected by surveying organ recipients by means of standardized self-report instruments.
Donor, technical, and recipient risk factors determined to be meaningful predictors of outcome were selected on the basis of a review of the literature. Initial analysis of receiver operating characteristic curves and survival trends allowed us to dichotomize variables such as donor age and CIT at discrete, clinically meaningful cutoff points. Donor risk factors were age (≥60 years) and weight (≥100 kg). Technical characteristics included CIT (≥12 hours), warm ischemia time (≥60 minutes), and graft mismatch (female graft to male recipient vs all other combinations). Recipient risk factors were United Network for Organ Sharing (UNOS) urgency status (1 or 2A vs 2B or 3), age, creatinine level, total bilirubin level, and cause of liver disease. All summary data are presented as mean ± SD.
Primary outcome measures were graft survival, patient survival, and rate of retransplantation. In addition, the effects of all risk factors on Karnofsky functional performance and health-related quality of life were evaluated in a subset of patients.11 Karnofsky scores were determined longitudinally (before transplant, 3 and 6 months after transplant, and annually thereafter). The Karnofsky scale is a clinician-scored measure of functional performance. Scores range from 0 to 100. A score of 40 or less represents the poorest level of functioning (“disabled”), 41 to 79 represents impaired functioning (“unable”), and 80 to 100 represents the highest level of functioning (“able”).
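The Karnofsky banding described above can be expressed as a small helper function. This is an illustration of the scale as stated, not part of the study's analysis; the function name is ours:

```python
# Hypothetical helper mapping a Karnofsky score (0-100) to the
# functional bands described in the text:
#   0-40 "disabled", 41-79 "unable", 80-100 "able".

def karnofsky_band(score):
    if not 0 <= score <= 100:
        raise ValueError("Karnofsky scores range from 0 to 100")
    if score <= 40:
        return "disabled"
    if score <= 79:
        return "unable"
    return "able"

print(karnofsky_band(87))  # a typical 1-year posttransplant mean score
```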
Self-report of health-related quality of life was obtained one time after transplant by means of the Medical Outcomes Study Short Form 36 Health Survey (SF-36)12 and the Psychosocial Adjustment to Illness Scale.13 The SF-36 consists of 36 items in 8 subscales: physical functioning, role functioning, bodily pain, general health, vitality, social functioning, role-emotional, and mental health. The physical and mental component summary scales are then computed as weighted composites of the 8 subscales. Normative data for the general US population are available for the SF-36. Higher scores represent better quality of life.
The Psychosocial Adjustment to Illness Scale is composed of 46 items, which are scored as 7 domains of psychosocial adjustment to illness: health care orientation, vocational environment, domestic environment, sexual relationships, extended family relationships, social environment, and psychological distress. A global score is also computed. Higher scores represent poorer adjustment to illness.
Graft and patient survival curves for each risk group were estimated by means of Kaplan-Meier survival analysis, and the log-rank test was used to determine whether a statistically significant difference (defined as a P value <.05) existed between risk groups. Survival was determined in months from date of transplantation to patient death from any cause, retransplantation, or loss to follow-up (censored). Graft failure was defined as death from any cause or retransplantation. Retransplantation rates were compared between risk factor groups by means of the χ2 statistic.
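The product-limit (Kaplan-Meier) estimate described above can be sketched in a few lines of Python. The data here are hypothetical, and tied times are handled one observation at a time for simplicity:

```python
# Minimal Kaplan-Meier (product-limit) estimator.
# Each observation is (time_in_months, event), where event=True means
# graft failure or death, and event=False means the case was censored.

def kaplan_meier(observations):
    """Return [(time, survival_probability)] at each event time."""
    n_at_risk = len(observations)
    survival = 1.0
    curve = []
    # Walk through follow-up times in order; a censored case only
    # shrinks the risk set, while an event also lowers the estimate.
    for time, event in sorted(observations):
        if event:
            survival *= 1 - 1 / n_at_risk
            curve.append((time, survival))
        n_at_risk -= 1
    return curve

# Hypothetical follow-up (months, event) for a handful of recipients.
sample = [(3, True), (10, False), (24, True), (36, False), (60, False)]
print(kaplan_meier(sample))
```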
The adequacy of our cutoff points for donor age and CIT was evaluated first by examining both continuous variables by quartile and comparing survival across quartiles, and then by using receiver operating characteristic curves to determine the optimal point at which to categorize each variable. The optimal point for differentiating survival for donor age was 60 years, and the optimal point for differentiating survival for CIT was 12 hours.
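The exact criterion used to select the cutoff from the receiver operating characteristic curve is not stated; one common choice, shown here as an illustrative sketch with hypothetical data, is to maximize Youden's J (sensitivity + specificity − 1):

```python
# Illustrative cutoff search over a continuous variable (e.g., donor age)
# against a binary outcome (e.g., graft failure within 5 years).
# Criterion: Youden's J = sensitivity + specificity - 1. The study does
# not state its criterion, so treat this as one common option.

def best_cutoff(values, outcomes):
    """Return the candidate cutoff maximizing sensitivity + specificity - 1."""
    best_c, best_j = None, -1.0
    positives = sum(outcomes)
    negatives = len(outcomes) - positives
    for c in sorted(set(values)):
        # "Test positive" means value >= c (e.g., donor age >= cutoff).
        tp = sum(1 for v, o in zip(values, outcomes) if v >= c and o)
        fp = sum(1 for v, o in zip(values, outcomes) if v >= c and not o)
        j = tp / positives + (negatives - fp) / negatives - 1
        if j > best_j:
            best_c, best_j = c, j
    return best_c

# Hypothetical donor ages and 5-year graft failure indicators.
ages = [35, 42, 48, 55, 58, 61, 63, 67, 70, 74]
fail = [0, 0, 0, 0, 1, 1, 1, 0, 1, 1]
print(best_cutoff(ages, fail))
```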
Analysis of variance methods were used to test the effects of risk factors on functional performance and health-related quality of life. A mixed-model approach was used for the longitudinal functional performance data to determine whether the pattern of improvement in functional performance over time differed by risk factor. Time after transplant was included as a covariate when testing the effects of risk factors on SF-36 and Psychosocial Adjustment to Illness Scale scores, to control for individual differences in the interval between transplantation and assessment.
Cox proportional hazards regression analysis was performed to test the combined effects of risk factors on outcome measures. Risk factors, including donor age, donor weight, CIT, warm ischemia time, sex mismatch, UNOS status, recipient age, total bilirubin level, creatinine level, and cause of liver disease, were treated as categorical variables in these analyses. Stepwise Cox proportional hazards regression with backward elimination was used to model the combined effects of these risk factors on graft and patient survival. All 12 covariates were entered in the initial model, and subsequent models were iteratively developed after elimination of risk factors that were not statistically significant predictors (defined as P<.05) of graft failure or patient death. Hazard functions derived from Cox proportional hazards regression models of graft and patient survival were used to derive risk scores and theoretical survival functions based on a variety of risk factor combinations. Multivariate models of functional performance and health-related quality of life were developed by multiple regression methods.
Four hundred eighty-three cadaveric liver transplants in 451 recipients from January 1, 1991, to July 31, 2003, were studied. Donor, recipient, and technical characteristics are summarized in Table 1.
A significant reduction in patient and graft survival was associated with donor age of 60 years or older and CIT of 12 hours or more. Likewise, UNOS status of 1 or 2A was a significant predictor of worse graft and patient survival. Five-year graft survival for recipients of donors younger than 60 years was 72% compared with 35% for recipients of livers from donors 60 years or older (P<.001) (Figure 1). Five-year survival for patients receiving grafts from donors younger than 60 years was 75% vs 48% for those receiving grafts from donors 60 years or older (P = .003). The 5-year graft survival for recipients of grafts with less than 12 hours of CIT was 71% compared with 58% for those receiving grafts with 12 hours or more of CIT (P = .004) (Figure 2), and 5-year patient survival rates for these recipients were 77% and 64%, respectively (P = .005). The 5-year graft survival for status 2B and 3 recipients was 71% compared with 60% for status 1 and 2A recipients (P = .02) (Figure 3), whereas 5-year patient survival rates for these recipients were 75% and 65%, respectively (P = .03). Donor weight, warm ischemia time, sex mismatch, recipient age, creatinine level, total bilirubin level, and cause of liver disease did not significantly alter graft or patient survival. The retransplantation rate for grafts from donors younger than 60 years was 6%, whereas the rate for grafts from donors 60 years or older was 21% (P<.001). The retransplantation rate for status 2B and 3 recipients was 7%, whereas the rate for status 1 and 2A recipients was 21% (P = .01).
Primary nonfunction accounts for the majority of graft loss; therefore, a significant decrement in survival can be seen for recipients of marginal organs during the first year, with a more gradual loss of grafts to chronic rejection in the ensuing years. There was still significantly more graft loss at 5 years or more in recipients of grafts from donors 60 years or older than in recipients of grafts from donors younger than 60 years. A very small number of recipients (<1%) died of causes unrelated to graft failure.
Health-related quality of life was measured in 75 recipients. There was a significant improvement in functional performance between pretransplant scores (53 ± 4) and 1-year posttransplant scores (87 ± 1) (P<.001). However, analyses demonstrated no difference in the rate of improvement in functional performance due to any risk factor.
Donor age of 60 years or older (P<.001), CIT of 12 hours or more (P = .007), and urgent recipient status (P = .007) were independent risk factors for shortened graft survival (Table 2). Likewise, donor age of 60 years or older (P = .03), CIT of 12 hours or more (P = .02), and urgent recipient status (P = .05) were independent risk factors for shortened patient survival on the basis of our Cox proportional hazards regression models. Using these models, each patient was given a risk score: risk score = β1X1 + β2X2 + . . . βkXk, where X1, X2, . . . Xk are the levels of k prognostic variables (risk factors), and β1, β2, . . . βk are regression coefficients.14 High scores correspond to poor prognosis, and smaller values (including negative ones) correspond to better prognosis. Risk of graft failure was expressed by the following risk score in the model: 0.970 (donor age ≥60 years) + 0.616 (CIT ≥12 hours) + 0.634 (UNOS status 1 or 2A). Similarly, the risk score for patient death was as follows: 0.652 (donor age ≥60 years) + 0.572 (CIT ≥12 hours) + 0.529 (UNOS status 1 or 2A).
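The published graft-failure coefficients can be applied directly to compute a recipient's risk score and the corresponding relative hazard. This sketch uses the coefficients reported above; the variable names are ours, and the baseline hazard itself is not reported:

```python
import math

# Graft-failure regression coefficients as reported in the text.
BETA_GRAFT = {
    "donor_age_ge_60": 0.970,
    "cit_ge_12h": 0.616,
    "unos_1_or_2a": 0.634,
}

def risk_score(factors, betas=BETA_GRAFT):
    """Sum of beta_k * x_k, where each dichotomous factor x_k is 0 or 1."""
    return sum(betas[name] for name in factors)

# A recipient with all three adverse factors vs one with none.
high = risk_score({"donor_age_ge_60", "cit_ge_12h", "unos_1_or_2a"})
low = risk_score(set())
print(round(high, 3))              # total risk score
print(round(math.exp(high - low), 1))  # hazard relative to the low-risk case
```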
We next generated hypothetical graft survival curves by using hazard functions defined by the model and substituting various values for risk factors (Figure 4). On the basis of these hypothetical models, a graft transplanted from a donor younger than 60 years into a status 2B or 3 recipient with less than 12 hours of cold ischemia would have a 75% probability of surviving 5 years. In contrast, a graft from a donor 60 years or older transplanted into a status 1 or 2A recipient with a CIT of 12 hours or more would have only a 20% probability of surviving 5 years after transplant.
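Under a Cox model, such hypothetical curves follow from the baseline survival function via S(t | x) = S0(t)^exp(risk score). The study's baseline survival function is not published, so the sketch below uses a hypothetical 5-year baseline value purely to illustrate the mechanics:

```python
import math

# Cox-model survival for an individual with a given risk score:
#   S(t | x) = S0(t) ** exp(score)
# where S0(t) is the baseline survival. The baseline value used here
# is hypothetical, not the study's.

def survival_at(baseline_survival, score):
    return baseline_survival ** math.exp(score)

s0_5yr = 0.80  # hypothetical 5-year baseline graft survival
# No adverse factors, donor age >= 60 only, and all three adverse factors.
for score in (0.0, 0.970, 0.970 + 0.616 + 0.634):
    print(round(survival_at(s0_5yr, score), 2))
```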
Orthotopic liver transplantation is the standard of care for end-stage liver disease, but the supply of donor livers is inadequate. In an effort to increase the availability of donor organs, marginal donors are being used for liver transplantation, and this has consequences. Marginal grafts exhibit more profound ischemic and preservation injury.15 There is an increased incidence of rejection, initial poor function, and graft loss in livers from marginal donors.16 With these factors in mind, we investigated the relationship between marginal donors, technical factors associated with transplantation, and recipient status. Our results not only confirm that these risk factors adversely affect survival but also demonstrate that donor age, CIT, and recipient UNOS status 1 or 2A are independently associated with shorter graft and patient survival in a combined-effects model, and that the effect of these risk factors is additive.
Graft and patient survival worsens with the use of donor livers having greater than 30% steatosis.17 Because biopsy-quantified graft steatosis was not available for all grafts, we used donor weight of 100 kg or more as a surrogate marker for increased graft steatosis. Although this donor weight has been associated with increased graft loss in other studies,18,19 we found no association of donor weight with shortened graft or patient survival. On the basis of this result, donor weight may be an unreliable marker for graft steatosis.
Excessive warm ischemia time has been associated with worse graft survival,6 but we did not demonstrate a survival disadvantage with warm ischemia time of 60 minutes or more. In our study, the number of grafts subjected to this degree of warm ischemia was small, so the analysis may have been underpowered to detect a significant effect. In today's practice, advances in surgical technique have lessened the importance of warm ischemia time as a predictor of patient and graft survival.
We had previously demonstrated that functional performance is significantly improved after liver transplantation.20 Because we were able to perform our analysis of health-related quality of life only on a subset of 75 recipients, these results are difficult to meaningfully interpret. Quality of life may be related to risk factors such as donor age and weight, but this was not observed in our present study.
Predicting posttransplant outcomes has become increasingly important in light of the worsening organ shortage. Several groups have attempted to identify preoperative and operative factors that predict survival after transplantation. Markmann and colleagues21 demonstrated that lack of immediate bile production by the graft, platelet transfusion of 20 U or more, and recipient urine output of 2.0 mL/kg per hour or less portend a poor outcome after liver transplantation. In addition, they modeled posttransplant survival on the basis of 4 pretransplant variables (recipient age, mechanical ventilation, dialysis, and retransplantation).22 Ghobrial and colleagues23 evaluated patient survival in a subset of 500 transplant recipients with hepatitis C and found that recipient age, UNOS status, donor sex, and creatinine level significantly predicted patient survival.
By incorporating hypothetical hazard values into a Cox proportional hazards regression model, we showed the cumulative adverse effects of increasing donor age, CIT, and UNOS status on graft survival. We also demonstrated similar findings for increasing donor age, CIT, and urgent UNOS status on patient survival. From these findings, we can develop preliminary models of pretransplant characteristics to help predict posttransplant survival. In future studies we hope to confirm our model in the UNOS database and formulate a clinically relevant pretransplant model by using easily accessible variables that will accurately predict posttransplant graft and patient survival. Such a model could be used to make recipient-specific organ allocation decisions at the time of graft procurement.
Correspondence: Derek E. Moore, MD, MPH, Vanderbilt University Transplant Center, 801 Oxford House, Nashville, TN 37232-4753 (firstname.lastname@example.org).
Accepted for Publication: September 15, 2004.
Funding/Support: This research was supported by Roche Laboratories Inc, Nutley, NJ, and by grant 1 RO3 HS13036 from the Agency for Healthcare Research and Quality, Rockville, Md.
Previous Presentation: This study was presented in part at the Americas Hepato-Pancreato-Biliary Congress; February 28, 2003; Miami Beach, Fla.