Importance
Morphometric assessment has emerged as a strong predictor of postoperative morbidity and mortality. However, a gap exists in translating this knowledge to bedside decision making. We introduced a novel measure of patient-centered surgical risk assessment: morphometric age.
Objective
To investigate the relationship between morphometric age and posttransplant survival.
Design, Setting, and Participants
A retrospective cohort study of the medical records of 348 recipients of deceased-donor liver transplants (study population) and 3313 kidney donors and trauma patients (morphometric age control population). We assessed medical records for validated morphometric characteristics of aging (psoas area, psoas density, and abdominal aortic calcification). Using multivariate linear regression modeling stratified by sex, we modeled age in the control population as a function of these morphometric characteristics to create morphometric age equations. These models were then applied to the study population to determine each patient's morphometric age.
Data Extraction and Synthesis
All analytic steps related to measuring morphometric characteristics were performed using custom algorithms programmed into commercially available software. An independent observer confirmed all algorithm outputs. Trained assistants reviewed medical records to obtain patient characteristics.
Results
Cox proportional hazards regression modeling showed that morphometric age was a significant independent predictor of overall mortality (hazard ratio, 1.03 per morphometric year [95% CI, 1.02-1.04; P < .001]) after liver transplant. Chronologic age was not a significant covariate for survival (hazard ratio, 1.02 per year [95% CI, 0.99-1.04; P = .21]). Morphometric age stratified patients at high and low risk for mortality. For example, patients in the middle chronologic age tertile who jumped into the oldest morphometric tertile had worse outcomes than those who jumped into the youngest morphometric tertile (74.4% vs 93.2% survival at 1 year [P = .03]; 45.2% vs 75.0% at 5 years [P = .03]).
Conclusions and Relevance
Morphometric age correlated with mortality after liver transplant with better discrimination than chronologic age. Assigning a morphometric age to potential liver transplant recipients could improve prediction of postoperative mortality risk.
Among the most challenging jobs for a transplant clinician is deciding suitability for transplant. At the bedside, the clinician relies on a subjective patient assessment often termed the “eyeball test.” The results of this assessment are frequently communicated with phrases such as “The patient looks chronically ill and older than his/her stated age.” In an effort to measure the subjective eyeball test results objectively, previous studies1-3 have demonstrated that morphometric characteristics measured on computed tomographic scans predict surgical morbidity and mortality. Morphometric measures represent a powerful tool for risk prediction, but they are challenging to implement in a clinical setting. Bridging this gap is a critical step in bringing morphometric assessment to the bedside.
Patient age is often included in the objective assessment, but risk varies widely among patients of the same chronologic age.4 By identifying patients who morphometrically diverge from their chronologically aged peers, we seek to improve patient-centered surgical decision making. With this work, we combine the objective risk prediction provided by morphometric assessment with the intuitive sense of risk inherent in chronologic age to develop a novel measure for quantifying overall health: morphometric age.
Because use of a limited resource is inherent in liver transplantation, optimizing patient and graft outcomes is important.5,6 With this study, we investigated the relationship between morphometric age and posttransplant survival of the recipients. We hypothesized that morphometric age—a novel risk assessment tool—would correlate significantly with poor outcomes.
Previous work has described the morphometric data collection methods in detail.2 Briefly, the cross-sectional area and mean density in Hounsfield units (HU) of the left and right psoas muscles were measured at the level of the L4 vertebra. Abdominal aortic (AA) calcification was measured as the percentage of the total aortic wall area containing calcification in the infrarenal aorta from L1 to L3. Calcification was selected in a semiautomatic fashion as wall tissue with an HU value at least 25% greater than that of the aortic lumen. Any remaining calcification was captured with manual adjustments.
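The semiautomatic thresholding step described above can be sketched as follows. This is an illustrative numpy reimplementation under stated assumptions (the function name, array layout, and toy slice are ours), not the authors' MATLAB code.

```python
import numpy as np

def calcification_pct(hu_slice, wall_mask, lumen_hu):
    """Percentage of aortic wall area whose HU meets the semiautomatic
    threshold (at least 25% greater than the mean lumen HU).

    hu_slice  : 2D array of Hounsfield units for one axial slice
    wall_mask : boolean mask marking aortic wall pixels
    lumen_hu  : mean HU of the aortic lumen on the same slice
    """
    threshold = 1.25 * lumen_hu
    calcified = wall_mask & (hu_slice >= threshold)
    return 100.0 * calcified.sum() / wall_mask.sum()

# Toy 4-pixel "wall" with one calcified pixel; lumen HU = 200 -> threshold 250
hu = np.array([[180.0, 300.0], [190.0, 210.0]])
wall = np.ones_like(hu, dtype=bool)
print(calcification_pct(hu, wall, 200.0))  # one of four pixels -> 25.0
```

In practice the manual adjustments noted above would add or remove pixels from `calcified` before the percentage is computed.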
All analytic morphometric steps were completed using custom algorithms programmed in commercially available computational software (MATLAB, version 13.0; MathWorks). All algorithm outputs were visually confirmed by an image processor.
Determining Morphometric Age in a Control Population
The baseline morphometric characteristics of aging were determined using a pool of control patients consisting of 1624 potential kidney donors and 1689 randomly sampled trauma patients. Using multivariate linear regression, age was modeled as a function of the morphometric characteristics of interest (psoas area, psoas density, and AA calcification). Separate models were created for men and women as described previously.3 These models were then applied to the study population to determine each patient’s morphometric age.
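The sex-stratified modeling step can be sketched as an ordinary least-squares fit of age on the three morphometric characteristics, then applied to a new patient. This is a minimal sketch on synthetic data (the function names, coefficients, and sample values are hypothetical), not the authors' fitted equations.

```python
import numpy as np

def fit_morphometric_age(X, age):
    """OLS fit of age ~ psoas area + psoas density + AA calcification.
    X is (n, 3); returns 4 coefficients (intercept first)."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, age, rcond=None)
    return coef

def morphometric_age(coef, x):
    """Apply a fitted sex-specific model to one patient's measurements."""
    return coef[0] + coef[1:] @ np.asarray(x, dtype=float)

# Synthetic "control population" (illustration only): age falls with psoas
# area and density and rises with AA calcification, plus noise.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([rng.normal(2000, 400, n),   # psoas area, mm2
                     rng.normal(48, 10, n),      # psoas density, HU
                     rng.uniform(0, 20, n)])     # AA calcification, %
age = 90 - 0.01 * X[:, 0] - 0.5 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0, 5, n)

coef = fit_morphometric_age(X, age)
print(round(morphometric_age(coef, [1800, 45, 10]), 1))
```

Separate `coef` vectors would be fitted for men and women, mirroring the sex stratification described above.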
Our study population consisted of all adults (aged ≥18 years) who received liver transplants from deceased donors at the University of Michigan from January 1, 2000, through December 31, 2011, and who had a suitable 90-day preoperative computed tomographic scan. Patient factors that were collected included chronologic recipient age, sex, race, body mass index, preoperative serum albumin level, preoperative Model for End-Stage Liver Disease (MELD) score components, positive findings for hepatitis C virus, smoking in the year before the transplant, donor age, and presence of portal hypertension/cirrhosis, hepatocellular carcinoma, diabetes mellitus, and hypertension requiring medication.
The primary outcome measures for this study consisted of survival at 1 and 5 years after transplant. Mortality was ascertained by referencing the Social Security Master Death File.
Descriptive statistics were computed for the study cohort. Continuous variables were summarized by mean and standard deviation, and frequency tables were produced for categorical variables. Continuous variables were compared using a 2-sided t test, whereas categorical variables were compared using the Fisher exact test.
We determined the covariate-adjusted effect of morphometric age through standard survival analysis using a Cox proportional hazards regression model.7 Patients began follow-up at the time of liver transplant and continued until death or loss to or unavailability for follow-up, whichever occurred first. Multivariate logistic regression was used to determine the covariate-adjusted survival rates. After adjustment, patients were stratified into tertiles of morphometric age and chronologic age to compare each as a predictor of survival.
We then assessed the implication of age adjustment by comparing patients of the same chronologic age group who “jumped” into older and younger groups after morphometric age adjustment. For this analysis, we compared patients from the middle chronologic age tertile who jumped into the oldest or youngest morphometric age tertiles. These comparisons used a 2-sided t test or the Fisher exact test.
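The tertile-jump comparisons above rest on the Fisher exact test for 2x2 survival tables. A minimal stdlib sketch of the two-sided test (summing hypergeometric probabilities no larger than the observed table's) is shown below; the cell counts are back-calculated from the percentages reported in the Results and are illustrative.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]."""
    row1, col1, n = a + b, a + c, a + b + c + d
    def p(x):  # P(top-left cell = x) under the hypergeometric null
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    # Sum over all tables at least as extreme as the observed one
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs * (1 + 1e-9))

# 1-year survival: 39 "older-jump" patients (29 alive, 10 dead) vs
# 44 "younger-jump" patients (41 alive, 3 dead), per the Results
print(round(fisher_exact_two_sided(29, 10, 41, 3), 3))
```

A library routine such as `scipy.stats.fisher_exact` would normally be used instead; the hand-rolled version is shown only to make the test explicit.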
All analyses were performed using commercially available statistical software (SAS, version 9.1; SAS Institute, Inc). A 2-sided significance level of α = .05 was used for all analyses. This study was approved by the University of Michigan institutional review board with a waiver of informed consent for participants.
Overall, 785 adult patients underwent deceased-donor liver transplant from 2000 to 2011. Of these patients, 348 received a 90-day preoperative computed tomographic scan that included the regions of interest for morphometric assessment. These patients served as the study cohort.
Descriptive statistics of the study population, stratified by tertiles of chronologic age, are shown in Table 1. The mean (SD) chronologic age at transplant was 51.4 (10.0) years, and 215 patients (61.8%) were male. The mean (SD) donor age was 38.9 (16.6) years. The mean (SD) laboratory MELD score at the time of transplant was 18.7 (7.7). For male patients, the mean (SD) total psoas area was 2310.7 (629.3) mm2, mean (SD) psoas density was 48.8 (9.5) HU, and mean (SD) AA calcification was 5.9% (7.8%). Among the male patients, 133 (61.9%) had some degree of AA calcification. Among the female patients, the mean (SD) total psoas area was 1456.0 (413.3) mm2, mean (SD) psoas density was 49.1 (10.5) HU, and mean (SD) AA calcification was 4.7% (8.9%). Sixty-six female patients (49.6%) had some degree of AA calcification. After morphometric adjustment, we computed descriptive statistics based on tertiles of morphometric age, as shown in Table 2. Overall, 284 patients (81.6%) had a morphometric age greater than their chronologic age.
We used the Cox proportional hazards regression model to determine whether morphometric age is a significant independent predictor of survival. All available covariates were entered into the model, and the subset of adjustment covariates was chosen by backward selection. We used morphometric and chronologic ages as continuous variables in the model. The model showed that morphometric age is a significant independent predictor (hazard ratio [HR], 1.03 per morphometric year [95% CI, 1.02-1.04; P < .001]), as were hepatitis C virus positivity (1.93 [1.31-2.84; P = .001]) and female sex (1.68 [1.13-2.50; P = .01]). Diabetes mellitus was nearly significant (HR, 1.48 [95% CI, 0.99-2.20; P = .053]). Chronologic age was not a significant covariate (HR, 1.02 per year [95% CI, 0.99-1.04; P = .21]) but was kept in the model to compare its effect with that of morphometric age. All other covariates, including individual MELD components and donor age, were not significant and were not retained in the final model.
Figure 1 shows the covariate-adjusted HR8 for morphometric and chronologic ages across their observed ranges. The reference value is the median of each age, that is, 53.1 years for chronologic age and 69.7 years for morphometric age. The HR for a patient at the 75th percentile of morphometric age was 1.50 compared with the median morphometric age while holding the other model covariates equal. In contrast, a patient aged 58.4 years (75th percentile) has an HR of 1.08 compared with a patient of median chronologic age.
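Because age enters the Cox model log-linearly, a per-year HR compounds multiplicatively over an age gap: the HR for a difference of Δ years is exp(βΔ), that is, the per-year HR raised to the Δ power. A small worked example using the per-year HRs reported above (the helper function name is ours):

```python
from math import exp, log

def hr_over_years(hr_per_year, delta_years):
    """Hazard ratio implied by a log-linear Cox covariate over a gap of
    delta_years: exp(beta * delta) = hr_per_year ** delta_years."""
    return exp(log(hr_per_year) * delta_years)

# Per-year HRs from the Results: 1.03 (morphometric), 1.02 (chronologic)
print(round(hr_over_years(1.03, 10), 2))           # 10 morphometric years -> 1.34
print(round(hr_over_years(1.02, 58.4 - 53.1), 2))  # 75th percentile vs median chronologic age -> 1.11
```

The second value is close to, though not identical to, the adjusted 1.08 read from Figure 1; a smoothed adjusted curve can differ slightly from this simple point-estimate calculation.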
Three hundred forty-eight patients reached the 1-year follow-up, with an 85.1% survival rate. Patients whose morphometric age was greater than their chronologic age had significantly lower 1-year survival rates than those whose morphometric age was less than their chronologic age (83.1% vs 93.8% [P = .03]).
Multivariate logistic regression was then used with 1-year mortality as the binary outcome. We entered all available covariates into the model, and the subset of adjustment covariates was chosen by backward selection. Morphometric age was a significant predictor (odds ratio [OR], 1.04 [95% CI, 1.03-1.06; P < .001]), as was female sex (1.83 [1.00-3.34; P = .049]). Chronologic age was not significant (OR, 1.03 [95% CI, 0.99-1.07; P = .15]), nor was hepatitis C virus positivity (1.81 [0.94-3.50; P = .08]). The area under the receiver operating characteristic curve for this model was 0.75. Figure 2A shows the covariate-adjusted 1-year survival rate for patients stratified by tertiles of morphometric and chronologic age. The adjusted survival rate of the chronologically youngest tertile was 90.4% compared with 78.9% for the chronologically oldest, and it was 94.0% for the morphometrically youngest compared with 74.1% for the morphometrically oldest tertile.
Two hundred twenty-seven patients reached 5-year follow-up, with a 58.6% survival rate. Patients whose morphometric age was greater than their chronologic age (n = 190) had a significantly lower 5-year survival rate than those whose morphometric age was less than their chronologic age (54.7% vs 78.4% [P = .01]).
Multivariate logistic regression showed that morphometric age was a significant predictor (OR, 1.03 [95% CI, 1.02-1.06; P < .001]), as were female sex (2.42 [1.32-4.43; P = .004]) and hepatitis C virus positivity (1.85 [1.03-3.31; P = .04]). Chronologic age was not significant (OR, 1.01 [95% CI, 0.98-1.04; P = .66]). The area under the receiver operating characteristic curve for this model was 0.70. Figure 2B shows the covariate-adjusted 5-year survival rate for patients stratified by tertiles of morphometric and chronologic age. The youngest chronologic tertile had an adjusted survival rate of 65.4% compared with 55.6% for the oldest chronologic tertile, whereas the morphometrically youngest tertile had 74.4% adjusted survival compared with 46.5% for the morphometrically oldest tertile.
Clinical Implications of Morphometric Age Adjustment
We then assessed the clinical implications of morphometric age adjustment for patients within the middle tertile of chronologic age. The chronologic age of these patients ranged from 49.7 to 56.7 years. Thirty-three patients in the middle chronologic tertile (28.4%) remained in the middle morphometric tertile. Patients in the middle chronologic tertile who jumped into the oldest morphometric tertile (n = 39 [33.6%]) had significantly inferior outcomes compared with patients in the middle chronologic tertile who jumped into the youngest morphometric tertile (44 patients [37.9%]) (74.4% vs 93.2% survival at 1 year [P = .03]; 45.2% vs 75.0% at 5 years [P = .03]) (Figure 3). We found no significant difference in chronologic age (53.4 vs 53.2 years [P = .59]) or donor age (42.2 vs 41.0 years [P = .73]) between these groups of patients.
With this study, we introduce a novel approach to the quantification of postoperative mortality risk after liver transplant: morphometric age. Serving as an intuitive surrogate for the overall health of the patient, morphometric age does not use standard patient-level variables but rather predicts surgical risk based on objective patient measures gleaned from cross-sectional imaging. In this way, we attempt to quantify the subjective findings of the eyeball test. Our data suggest that morphometric age is correlated with risk of postoperative mortality. For example, patients who move to a morphometric age tertile older than their chronologic age have a 1-year mortality risk nearly 3 times that of patients who move to a younger morphometric tertile.
The body of literature supporting the predictive ability of morphometric assessment is growing rapidly in surgical forums.1,2,9-13 Evidence suggests that sarcopenic patients have poorer outcomes compared with their chronologically aged peers.1-3 Measurement of core muscle size and density and AA calcification in these patients serves as a surrogate for their overall health, which was previously unquantifiable. As morphometric assessment continues to gain traction, these findings need to be translated into the clinical arena. Morphometric age may help to bridge this gap between investigation and clinical implementation. By combining morphometrics with an easily understood measure based on age, we have created a metric that is broadly applicable from the emergency department to the surgical clinic. Furthermore, morphometrics can add value to pretransplant cross-sectional imaging.
The findings of our study must be considered in light of several limitations. First, our morphometric age calculation requires that a given patient has undergone cross-sectional imaging before the clinic visit or transplant. Also, our single-center experience does not establish generalizability to a larger transplant population but introduces the concept of this novel risk model. Embedded in the methods of our model is a comparison of liver transplant patients with a predetermined healthy population of living kidney donors and trauma patients. Although we believe that this method is the best for cross comparison, it could also limit the generalizability of our model. Age is a complex physiological process that cannot be distilled into 3 morphometric measures; thus, further work is needed to augment the variables that inform morphometric age.
Morphometric assessment in organ transplantation has several possible translational applications. As previously discussed, the broadest lies in quantifying suitability for the transplant procedure. Morphometric assessment can be brought to the bedside to counsel patients on the risks of surgery. Morphometric age may also inform utility-based allocation schemes.5 Age-based allocation policy disfavors physiologically robust older candidates while failing to screen out patients who are younger but physiologically fragile. When morphometric age is added to risk prediction models, chronologic age falls out of the model as independently associated with mortality. Despite the inclusion of morphometric age in risk models, chronologic age bias will likely persist. By increasing recognition of older adults who are physiologically robust, we hope that morphometric age may add a dimension of risk prediction, helping to mitigate age-related bias.
By leveraging the power of perioperative computed tomographic imaging, we suggest that assigning a morphometric age to potential liver transplant recipients could improve prediction of postoperative mortality risk. Quantification of the eyeball test seems possible in the era of extensive cross-sectional imaging. One day, cross-sectional images may become part of the standard risk assessment.
Accepted for Publication: September 18, 2013.
Corresponding Author: Michael J. Englesbe, MD, Morphomics Analysis Group, Department of Surgery, University of Michigan Medical School, 2926A Taubman Center, 1500 E Medical Center Dr, Ann Arbor, MI 48109 (email@example.com).
Published Online: February 5, 2014. doi:10.1001/jamasurg.2013.4823.
Author Contributions: Drs Waits and Englesbe had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Study concept and design: Waits, Harbaugh, Sonnenday, Sullivan, Wang, Englesbe.
Acquisition of data: Waits, Kim, Terjimanian, Tishberg, Sullivan, Wang.
Analysis and interpretation of data: Waits, Terjimanian, Tishberg, Sheetz, Sonnenday, Wang, Englesbe.
Drafting of the manuscript: Waits, Kim, Terjimanian, Tishberg, Sheetz, Englesbe.
Critical revision of the manuscript for important intellectual content: Waits, Terjimanian, Harbaugh, Sonnenday, Sullivan, Wang, Englesbe.
Statistical analysis: Waits, Terjimanian, Sheetz.
Obtained funding: Sonnenday, Wang, Englesbe.
Administrative, technical, or material support: Waits, Kim, Tishberg, Sonnenday, Sullivan, Wang, Englesbe.
Study supervision: Waits, Terjimanian, Sonnenday, Englesbe.
Conflict of Interest Disclosures: None reported.
Funding/Support: This study was supported by grant K08 DK0827508 from the National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, and by Blue Cross/Blue Shield of Michigan Foundation (Dr Englesbe).
Role of the Sponsor: The funding source had no role in the design and conduct of the study; collection, management, analysis, or interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.
References
et al. Sarcopenia and mortality after liver transplantation. J Am Coll Surg. 2010;211(2):271-278.
et al. Analytic morphomics, core muscle size, and surgical outcomes. Ann Surg. 2012;256(2):255-261.
et al; Michigan Analytic Morphomics Group (MAMG). Frailty, core muscle size, and mortality in patients undergoing open abdominal aortic aneurysm repair. J Vasc Surg. 2011;53(4):912-917.
NG. Effects of age and nutritional status on surgical outcomes in head and neck cancer. Ann Surg. 1988;207(3):267-273.
et al. Survival benefit-based deceased-donor liver allocation. Am J Transplant. 2009;9(4, pt 2):970-981.
DE. Evidence-based development of liver allocation: a review. Transplant Int. 2011;24(10):965-972.
DR. Regression models and life-tables. J R Stat Soc B. 1972;34(2):187-222.
et al. Decreased core muscle size is associated with worse patient survival following esophagectomy for cancer. Dis Esophagus. 2013;26(7):716-722.
PD, van Vledder, et al. Sarcopenia negatively impacts short-term outcomes in patients undergoing hepatic resection for colorectal liver metastasis. HPB (Oxford). 2011;13(7):439-446.
et al. Impact of sarcopenia on outcomes following resection of pancreatic adenocarcinoma. J Gastrointest Surg. 2012;16(8):1478-1486.
VE. Sarcopenia is associated with postoperative infection and delayed recovery from colorectal cancer resection surgery. Br J Cancer. 2012;107(6):931-936.
et al. Abdominal aortic calcification and surgical outcomes in patients with no known cardiovascular risk factors. Ann Surg. 2012;257(4):774-781.