Evaluation of Access to Hospitals Most Ready to Achieve National Accreditation for Rectal Cancer Treatment | Colorectal Cancer | JAMA Surgery | JAMA Network
Figure 1.  Hospital Classification Schema

Volume and process standard thresholds for hospital categorization.

Figure 2.  Distance Traveled by Hospital Group

Distance traveled by patients to high-volume/high-adherence (A), high-volume/low-adherence (B), low-volume/high-adherence (C), and low-volume/low-adherence (D) hospitals.

Table 1.  Hospitals Meeting Threshold for National Accreditation Program for Rectal Cancer Adherence by Process Standard
Table 2.  Hospital Characteristics by Hospital Group
Table 3.  Patient Characteristics by Hospital Group
1. Monson JR, Probst CP, Wexner SD, et al; Consortium for Optimizing the Treatment of Rectal Cancer (OSTRiCh). Failure of evidence-based cancer care in the United States: the association between rectal cancer treatment, cancer center volume, and geography. Ann Surg. 2014;260(4):625-631. doi:10.1097/SLA.0000000000000928
2. American College of Surgeons. National Accreditation Program for Rectal Cancer. What is the NAPRC? https://www.facs.org/quality-programs/cancer/naprc. Accessed February 1, 2018.
3. Telem DA, Talamini M, Altieri M, Yang J, Zhang Q, Pryor AD. The effect of national hospital accreditation in bariatric surgery on perioperative outcomes and long-term mortality. Surg Obes Relat Dis. 2015;11(4):749-757. doi:10.1016/j.soard.2014.05.012
4. Nicholas LH, Dimick JB. Bariatric surgery in minority patients before and after implementation of a centers of excellence program. JAMA. 2013;310(13):1399-1400. doi:10.1001/jama.2013.277915
5. Brady JT, Xu Z, Scarberry KB, et al; Consortium for Optimizing the Treatment of Rectal Cancer (OSTRiCh). Evaluating the current status of rectal cancer care in the US: where we stand at the start of the Commission on Cancer’s National Accreditation Program for Rectal Cancer. J Am Coll Surg. 2018;226(5):881-890. doi:10.1016/j.jamcollsurg.2018.01.057
6. Gabriel E, Thirunavukarasu P, Al-Sukhni E, Attwood K, Nurkin SJ. National disparities in minimally invasive surgery for rectal cancer. Surg Endosc. 2016;30(3):1060-1067. doi:10.1007/s00464-015-4296-5
7. Joseph DA, Johnson CJ, White A, Wu M, Coleman MP. Rectal cancer survival in the United States by race and stage, 2001 to 2009: findings from the CONCORD-2 study. Cancer. 2017;123(suppl 24):5037-5058. doi:10.1002/cncr.30882
8. Guillem JG, Díaz-González JA, Minsky BD, et al. cT3N0 rectal cancer: potential overtreatment with preoperative chemoradiotherapy is warranted. J Clin Oncol. 2008;26(3):368-373. doi:10.1200/JCO.2007.13.5434
9. Bilimoria KY, Stewart AK, Winchester DP, Ko CY. The National Cancer Data Base: a powerful initiative to improve cancer care in the United States. Ann Surg Oncol. 2008;15(3):683-690. doi:10.1245/s10434-007-9747-3
10. Commission on Cancer Quality Measures. Chicago, IL: American College of Surgeons; 2016.
11. American College of Surgeons Commission on Cancer. The National Accreditation Program for Rectal Cancer Standards Manual. Chicago, IL: American College of Surgeons; 2017.
12. Lee L, Dietz DW, Fleming FJ, et al. Accreditation readiness in US multidisciplinary rectal cancer care: a survey of OSTRiCh member institutions. JAMA Surg. 2018;153(4):388-390.
13. Xu Z, Becerra AZ, Justiniano CF, et al. Is the distance worth it? Patients with rectal cancer traveling to high-volume centers experience improved outcomes. Dis Colon Rectum. 2017;60(12):1250-1259. doi:10.1097/DCR.0000000000000924
14. Archampong D, Borowski D, Wille-Jørgensen P, Iversen LH. Workload and surgeon’s specialty for outcome after colorectal cancer surgery. Cochrane Database Syst Rev. 2012;(3):CD005391.
15. American College of Surgeons. Charlson-Deyo score. http://ncdbpuf.facs.org/content/charlsondeyo-comorbidity-index. Published 2016. Accessed January 10, 2019.
16. Dimick J, Ruhter J, Sarrazin MV, Birkmeyer JD. Black patients more likely than whites to undergo surgery at low-quality hospitals in segregated regions. Health Aff (Millwood). 2013;32(6):1046-1053. doi:10.1377/hlthaff.2011.1365
17. Livingston EH, Burchell I. Reduced access to care resulting from centers of excellence initiatives in bariatric surgery. Arch Surg. 2010;145(10):993-997. doi:10.1001/archsurg.2010.218
18. Donabedian A. The quality of care: how can it be assessed? JAMA. 1988;260(12):1743-1748. doi:10.1001/jama.1988.03410120089033
19. Huang LC, Tran TB, Ma Y, Ngo JV, Rhoads KF. Factors that influence minority use of high-volume hospitals for colorectal cancer care. Dis Colon Rectum. 2015;58(5):526-532. doi:10.1097/DCR.0000000000000353
20. Wong RS, Vikram B, Govern FS, et al. National Cancer Institute’s Cancer Disparities Research Partnership Program: experience and lessons learned. Front Oncol. 2014;4:303. doi:10.3389/fonc.2014.00303
21. Luckenbaugh AN, Miller DC, Ghani KR. Collaborative quality improvement. Curr Opin Urol. 2017;27(4):395-401. doi:10.1097/MOU.0000000000000404
22. Haas S, Gawande A, Reynolds ME. The risks to patient safety from health system expansions. JAMA. 2018;319(17):1765-1766. doi:10.1001/jama.2018.2074
23. Hollenbeck BK, Daignault S, Dunn RL, Gilbert S, Weizer AZ, Miller DC. Getting under the hood of the volume-outcome relationship for radical cystectomy. J Urol. 2007;177(6):2095-2099. doi:10.1016/j.juro.2007.01.153
24. Kowalski C, Lee SY, Ansmann L, Wesselmann S, Pfaff H. Meeting patients’ health information needs in breast cancer center hospitals—a multilevel analysis. BMC Health Serv Res. 2014;14:601. doi:10.1186/s12913-014-0601-6
    Original Investigation
    February 20, 2019

    Evaluation of Access to Hospitals Most Ready to Achieve National Accreditation for Rectal Cancer Treatment

    Author Affiliations
    • 1University of Michigan Medical School, Ann Arbor
    • 2Center for Healthcare Outcomes and Policy, University of Michigan, Ann Arbor
    • 3Department of Surgery, University of Michigan, Ann Arbor
    JAMA Surg. 2019;154(6):516-523. doi:10.1001/jamasurg.2018.5521
    Key Points

    Question  How do outcomes of hospitals eligible for the American College of Surgeons National Accreditation Program for Rectal Cancer compare with those of hospitals less likely to be accredited?

    Findings  This cohort study of 1315 American College of Surgeons Commission on Cancer–accredited hospitals found that those most prepared for accreditation are usually academic institutions with the best survival outcomes. These hospitals more often serve affluent populations.

    Meaning  The current standards and scope of the National Accreditation Program for Rectal Cancer may not reach the hospitals and patients most in need of improvement and could exacerbate disparities in access to high-quality care; quality improvement interventions and redirection of socioeconomically disadvantaged patients to high-quality accredited institutions may mitigate this risk.

    Abstract

    Importance  The American College of Surgeons National Accreditation Program for Rectal Cancer (NAPRC) promotes multidisciplinary care to improve oncologic outcomes in rectal cancer. However, accreditation requirements may be difficult to achieve for the lowest-performing institutions. Thus, it is unknown whether the NAPRC will motivate care improvement in these settings or widen disparities.

    Objectives  To characterize hospitals’ readiness for accreditation and identify differences in the patients cared for in hospitals most and least prepared for accreditation.

    Design, Setting, and Participants  A total of 1315 American College of Surgeons Commission on Cancer–accredited hospitals in the National Cancer Database from January 1, 2011, to December 31, 2015, were sorted into 4 cohorts, organized by high vs low volume and adherence to process standards, and patient and hospital characteristics and oncologic outcomes were compared. The patients included those who underwent surgical resection with curative intent for rectal adenocarcinoma, mucinous adenocarcinoma, or signet ring cell carcinoma. Data analysis was performed from November 2017 to January 2018.

    Exposures  Hospitals’ readiness for accreditation, as determined by their annual resection volume and adherence to 5 available NAPRC process standards.

    Main Outcomes and Measures  Hospital characteristics, patient sociodemographic characteristics, and 5-year survival by hospital.

    Results  Among the 1315 included hospitals, 38 (2.9%) met proposed thresholds for all 5 NAPRC process standards and 220 (16.7%) met the threshold on 4 standards. High-volume hospitals (≥20 resections per year) tended to be academic institutions (67 of 104 [64.4%] vs 159 of 1211 [13.1%]; P = .001), whereas low-volume hospitals (<20 resections per year) tended to be comprehensive community cancer programs (530 of 1211 [43.8%] vs 28 of 104 [26.9%]; P = .001). Patients in low-volume hospitals were more likely to be older (11 429 of 28 076 [40.7%] vs 4339 of 12 148 [35.7%]; P < .001) and have public insurance (13 054 of 28 076 [46.5%] vs 4905 of 12 148 [40.4%]; P < .001). Low-adherence hospitals were more likely to care for black and Hispanic patients (3554 of 20 647 [17.2%] vs 1980 of 19 577 [10.1%]; P < .001). On multivariable Cox proportional hazards model regression, high-volume hospitals had better 5-year survival outcomes than low-volume hospitals (hazard ratio, 0.99; 95% CI, 0.99-1.00; P < .001), but there was no significant survival difference by hospital process standard adherence.

    Conclusions and Relevance  Hospitals least likely to receive NAPRC accreditation tended to be community institutions with worse survival outcomes that serve patients of lower socioeconomic position. To avoid exacerbating disparities in access to high-quality rectal cancer care, these findings suggest that the NAPRC should enable access for socioeconomically disadvantaged patients and engage hospitals not yet meeting accreditation benchmarks in quality improvement.

    Introduction

    Despite well-established evidence-based guidelines, marked shortcomings remain in the quality of rectal cancer care in the United States.1 Aiming to reduce unwanted variation in care practices and improve multidisciplinary engagement in rectal cancer care, the American College of Surgeons Commission on Cancer has begun implementation of a National Accreditation Program for Rectal Cancer (NAPRC).2 Like other nationally endorsed accreditation programs, such as the National Cancer Institute cancer center designation or bariatric centers of excellence, the NAPRC intends to improve outcomes by certifying process standards in their member institutions. Some of these accreditation programs have faced controversy around their uncertain effect on access to care.3 In the case of the bariatric accreditation program, efforts to improve patient safety resulted in decreased access to bariatric surgery for nonwhite Medicare beneficiaries.4

    The NAPRC aims to improve the quality of rectal cancer care on a national scale, but it is not yet clear how accreditation might affect patients’ access to high-quality care in the hospitals they are most likely to use. Furthermore, nationwide data suggest that only slightly more than half (56.3%) of patients with rectal cancer in the United States currently receive guideline-concordant care at the adherence levels specified by the NAPRC.5 Still, the characteristics of hospitals capable of achieving such levels of adherence remain unknown, as does the effect of the NAPRC on patients’ access to quality care. The availability of NAPRC designation could motivate improvement in the delivery of guideline-concordant care across the spectrum of institutions. However, if accreditation is achieved primarily in high-volume specialty institutions already providing high-quality care, the quality of care at unaccredited hospitals may stagnate or worsen. Recognizing that the quality of rectal cancer care is associated with patients’ socioeconomic position,6,7 the NAPRC could have the unintended consequence of widening disparities and limiting access to high-quality care for certain patient populations if, as with the bariatric accreditation program, it favors already high-performing institutions.

    In this study, we first modeled each hospital’s readiness for accreditation, according to validated measures of quality, including surgical volume and adherence to the NAPRC-recommended rectal cancer process measures that can be ascertained in National Cancer Database (NCDB) data. We then compared patient characteristics, hospital characteristics, and outcomes between hospitals, based on their procedure volume and process adherence. We hypothesized that hospitals most ready for accreditation would tend to serve higher-resourced patient populations and that patients with socioeconomic disadvantage may have less access to these institutions. Understanding this potential source of disparity may inform the design and dissemination of the NAPRC to prevent unintended consequences.

    Methods
    Inclusion Criteria

    This was a retrospective cohort analysis in which we queried the NCDB Participant Use File from January 1, 2011, to December 31, 2015. Data analysis was conducted from November 2017 to January 2018. The NCDB Participant Use File captures patient data from Commission on Cancer–accredited hospitals,8 which account for approximately 70% of patients with cancer in the United States.9 This date range was selected to most accurately represent hospitals’ current readiness for accreditation. This study was deemed exempt from human subjects review by the institutional review board of the University of Michigan; data were deidentified.

    Analysis was limited to patients who underwent surgical resection with curative intent for adenocarcinoma, mucinous adenocarcinoma, or signet ring cell carcinoma of the rectum. Patients with missing data regarding chemotherapy or radiotherapy were excluded (n = 162), as were patients with incomplete clinical staging information (n = 9287), as the hospital’s adherence in care provided could not be established.

    Modeling Readiness for Accreditation

    We categorized hospitals according to both annual procedure volume and adherence to NAPRC-defined process measures. Although many of the NAPRC measures are not captured in available data registries, 3 pathologic measures (circumferential radial margin, proximal and distal margins, and tumor regression), as well as clinical staging, timing of definitive treatment, and carcinoembryonic antigen level, are available in the NCDB Participant Use File.5 In addition, the NCDB captures the Commission on Cancer standard for treatment of rectal cancer, which requires delivery of neoadjuvant chemoradiotherapy for clinical stage II or III disease or adjuvant chemoradiotherapy administered within 180 days postoperatively for pathologic stages II and III disease, for a total of 5 process standards captured in this database.10

    In accordance with the NAPRC’s requirements for accreditation and accreditation with contingency, we defined high adherence to be performance above the mandated threshold for at least 3 of the 5 measurable process standards (Table 1).11 Hospitals meeting 2 or fewer standards were designated as low-adherence institutions. These cutoffs were chosen to match those used in previous studies.5,12 These cutoffs also closely reflect the NAPRC accreditation standards. Programs may be accredited with contingency if they are found to be deficient in up to 5 of the 22 standards put forth in the National Accreditation Program for Rectal Cancer Standards Manual.11

    We computed mean hospital procedure volume as a structural measure of rectal cancer care quality, as it is well established that higher case volume is associated with superior outcomes.1,13,14 Hospitals with a mean of 20 or more surgical rectal cancer cases annually were considered high volume, and hospitals with fewer than 20 cases were categorized as low volume. This cutoff was chosen to be consistent with previous studies and reflects the observed distribution of case volume between institutions; this level also allowed for similar-size comparison groups.12

    Using the high and low adherence and volume assignments, hospitals were categorized into 4 groups: high volume/high adherence, high volume/low adherence, low volume/high adherence, and low volume/low adherence (Figure 1).
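    As a schematic illustration of this categorization (a hypothetical sketch in Python, not the authors’ actual Stata code; the function and variable names are our own), the 2 thresholds stated above (≥20 mean annual resections and ≥3 of the 5 measurable process standards) determine the 4 groups:

```python
def classify_hospital(mean_annual_volume: float, standards_met: int) -> str:
    """Assign a hospital to one of the 4 study groups.

    mean_annual_volume: mean rectal cancer resections per year
    standards_met: number of the 5 measurable NAPRC process standards
        met at or above the mandated adherence threshold
    """
    volume = "high volume" if mean_annual_volume >= 20 else "low volume"
    adherence = "high adherence" if standards_met >= 3 else "low adherence"
    return f"{volume}/{adherence}"

# Examples using the group means reported in the Results section
print(classify_hospital(27.9, 3))  # high volume/high adherence
print(classify_hospital(5.3, 3))   # low volume/high adherence
print(classify_hospital(6.2, 2))   # low volume/low adherence
```

    The boundary conventions (exactly 20 cases counted as high volume, exactly 3 standards counted as high adherence) follow the definitions in the preceding paragraphs.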

    Statistical Analysis

    Descriptive statistics were used to analyze adherence to the measurable NAPRC process standards across all hospitals. We then analyzed the characteristics of the hospitals and the patients they serve between the 4 hospital groups, using Fisher exact and χ2 tests for categorical variables, t test for continuous variables, and analysis of variance for multicategory comparisons of continuous data. We compared overall 5-year survival between groups of hospitals using multivariable Cox proportional hazards regression models, adjusting for a priori clinically relevant patient factors, including age, sex, race/ethnicity, and rectal cancer stage. Observations were censored according to NCDB data after the date of last contact. We used clustered SEs to account for clustering of outcomes within hospitals and considered a 2-tailed, unpaired P value <.05 to be significant. All statistical analyses were conducted using Stata, version 14 (StataCorp LP).

    Results
    Hospital Accreditation Readiness Classification

    We identified 1315 hospitals that performed a total of 40 224 rectal cancer resections meeting inclusion criteria. Within this group of hospitals, 38 (2.9%) met the thresholds for adherence to all 5 NAPRC measures, 220 (16.7%) met 4 measures, and 431 (32.8%) met 3 measures. The measures are reported by volume of cases per year in Table 1. The mean (SD) number of process measures observed across institutions was 2.6 (1.1), and the median was 3 (interquartile range, 5). Pathologic testing was the most commonly deficient process measure, and within that composite measure, the tumor regression measurement was most often missing.

    The mean (SD) rectal cancer surgical volume across all hospitals was 7.5 (8.0) cases per year. Of the 1211 low-volume hospitals (92.1%; <20 cases per year), 644 (53.2%) were classified as high adherence, meeting the threshold for 3 or more of the 5 NAPRC measures. Of the 104 high-volume hospitals (7.9%; ≥20 cases per year), 45 (43.3%) were classified as high adherence.

    Forty-five hospitals (3.4%) met criteria for designation as a high-volume/high-adherence institution, with a mean (SD) annual case volume of 27.9 (14.1). These hospitals met a mean (SD) of 3.3 (0.5) of 5 total process standards. In the high-volume/low-adherence group, there were 59 hospitals (4.5%) with a mean (SD) annual case volume of 28.2 (7.9) cases that met a mean (SD) of 1.7 (0.5) process standards. In the low-volume/high-adherence group, there were 644 hospitals (49.0%) with a mean (SD) annual case volume of 5.3 (4.4) cases that met 3.4 (0.6) process standards. In addition, there were 567 hospitals (43.1%) in the low-volume/low-adherence group, with a mean (SD) annual case volume of 6.2 (4.5) cases that met 1.6 (0.6) process standards (Figure 1).

    Hospital Characteristics by Accreditation Readiness

    High-volume hospitals were more likely than low-volume hospitals to be academic institutions (67 of 104 [64.4%] vs 159 of 1211 [13.1%]; P = .001), whereas low-volume hospitals tended to be comprehensive community cancer programs (530 of 1211 [43.8%] vs 28 of 104 [26.9%]; P = .001) (Table 2). Most of the 1315 hospitals were located in the South Atlantic (Washington, DC; Delaware; Florida; Georgia; Maryland; North Carolina; South Carolina; Virginia; West Virginia) (272 [20.7%]) and East North Central (Illinois, Indiana, Michigan, Ohio, Wisconsin) (265 [20.2%]) regions. There was a slight trend toward low-adherence hospitals clustering in southern regions.

    Patient Characteristics by Accreditation Readiness

    Low-volume hospitals served a higher proportion of older patients (11 429 of 28 076 [40.7%] vs 4339 of 12 148 [35.7%]; P < .001) and patients with public insurance (Medicaid and Medicare, 13 054 [46.5%] vs 4905 [40.4%]; P < .001). High-volume hospitals were more likely to serve patients with higher levels of education (4551 of 12 148 [37.5%] vs 8915 of 28 076 [31.8%]; P < .001) and private insurance (6438 [53.0%] vs 12 964 [46.2%]; P < .001). Patients with stage III or IV disease were more likely to be seen at high-volume hospitals (54.7% vs 48.3%; P < .001) (Table 3).

    Low-adherence hospitals were more likely to care for black and Hispanic patients (3554 [17.2%] vs 1980 [10.1%]; P < .001). Low-adherence hospitals were also most likely to serve patients from large cities (16 786 [81.3%] vs 14 241 [72.7%]; P < .001). There was no distinct pattern in patients’ comorbidity scores or income level between hospital groups. Finally, patients traveled nearly twice as far to be seen at high-volume hospitals compared with patients who traveled to low-volume hospitals (mean distance, 85.0 vs 39.5 km; P < .001) (Figure 2). Exclusions owing to missing data were somewhat less common in the high-volume/high-adherence hospitals (high-volume/high-adherence, 857 [13.3%]; high-volume/low-adherence, 1611 [19.3%]; low-volume/high-adherence, 3019 [17.6%]; low-volume/low-adherence, 3707 [21.0%]).

    Overall Survival by Accreditation Readiness

    Patients at hospitals with high volume/high adherence survived the longest and served as the reference group for the analysis. There was no significant difference in overall 5-year survival compared with patients in high-volume/low-adherence hospitals, with a hazard ratio (HR) of 1.02 (95% CI, 0.92-1.12; P = .42). Low-volume/high-adherence hospitals demonstrated an HR of 1.18 (95% CI, 1.08-1.28; P = .001). Low-volume/low-adherence hospitals demonstrated an HR of 1.21 (95% CI, 1.11-1.31; P < .001). These HRs reflect the 5-year mortality risk; they are reported for each group compared with the reference group and were adjusted for age, sex, race/ethnicity, Charlson-Deyo comorbidity score, and rectal cancer stage (eFigure and eTable in the Supplement). The Charlson-Deyo score indicates the number of comorbid conditions that a patient has, using only those found in the Charlson Comorbidity Score Mapping Table.15 Regarding relative survival between the other 3 groups of hospitals, compared with the high-volume/low-adherence hospitals, the low-volume/high-adherence hospitals demonstrated an HR of 1.15 (95% CI, 1.04-1.28; P = .006), and the low-volume/low-adherence hospitals demonstrated an HR of 1.19 (95% CI, 1.07-1.32; P = .001). The HR for the low-volume/low-adherence hospitals compared with the low-volume/high-adherence hospitals was 1.03 (95% CI, 0.97-1.10; P = .36). On multivariable Cox proportional hazards model regression, high-volume hospitals had better 5-year survival outcomes than low-volume hospitals (HR, 0.99; 95% CI, 0.99-1.00; P < .001), but there was no significant survival difference by hospital process standard adherence.

    Discussion

    This study has 3 key findings. First, the hospitals most prepared for accreditation are a small group of predominantly academic centers that serve a highly resourced patient population. Second, most patients with rectal cancer are cared for at low-volume or low-adherence hospitals, which are most often comprehensive community cancer programs serving patients with fewer socioeconomic resources; patients tend to travel shorter distances to receive care at these hospitals, which are less likely to be prepared for NAPRC accreditation. Third, mean 5-year survival is lowest among patients in low-volume/low-adherence hospitals, which are least likely to receive accreditation.

    Previous studies have found that many patients who undergo treatment for rectal cancer in the United States do not receive guideline-concordant care.5 Accordingly, in this study we found that only a minority of hospitals are currently well positioned to achieve the requirements for NAPRC accreditation. In a recent survey of hospitals’ readiness for accreditation, self-reported rates of adherence were consistent with our findings.12 Low-volume/low-adherence hospitals are unlikely to be accredited, and they have significantly worse survival compared with other hospital groups. However, they perform half of all rectal cancer operations and, as found in other analyses, serve a larger proportion of older, Medicare and/or Medicaid, and black and Hispanic patients, and patients who do not travel far for care.13,16 Whether the NAPRC can successfully improve the quality of care in these settings or induce selective referral to accredited institutions remains unknown.

    It is essential that we understand the possible mechanisms for care improvement in the setting of the NAPRC because of the considerable gap identified between the highest- and lowest-performing hospitals. Without efforts to improve access to high-quality care for patients who receive treatment at the lowest-performing hospitals, an accreditation program for high-performing institutions could easily exclude the places most in need of improvement. Failing to include these lower-volume, lower-adherence hospitals in the NAPRC will leave some of the most vulnerable patients with access to inferior care.17

    The findings of this study suggest 2 different strategies by which the NAPRC might successfully enable reductions in rectal cancer care disparities. Both structural determinants, such as case volume, and process measures, such as the NAPRC standards, contribute to the quality of care that a hospital can provide.18 Hospitals in the high-volume/low-adherence group, as well as the low-volume/high-adherence group, are compelling targets for focused quality improvement efforts. The first strategy is geared toward the high-volume centers. We believe that the current NAPRC standards are suited for improving care at hospitals with high volume but low guideline adherence. Promoting improved adherence to process standards, for example, through training sessions tailored to the departments responsible for meeting particular guidelines, could facilitate accreditation for these hospitals. Similar survival outcomes between this hospital group and the high-volume/high-adherence group provide further rationale for the NAPRC to promote process improvement and allow these institutions to earn accreditation.

    The second strategy for broader inclusion targets low-volume hospitals that already achieve high-level adherence to the measured process standards. This strategy is more challenging, but would likely make a greater contribution to expanding access to high-quality care, as the patients who receive treatment at these hospitals share sociodemographic characteristics with the underserved populations that seek care at low-volume/low-adherence hospitals. In addition to serving a high proportion of low-income, publicly insured, and racial minority patients, this hospital group provides care for the largest share of patients from rural areas. It is encouraging that low-volume/high-adherence institutions already deliver care concordant with multiple process standards despite their lower rectal cancer case volume, but this is tempered by their patients’ survival, which is shorter than that of patients at high-volume hospitals. The NAPRC might consider a strategy in which it encourages selective referral of clinically complex patients with rectal cancer receiving care at low-volume/low-adherence hospitals to the low-volume/high-adherence hospitals.19 In addition, partnering these low-volume hospitals with academic centers to coordinate care plans, for example, via regional collaboratives, could further encourage high-quality care.20,21

    Although there has been reasonable concern about the effect of expanding systems of care, proceeding deliberately with proper oversight for patient safety can mitigate these risks.22 Admittedly, these considerations are ambitious and may require resources and influences beyond those available within the proposed accreditation program.

    Limitations

    There are limitations to this study. First, owing to the observational retrospective nature of the NCDB, these findings may be biased by unmeasured confounding variables. However, this analysis is bolstered by the fact that it is conducted at the hospital level, which lessens the effect of differences between individual patients by pooling data into larger cohorts. Second, our process standards are an imperfect surrogate for the actual NAPRC standards. These 5 standards do not capture all relevant hospital quality attributes; however, they are the best available means of describing program quality from a large national database and have been used in studies authored by the architects of NAPRC to assess national trends in quality adherence in rectal cancer care.5 A more granular study of particular process measures and their effect on patient outcomes will be needed after the NAPRC is implemented. Third, the NCDB is not a comprehensive national database, as it includes only Commission on Cancer–accredited institutions.8 However, it is anticipated that nonaccredited hospitals would have even lower surgical volumes and lower rates of adherence to guidelines, and thus their exclusion is likely conservative. Although the findings can be generalized only to Commission on Cancer–accredited hospitals, NAPRC accreditation requires Commission on Cancer accreditation, and thus NCDB data cover the full set of hospitals relevant to these questions. Furthermore, we excluded patients with missing clinical staging, as their care could not be assessed for adherence to key measures, and exclusions were least common in the high-volume/high-adherence hospitals. Any bias from this difference would again be conservative, however, as their inclusion would exaggerate differences between groups.

    In addition, the survival analysis demonstrated that volume was a stronger predictor of survival than adherence to process standards. Volume is not included as an accreditation standard in the NAPRC but may be associated with a hospital’s ability to provide complex multidisciplinary care.23,24 The NAPRC has designated multiple clinical protocols beyond the process standards (eg, multidisciplinary tumor boards and quality-reporting systems) as criteria for accreditation. Our model included volume as a proxy for these unmeasured structural standards because they will be most readily achieved in specialty institutions with established, high-volume rectal cancer practices. Our findings therefore remain relevant to the NAPRC and to potential efforts to increase the inclusivity of the accreditation program.

    Conclusions

    If the NAPRC aims to make substantive progress in improving rectal cancer care in the United States, preserving access through broader accreditation appears to be necessary. Specifically, low-volume/low-adherence hospitals could selectively refer patients with more complex conditions to low-volume/high-adherence hospitals. In addition, high-volume hospitals should renew their commitment to process standards, and those with lower rates of adherence to these measures should be supported in improving the delivery of high-quality care to achieve accreditation. In this way, the NAPRC could improve access to high-quality rectal cancer care while maintaining a commitment to excellence.

    The NAPRC must hold institutions to a high standard; however, this goal should not come at the expense of preserving, and even expanding, access to high-quality care for socioeconomically disadvantaged patients. Efforts to improve the care of underserved populations will likely yield the largest improvements in survival and quality of life for patients with rectal cancer.

    Article Information

    Accepted for Publication: November 10, 2018.

    Corresponding Author: Scott E. Regenbogen, MD, MPH, University of Michigan, 1500 E Medical Center Dr, Taubman Center 2924F, Ann Arbor, MI 48109 (sregenbo@med.umich.edu).

    Published Online: February 20, 2019. doi:10.1001/jamasurg.2018.5521

    Author Contributions: Dr Regenbogen had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

    Concept and design: All authors.

    Acquisition, analysis, or interpretation of data: All authors.

    Drafting of the manuscript: Antunez, Kanters.

    Critical revision of the manuscript for important intellectual content: All authors.

    Statistical analysis: All authors.

    Supervision: Kanters, Regenbogen.

    Conflict of Interest Disclosures: None reported.

    Funding/Support: Ms Antunez was supported by National Institutes of Health (NIH) grant 1TL1TR002242-01. Dr Kanters was supported by NIH grant T32 HS000053-24. Dr Regenbogen was supported by a National Institute on Aging Mentored Career Development Award, K08-AG047252.

    Role of the Funder/Sponsor: The funding sources had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

    References
    1. Monson JR, Probst CP, Wexner SD, et al; Consortium for Optimizing the Treatment of Rectal Cancer (OSTRiCh). Failure of evidence-based cancer care in the United States: the association between rectal cancer treatment, cancer center volume, and geography. Ann Surg. 2014;260(4):625-631. doi:10.1097/SLA.0000000000000928
    2. American College of Surgeons. National Accreditation Program for Rectal Cancer. What is the NAPRC? https://www.facs.org/quality-programs/cancer/naprc. Accessed February 1, 2018.
    3. Telem DA, Talamini M, Altieri M, Yang J, Zhang Q, Pryor AD. The effect of national hospital accreditation in bariatric surgery on perioperative outcomes and long-term mortality. Surg Obes Relat Dis. 2015;11(4):749-757. doi:10.1016/j.soard.2014.05.012
    4. Nicholas LH, Dimick JB. Bariatric surgery in minority patients before and after implementation of a centers of excellence program. JAMA. 2013;310(13):1399-1400. doi:10.1001/jama.2013.277915
    5. Brady JT, Xu Z, Scarberry KB, et al; Consortium for Optimizing the Treatment of Rectal Cancer (OSTRiCh). Evaluating the current status of rectal cancer care in the US: where we stand at the start of the Commission on Cancer’s National Accreditation Program for Rectal Cancer. J Am Coll Surg. 2018;226(5):881-890. doi:10.1016/j.jamcollsurg.2018.01.057
    6. Gabriel E, Thirunavukarasu P, Al-Sukhni E, Attwood K, Nurkin SJ. National disparities in minimally invasive surgery for rectal cancer. Surg Endosc. 2016;30(3):1060-1067. doi:10.1007/s00464-015-4296-5
    7. Joseph DA, Johnson CJ, White A, Wu M, Coleman MP. Rectal cancer survival in the United States by race and stage, 2001 to 2009: findings from the CONCORD-2 study. Cancer. 2017;123(suppl 24):5037-5058. doi:10.1002/cncr.30882
    8. Guillem JG, Díaz-González JA, Minsky BD, et al. cT3N0 rectal cancer: potential overtreatment with preoperative chemoradiotherapy is warranted. J Clin Oncol. 2008;26(3):368-373. doi:10.1200/JCO.2007.13.5434
    9. Bilimoria KY, Stewart AK, Winchester DP, Ko CY. The National Cancer Data Base: a powerful initiative to improve cancer care in the United States. Ann Surg Oncol. 2008;15(3):683-690. doi:10.1245/s10434-007-9747-3
    10. Commission on Cancer Quality Measures. Chicago, IL: American College of Surgeons; 2016.
    11. American College of Surgeons Commission on Cancer. The National Accreditation Program for Rectal Cancer Standards Manual. Chicago, IL: American College of Surgeons; 2017.
    12. Lee L, Dietz DW, Fleming FJ, et al. Accreditation readiness in US multidisciplinary rectal cancer care: a survey of OSTRICH member institutions. JAMA Surg. 2018;153(4):388-390.
    13. Xu Z, Becerra AZ, Justiniano CF, et al. Is the distance worth it? patients with rectal cancer traveling to high-volume centers experience improved outcomes. Dis Colon Rectum. 2017;60(12):1250-1259. doi:10.1097/DCR.0000000000000924
    14. Archampong D, Borowski D, Wille-Jørgensen P, Iversen LH. Workload and surgeon’s specialty for outcome after colorectal cancer surgery. Cochrane Database Syst Rev. 2012;(3):CD005391.
    15. American College of Surgeons. Charlson-Deyo score. http://ncdbpuf.facs.org/content/charlsondeyo-comorbidity-index. Published 2016. Accessed January 10, 2019.
    16. Dimick J, Ruhter J, Sarrazin MV, Birkmeyer JD. Black patients more likely than whites to undergo surgery at low-quality hospitals in segregated regions. Health Aff (Millwood). 2013;32(6):1046-1053. doi:10.1377/hlthaff.2011.1365
    17. Livingston EH, Burchell I. Reduced access to care resulting from centers of excellence initiatives in bariatric surgery. Arch Surg. 2010;145(10):993-997. doi:10.1001/archsurg.2010.218
    18. Donabedian A. The quality of care. how can it be assessed? JAMA. 1988;260(12):1743-1748. doi:10.1001/jama.1988.03410120089033
    19. Huang LC, Tran TB, Ma Y, Ngo JV, Rhoads KF. Factors that influence minority use of high-volume hospitals for colorectal cancer care. Dis Colon Rectum. 2015;58(5):526-532. doi:10.1097/DCR.0000000000000353
    20. Wong RS, Vikram B, Govern FS, et al. National Cancer Institute’s Cancer Disparities Research Partnership Program: experience and lessons learned. Front Oncol. 2014;4:303. doi:10.3389/fonc.2014.00303
    21. Luckenbaugh AN, Miller DC, Ghani KR. Collaborative quality improvement. Curr Opin Urol. 2017;27(4):395-401. doi:10.1097/MOU.0000000000000404
    22. Haas S, Gawande A, Reynolds ME. The risks to patient safety from health system expansions. JAMA. 2018;319(17):1765-1766. doi:10.1001/jama.2018.2074
    23. Hollenbeck BK, Daignault S, Dunn RL, Gilbert S, Weizer AZ, Miller DC. Getting under the hood of the volume-outcome relationship for radical cystectomy. J Urol. 2007;177(6):2095-2099. doi:10.1016/j.juro.2007.01.153
    24. Kowalski C, Lee SY, Ansmann L, Wesselmann S, Pfaff H. Meeting patients’ health information needs in breast cancer center hospitals—a multilevel analysis. BMC Health Serv Res. 2014;14:601. doi:10.1186/s12913-014-0601-6