Figure.  Simple Means Comparison of Evaluation and Management Visits Between Clinicians Participating in the Oncology Care Model (OCM) vs Those Not Participating
Table 1.  Preperiod Practice Attributes Stratified by OCM Status
Table 2.  Difference-in-Differences Estimates for Drug Administrations, Drug Costs, and Overall Costs
Table 3.  Difference-in-Differences Estimates for Evaluation and Management and Hydration Services
Table 4.  Triple Differences Estimates for Statistically Significant Difference-in-Differences Estimates
1. Centers for Medicare & Medicaid Services. Oncology Care Model overview. Accessed April 21, 2020. https://innovation.cms.gov/files/slides/ocm-overview-slides.pdf
2. Colla CH, Lewis VA, Shortell SM, Fisher ES. First national survey of ACOs finds that physicians are playing strong leadership and ownership roles. Health Aff (Millwood). 2014;33(6):964-971. doi:10.1377/hlthaff.2013.1463
3. Cole AP, Krasnova A, Ramaswamy A, et al. Recommended cancer screening in accountable care organizations: trends in colonoscopy and mammography in the Medicare shared savings program. J Oncol Pract. 2019;15(6):e547-e559. doi:10.1200/JOP.18.00352
4. Kline RM, Bazell C, Smith E, Schumacher H, Rajkumar R, Conway PH. Centers for Medicare and Medicaid services: using an episode-based payment model to improve oncology care. J Oncol Pract. 2015;11(2):114-116. doi:10.1200/JOP.2014.002337
5. Centers for Medicare & Medicaid Services. Oncology Care Model. Accessed April 16, 2020. https://innovation.cms.gov/innovation-models/oncology-care
6. Kline RM, Muldoon LD, Schumacher HK, et al. Design challenges of an episode-based payment model in oncology: the Centers for Medicare & Medicaid Services Oncology Care Model. J Oncol Pract. 2017;13(7):e632-e645. doi:10.1200/JOP.2016.015834
7. Schumacher H. Funding cancer quality improvement: payer’s perspective. J Oncol Pract. 2015;11(3):180-181. doi:10.1200/JOP.2015.003913
8. McWilliams JM, Hatfield LA, Chernew ME, Landon BE, Schwartz AL. Early performance of accountable care organizations in Medicare. N Engl J Med. 2016;374(24):2357-2366. doi:10.1056/NEJMsa1600142
9. McWilliams JM, Hatfield LA, Landon BE, Hamed P, Chernew ME. Medicare spending after 3 years of the Medicare Shared Savings Program. N Engl J Med. 2018;379(12):1139-1149. doi:10.1056/NEJMsa1803388
10. McWilliams JM, Chernew ME, Landon BE, Schwartz AL. Performance differences in year 1 of pioneer accountable care organizations. N Engl J Med. 2015;372(20):1927-1936. doi:10.1056/NEJMsa1414929
11. Trombley MJ, Fout B, Brodsky S, McWilliams JM, Nyweide DJ, Morefield B. Early effects of an accountable care organization model for underserved areas. N Engl J Med. 2019;381(6):543-551. doi:10.1056/NEJMsa1816660
12. Kaufman BG, Spivack BS, Stearns SC, Song PH, O’Brien EC. Impact of accountable care organizations on utilization, care, and outcomes: a systematic review. Med Care Res Rev. 2019;76(3):255-290. doi:10.1177/1077558717745916
13. Cole AP, Krasnova A, Ramaswamy A, et al. Prostate cancer in the Medicare shared savings program: are accountable care organizations associated with reduced expenditures for men with prostate cancer? Prostate Cancer Prostatic Dis. 2019;22(4):593-599. doi:10.1038/s41391-019-0138-1
14. Shenolikar R, Ryan K, Shand B, Kane R. Impact of Oncology Care Model (OCM) on episode costs and performance revenues: considerations for oncology practices. J Clin Oncol. 2018;36(30 suppl):102. doi:10.1200/JCO.2018.36.30_suppl.102
15. Theroux HH, et al. Influence of pomalidomide and lenalidomide on total cost of care for Medicare beneficiaries with multiple myeloma under the Oncology Care Model (OCM). Blood. 2018;132(suppl 1):2252. doi:10.1182/blood-2018-99-116657
16. Parikh RB, Bekelman JE, Huang Q, Martinez JR, Emanuel EJ, Navathe AS. Characteristics of physicians participating in Medicare’s Oncology Care Model bundled payment program. J Oncol Pract. 2019;15(10):e897-e905. doi:10.1200/JOP.19.00047
17. Mendenhall MA, Dyehouse K, Hayes J, et al. Practice transformation: early impact of the oncology care model on hospital admissions. J Oncol Pract. 2018;14(12):JOP1800409. doi:10.1200/JOP.18.00409
18. Schleicher SM, Chaudhry B, Waynick CA, et al. The effect of guideline-concordant novel therapy use on meeting cost targets in OCM: results from a large community oncology network. J Clin Oncol. 2019;37(15 suppl):6635. doi:10.1200/JCO.2019.37.15_suppl.6635
19. Song A, Csik VP, Leader A, Maio V. The Oncology Care Model: oncology’s first foray away from volume and toward value-based care. Am J Med Qual. 2019;34(4):321-323. doi:10.1177/1062860618824016
20. Ennis RD, Parikh AB, Sanderson M, Liu M, Isola L. Interpreting Oncology Care Model data to drive value-based care: a prostate cancer analysis. J Oncol Pract. 2019;15(3):e238-e246. doi:10.1200/JOP.18.00336
21. Li S, Peng Y, Raskin L, et al. Variations in hospitalization and emergency department or observation (ED/OB) stays using the Oncology Care Model (OCM) methodology in Medicare data. J Clin Oncol. 2018;36(30 suppl):112. doi:10.1200/JCO.2018.36.30_suppl.112
22. McInnes S, Carrino CM, Shoemaker L. Frontline oncology care team primary palliative symptom guideline education, the Oncology Care Model, and emergency department visits. J Clin Oncol. 2018;36(34 suppl):143. doi:10.1200/JCO.2018.36.34_suppl.143
23. Hoverman JR, Taniguchi CB, Hayes J, Eagye K, Mann BB, Neubauer MA. Unraveling the high cost of end-of-life care: an Oncology Care Model experience. J Clin Oncol. 2019;37(15 suppl):11534. doi:10.1200/JCO.2019.37.15_suppl.11534
24. Perry M, Rudy-Tomczak K, Hines S. A process for improving patient survey scores in the Oncology Care Model (OCM). J Clin Oncol. 2018;36(30 suppl):222. doi:10.1200/JCO.2018.36.30_suppl.222
25. Abt Associates. Evaluation of the Oncology Care Model: performance period one. Revised December 2018. Accessed April 16, 2020. https://innovation.cms.gov/files/reports/ocm-secondannualeval-pp1.pdf
26. Clough JD, Kamal AH. Oncology Care Model: short- and long-term considerations in the context of broader payment reform. J Oncol Pract. 2015;11(4):319-321. doi:10.1200/JOP.2015.005777
27. Gutkin D, Zhao E, Phillips K, Cavaliere K, Brau H, Powell B. Trends related to program participation, implementation, best practices, challenges and resource requests among Oncology Care Model (OCM) participants. J Clin Oncol. 2018;36(15 suppl):6524. doi:10.1200/JCO.2018.36.15_suppl.6524
28. World Health Organization. International Classification of Diseases, Ninth Revision (ICD-9). World Health Organization; 1977.
29. Dusetzina SB, Keating NL. Mind the gap: why closing the doughnut hole is insufficient for increasing Medicare beneficiary access to oral chemotherapy. J Clin Oncol. 2016;34(4):375-380. doi:10.1200/JCO.2015.63.7736
30. McWilliams JM, Hatfield LA, Landon BE, Chernew ME. Spending Reductions in the Medicare Shared Savings Program: Selection or Savings? National Bureau of Economic Research; 2019. doi:10.3386/w26403
    Original Investigation
    Health Policy
    May 18, 2020

    Evaluation of Practice Patterns Among Oncologists Participating in the Oncology Care Model

    Author Affiliations
    • 1Data, Evidence & Insights, McKesson Life Sciences, The Woodlands, Texas
    • 2Department of Health Policy and Management, Tulane University, New Orleans, Louisiana
    • 3Program Outcomes, McKesson Specialty Health, The Woodlands, Texas
    • 4The US Oncology Network, The Woodlands, Texas
    • 5Value Based Care and Quality Programs, Texas Oncology, Dallas
    JAMA Netw Open. 2020;3(5):e205165. doi:10.1001/jamanetworkopen.2020.5165
    Key Points

    Question  Is the adoption of novel oncological treatment payment models associated with different care choices by oncologists?

    Findings  This nonrandomized controlled study comparing the care choices of oncologists who participated in the Oncology Care Model with those who did not found that the first year of the program was associated with less physician-administered drug use in prostate cancer, lower drug costs in lung and prostate cancer, fewer visits for patients with breast or colon cancer, and lower office-based costs in all cancers analyzed, but these potential savings were offset by program costs.

    Meaning  These findings suggest that the Oncology Care Model was associated with lower health care utilization in the first year of implementation.

    Abstract

    Importance  Health insurers reimburse clinicians in many ways, including the ubiquitous fee-for-service model and the emergent shared-savings models. Evidence on the effects of these emergent models in oncological treatment remains limited.

    Objectives  To analyze the early use and cost associations of a recent Medicare payment program, the Oncology Care Model (OCM), which included a shared savings–like component.

    Design, Setting, and Participants  This nonrandomized controlled study used a difference-in-differences approach on 2 years of data, from July 1, 2015, to June 30, 2017—1 year before and 1 year after launch of the OCM—to compare the differences between participating and nonparticipating practices, controlling for patient, clinician, and practice factors. Participation in the OCM began on July 1, 2016. Associations of participation with care use and cost were estimated for care directly managed by clinicians from a large network within their Medicare populations for breast, lung, colon, and prostate cancers. Data were analyzed from September 2019 to March 2020.

    Exposures  Participating practices were paid a monthly management fee of $160 per beneficiary and a potential risk-adjusted performance-based payment for eligible patients who received chemotherapy treatment, in addition to standard fee-for-service payments.

    Main Outcomes and Measures  Office visits, drug administrations, patient hydrations, drug costs, and total costs.

    Results  Monthly means data at the physician level were evaluated for 11 869 physician-months for breast cancers, 11 135 physician-months for lung cancers, 8592 physician-months for colon cancers, and 9045 physician-months for prostate cancers. Patients at OCM practices had a mean (SD) age of 63.4 (3.1) years, and a mean (SD) of 59% (7 percentage points) of their patients were women. Participation in the OCM was associated with less physician-administered prostate cancer drug use (difference, –0.29 [95% CI, –0.47 to –0.11] administrations per month, or 24.0% relative to the mean), translating to a mean of $706 (95% CI, –$1383 to –$29) less in drug costs per month. Monthly drug costs were also lower for lung cancer treatment, at $558 (95% CI, –$1173 to $58) less. Total costs were lower by 9.7% or $233 (95% CI, –$495 to $30) for breast cancer, 9.9% or $337 (95% CI, –$618 to –$55) for lung cancer, 14.2% or $385 (95% CI, –$780 to $10) for colon cancer, and 29.2% or $610 (95% CI, –$1095 to –$125) for prostate cancer; however, these differences were largely offset by program costs. Clinician visits were also lower, by 0.11 (95% CI, –0.20 to –0.01) visits per month, or 11.2%, among patients with breast cancer and by 0.19 (95% CI, –0.37 to –0.02) visits per month, or 14.4%, among patients with colon cancer.

    Conclusions and Relevance  These findings suggest that payment models with shared-savings components can be associated with fewer visits and lower costs in certain cancer settings in the first year, but the savings can be modest given the costs of program administration.

    Introduction

    The cost of cancer care is rapidly increasing. The National Institutes of Health estimated that costs for cancer care will increase by 27%, to $158 billion, from 2010 to 2020.1 Consequently, insurers and health care groups are increasingly participating in payment models in which the health care practitioners are accountable for patient costs. In the first national survey of 173 large physician groups known as accountable care organizations (ACOs), 72% of physician-led ACOs reported that they had participated in a risk-sharing contract.2 These programs appear to be broadly adoptable.3 In this environment, the Oncology Care Model (OCM) was created under the Patient Protection and Affordable Care Act to test novel Medicare payment models that drive quality and cost-efficiency4 and provides an opportunity to evaluate whether these models are associated with different physician care choices in oncological treatment.

    The OCM sought to drive care coordination, navigation, and treatment guidelines use for chemotherapy-based care.5 To achieve this, 2 new forms of payment incentives were introduced: (1) a $160 per beneficiary Monthly Enhanced Oncology Services payment for each 6-month episode beginning with chemotherapy, and (2) a Performance-Based Payment (PBP) predicated on cost efficiency.5 The PBP is based on an expected target cost per episode stemming from historical data trended to the performance period and adjusted based on the use of novel therapies, quality measures, and risk according to patient characteristics and geographic variation. As an example, an episode benchmark of $30 000 with a 4% savings target entails a $28 800 cost target. If quality-adjusted costs come to $25 000, then the practice could receive a PBP of up to the $3800 in savings achieved.1,6,7
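    The worked example above can be sketched in a few lines of code. This is an illustrative simplification only: the function name and flat structure are our own, and the actual OCM methodology also risk-adjusts the benchmark and conditions payment on quality scores.

```python
def pbp_savings(benchmark: float, savings_target_rate: float,
                actual_episode_cost: float) -> float:
    """Return the maximum performance-based payment for one episode.

    The cost target is the historical benchmark trimmed by the savings
    target; quality-adjusted spending below that target is the amount
    potentially shared back with the practice (never negative).
    """
    cost_target = benchmark * (1 - savings_target_rate)
    return max(0.0, cost_target - actual_episode_cost)


# The example from the text: a $30,000 benchmark with a 4% savings
# target yields a $28,800 cost target; $25,000 in quality-adjusted
# costs leaves up to $3,800 in shareable savings.
print(round(pbp_savings(30_000, 0.04, 25_000), 2))  # → 3800.0
```

    Note that spending above the cost target simply yields no PBP here; the OCM's 1-sided arrangement did not penalize overruns.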

    Shared-savings models outside of oncological treatment have been more rigorously evaluated. Early analysis of the Medicare Shared Savings Program found that savings did not offset outlays from payouts and unpenalized overruns by some participants,8 although mean spending reductions deepened with time for private practices.9 Elsewhere, the Pioneer program introduced 32 practice groups to a 2-sided model (ie, participants were paid for coming in below an agreed baseline and penalized for going above it) and was associated with 1.2% lower spending compared with a control group.10 The ACO Investment Model has been found to be associated with $10 less spending per beneficiary per month.11 Several studies have suggested that ACOs are associated with improved preventive and chronic care and less inpatient and emergency care.11,12

    In oncological care, ACOs have not been associated with differences in prostate cancer spending13 and have shown mixed results in screening, with no differences in mammography but higher colonoscopy rates.3 For the OCM specifically, a 2018 study14 simulated the benchmarks on Medicare data and found that the ratio of actual-to-predicted costs varied by tumor type. Another 2018 study15 found that within the first year of OCM implementation, episode costs were highly sensitive to the use of high-cost therapies. Taken together, these results suggest that tumor mix may drive success within the model. A cross-sectional analysis by Parikh et al16 found that oncologists participating in the OCM differed from nonparticipating oncologists in their demographic characteristics, care intensity, and exposure to alternative models in ways that practice-level analyses might meaningfully miss, motivating physician-level analysis. While there is some early suggestive evidence that the OCM altered hospitalization rates,17 along with other descriptive or non–peer-reviewed work,16,18-27 there remain few assessments of the OCM’s potential effects on care choices.14 This study evaluates whether OCM participation was associated with clinician behavior within the context of their practices.

    Methods
    Study Design, Population, and Setting

    This study was approved under the exemption criteria by the US Oncology institutional review board, which is recognized as the institutional review board of record by Tulane University, New Orleans, Louisiana. This study is reported following the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline for reporting observational studies.

    This analysis includes practices affiliated with a large community oncological practice network. This network includes more than 1400 oncologists affiliated with 30 practices spread across approximately 400 locations in 17 states, of which 14 practices are participants in the OCM and 16 practices are not. Practices decide independently whether to participate. All oncological care performed at the OCM participant practices was considered the treatment group, while all oncological care performed at nonparticipating practices was considered the control group.

    While these practices were not randomized, they had similar care patterns before joining the model (Figure). To formally test whether participating clinicians were treating different patient populations than nonparticipating clinicians prior to participation (eg, selection bias), we conducted 2 analyses. The first analysis assesses whether differences in patient diagnoses and payer mixes are associated with participation. Primary diagnoses were grouped using International Classification of Diseases, Ninth Revision28 neoplasm categories (International Statistical Classification of Diseases and Related Health Problems, Tenth Revision codes were cross-walked to these groups if used). Primary insurance type was also available in the data. Monthly physician-level preperiod means were generated and analyzed for each of the following categories: Medicare, private insurance, Medicaid, Medicare Advantage, breast cancer, genitourinary cancer, lymphatic cancer, lung cancer, colon cancer, prostate cancer, respiratory cancer (excluding lung cancer), and digestive cancer (excluding colon cancer). The second analysis focuses on other control variables, including monthly physician-level patient means (eg, age, sex, chemotherapy administrations), total unique patients seen, and total number of physicians at the practice. Exact specifications can be found in the eAppendix in the Supplement.

    The OCM policy in this setting is otherwise well suited to examine the associations of these models generally. The practices evaluated here exhibit many features that make them compelling comparators: they are part of the same network and thus pay similar prices for therapies and share common treatment pathways (ie, their care choices are unlikely to be driven by different drug prices or preferences), they have the same technologies by way of electronic health records and financial management systems and thus face the same information, they host several networking conferences throughout the year, and they share some common back-office staff. These latter attributes may lead to spillovers between participant and nonparticipant practices, which would downward bias the estimates and suggest that the measured associations might represent lower bounds.

    Data and Study Population

    This study used reimbursement data from the network (eFigure 1 in the Supplement). Included patients were those whose care was provided by practices affiliated with this network from July 1, 2015, through June 30, 2017 (ie, 1 year prior to OCM implementation to 1 year after OCM implementation). Given the variation in payment incentives by cancer subtype that could drive changes in patient mix, we focused on within-cancer subtype analysis. Accordingly, analyses were conducted on breast, prostate, lung, and colon cancer subsets within the traditional Medicare population. The observations are clinician-by-month means or totals.

    Outcome Measures

    Five different outcomes were assessed for each cancer: mean monthly drug administrations among patients receiving these treatments, mean monthly patient drug costs among patients receiving these treatments, mean monthly patient costs, mean monthly evaluation and management visits, and mean monthly hydration services. All costs are total paid amounts, including patient out-of-pocket payments. Drug administration and drug costs are for any drug used and billed for in the office setting. Hydration services are for any claim with Current Procedural Terminology (CPT) code 96360 (ie, intravenous infusion, hydration; initial, 31 minutes to 1 hour) or CPT code 96361 (ie, intravenous infusion, hydration; each additional hour). Evaluation and management visits describe visits by any network clinician in any setting. The reimbursement data used pregrouped codes into this category; however, approximately 75% of evaluation and management observations included CPT codes 99211 to 99215 (each of which starts with the description office or other outpatient visits for the evaluation and management of an established patient).

    Statistical Analysis

    This study uses a difference-in-differences model that compares OCM participants with nonparticipants from 1 year before the program launched through 1 year after the program launched. The specification is run with each of the 5 outcomes for data subsets in breast, lung, colon, and prostate cancer. Our specification uses clinician-by-month observations. It controls for whether the clinician participated in the program, the month of care, the individual clinicians themselves, monthly clinician means (ie, share of patients who are women, mean patient age and age squared, share of patients who receive chemotherapy), total number of patients treated, total number of Medicare patients treated, and total number of clinicians at a given practice for a given month. Each control is included to account for mean differences by group as well as for any potential patient cohort mix changes through time that might otherwise contaminate the estimation. The key variable of interest is the interaction of OCM (coded as 1 for participants and 0 for nonparticipants) and postperiod (coded as 1 for the postimplementation period and 0 for the preimplementation period) (eAppendix in the Supplement).
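    As a stylized sketch (our simplified notation; the exact specification, including how each control enters, is defined in the eAppendix in the Supplement), the model can be written as:

```latex
% Stylized difference-in-differences specification; notation is ours.
\begin{equation*}
Y_{it} = \beta \,(\mathrm{OCM}_i \times \mathrm{Post}_t)
       + \gamma' X_{it} + \alpha_i + \tau_t + \varepsilon_{it}
\end{equation*}
% Y_{it}: outcome for clinician i in month t; OCM_i = 1 for participants;
% Post_t = 1 in the postimplementation period; X_{it}: the monthly
% clinician-level controls listed above; alpha_i and tau_t: clinician and
% month fixed effects, which absorb the OCM and postperiod main effects.
% The coefficient of interest is beta on the interaction term.
```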

    We also specified robustness and falsification checks. Both are triple differences models, which add 1 more dimension for comparison within the treatment groups. The intuition of these models is that the additional dimension may account for the associations measured in the difference-in-differences model. In this case, the robustness check adds the degree of Medicare exposure to test whether OCM participants with higher Medicare patient exposure disproportionately account for the overall results. The falsification check adds whether the practices also participate in in-office oral chemotherapy dispensing, which is associated with OCM and could disproportionately incentivize oral therapy use in the postimplementation period with changing oral market factors (eg, new oral drug launches, improved dispensing experience, better insurer contracts). This model formally checks for whether the dispensing practices themselves account for the associations separate from OCM participation and whether dispensing practices within the OCM account for the associations measured in the model. The key variables of interest are the interactions of OCM, postperiod, and Medicare (coded as share of patients who have Medicare), and separately the interactions of OCM, postperiod, and in-office oral chemotherapy dispensing (coded 1 for clinicians who dispense oral therapies and 0 for those who do not) (eAppendix in the Supplement).
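    In the same stylized notation (again ours, not the exact eAppendix specification), the triple differences models add a third interacted dimension M, which is the clinician's Medicare patient share in the robustness check and the in-office dispensing indicator in the falsification check:

```latex
% Stylized triple-differences extension; the lower-order interactions of
% OCM, Post, and M are included in estimation but abbreviated here.
\begin{equation*}
Y_{it} = \delta \,(\mathrm{OCM}_i \times \mathrm{Post}_t \times M_{it})
       + \lambda' (\text{two-way interactions and } M_{it})
       + \gamma' X_{it} + \alpha_i + \tau_t + \varepsilon_{it}
\end{equation*}
% delta tests whether the difference-in-differences association is
% concentrated among clinicians with higher values of M.
```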

    Statistical analyses were performed in SAS statistical software version 9.4 (SAS Institute). P values were 2-sided, and statistical significance was set at .10. Data were analyzed from September 2019 to March 2020.

    Results
    Study Participants

    We evaluated monthly patient means data at the physician level for 11 869 physician-months for breast cancers, 11 135 physician-months for lung cancers, 8592 physician-months for colon cancers, and 9045 physician-months for prostate cancers. Mean (SD) monthly evaluation and management visits were 0.95 (0.76) visits for breast cancer, 1.23 (1.01) visits for lung cancer, 1.35 (1.12) visits for colon cancer, and 1.04 (1.67) visits for prostate cancer. Mean (SD) total costs were $2411 ($2424) for breast cancer, $3407 ($3407) for lung cancer, $2708 ($3288) for colon cancer, and $2089 ($4004) for prostate cancer. Overall, patients at OCM practices had a mean (SD) age of 63.4 (3.1) years, and a mean (SD) of 59% (7 percentage points) of their patients were women. Differences by participation status are presented in Table 1.

    Selection Analysis

    In our tests for whether participating clinicians were different from nonparticipating clinicians prior to participation, we found only 1 statistically significant estimate among the 12 variables in our first test, which is approximately what could be expected due to random chance at P < .10. We also found no statistically significant estimates in our second test (Table 1). This suggests that participation was not driven by statistically significantly different clinicians within this network. Importantly, treatment and control practices did not overlap by state, so we were able to control for each clinician. This largely accounts for practice or state characteristics that did not change with time but might sort practices into or out of the OCM. One large outlier practice was excluded from this study owing to distinct and unanticipated preperiod patterns. Specifically, there were significant decreases in utilization 6 months prior to OCM launch that were not observed in the other practices. This undermined its ability to serve as an appropriate comparison for the other practices, given that the estimation methods require participants and nonparticipants to trend similarly before program launch.

    Main Results

    The estimate on drug administrations for prostate cancer was statistically significant, at −0.29 (95% CI, –0.47 to –0.11; P = .003), meaning that OCM participation was associated with 24.0% fewer drug administrations relative to the mean. This corresponds with a monthly reduction in mean drug costs for prostate cancer of $706 (95% CI, –$1383 to –$29; P = .04), or 22.9% relative to the mean. The estimate for mean drug costs was also statistically significant for lung cancer, at –$558 (95% CI, –$1173 to $58; P = .07), or 3.1% relative to the mean (Table 2).

    Participation in the OCM was also associated with lower overall costs for each disease area: –$233 (95% CI, –$495 to $30; P = .08) or –9.7% relative to the mean for breast cancer, –$337 (95% CI, –$618 to –$55; P = .02) or –9.9% relative to the mean for lung cancer, –$385 (95% CI, –$780 to $10; P = .06) or –14.2% relative to the mean for colon cancer, and –$610 (95% CI, –$1095 to –$125; P = .02) or –29.2% relative to the mean for prostate cancer. These amounts exclude the $160 Monthly Enhanced Oncology Services payout and any PBP payout for the savings achieved through the program. Including the Monthly Enhanced Oncology Services payouts would shift the point estimates by $160, such that each estimate would lose statistical significance save for the estimate on prostate cancer. Evaluation and management visits were also lower for breast cancer, at –0.11 (95% CI, –0.20 to –0.01; P = .03) or –11.2% relative to the mean, and for colon cancer, at –0.19 (95% CI, –0.37 to –0.02; P = .03) or –14.4% relative to the mean. All other estimates were not statistically significant (Table 2 and Table 3).

    Robustness and Falsification Checks Results

    Among all statistically significant difference-in-differences estimates, the only triple differences estimate that was statistically significant was on overall costs for prostate cancer, at –$2676 (95% CI, –$5735 to $384; P = .08). That is, in general, each 10 percentage points of increased exposure to Medicare patients with prostate cancer was associated with $268 less spending among clinics that participated in the OCM (Table 4). There was also 1 statistically significant triple differences estimate for which the corresponding difference-in-differences estimate was not itself statistically significant: prostate cancer hydration services, at –0.41 (95% CI, –0.83 to 0.01; P = .06), which suggests that OCM clinicians with more Medicare patient exposure performed fewer hydrations (eTable 1 in the Supplement).

    As for the in-office oral chemotherapy dispensing triple differences results, we found that the estimate for drug administrations in prostate cancer was not statistically significant, at 0.10 (95% CI, –0.25 to 0.44; P = .57), suggesting that the relative association in treatment intensity was not accounted for by being an in-office oral chemotherapy dispensing practice. We also found that the mean cost of the drugs used was not statistically significant, at –$902 (95% CI, –$2074 to $269; P = .13). However, we found that the estimate for overall costs was –$1096 (95% CI, –$1736 to –$456; P = .002), suggesting that clinicians who are in the OCM and also dispense oral therapies disproportionately account for the overall OCM associations in prostate cancer (eTable 2 in the Supplement).

    Discussion

    Novel payment models seeking to drive quality and cost-efficiency represent important and increasingly prevalent tools used to curb the increase in health care costs.1-4 With cancer care costs especially on the increase, the OCM, inclusive of a shared-savings component in a specialty with prescribing discretion over consequential high-cost care choices, represents a novel environment to evaluate these programs. This nonrandomized controlled study adds to the emerging literature16-27 by finding that first-year OCM participation was associated with lower office-based costs. However, consistent with prior research on similar programs, we also found that these savings were largely offset by the costs of these programs.8,9

    Our results themselves appear internally consistent. Participation in the OCM was associated with fewer breast and colon cancer visits, lower drug costs for lung and prostate cancer, and accordingly, lower total costs for all 4 cancer subtypes. Prostate cancer drug costs were lower at least in part owing to fewer administrations, and participating clinicians who treated more Medicare patients disproportionately accounted for fewer office-based hydrations and for lower costs. Finally, practices that both dispensed oral therapies and participated in the OCM were associated with lower relative total prostate cancer costs compared with OCM practices that did not dispense oral therapies. This could reflect a disproportionate shift toward oral therapies by these in-office dispensing clinicians, which we cannot verify in our data, and would be consistent with our findings of no difference in visits and fewer drug administrations.

    Our results also highlight several open questions. Does lower utilization in the office setting lead to differences in hospital-based care? Early research suggests this possibility.17 How would a potential change in prostate cancer care—be it a reduction or a shift to the pharmacy benefit—be associated with patient financial health? Research into oral chemotherapy access suggests this possibility.29 Further research into these and related topics would deepen our understanding of these models.

    Limitations

    Our study has several limitations. First, the treatment and control groups were not randomized, and unobservable characteristics that sorted practices into the program and are associated with the outcomes would introduce selection bias. This concern is moderated, but not eliminated, by similar preperiod trends, an apparent lack of selection along covariables, and recent research on the limited role that selection into these models plays in the overall results.30 Second, the analysis is limited to clinician decision-making within the confines of the practice setting; hospitalization and pharmacy data are notably absent. Third, the PBP payment is not visible in the data, so the program's total costs are not considered. Fourth, we chose not to balance the panel to avoid survivorship bias, given the oncological setting. Finally, the literature suggests that divergence in care can take a few years, so evaluation of longer periods in future research could help to elucidate these potential effects in oncological treatment.

    Conclusions

    This nonrandomized controlled study suggests that the OCM was associated with different care decisions made by clinicians. The magnitude of these overall associations needs to be weighed against the program's costs and the potential savings incurred elsewhere (eg, hospital and pharmacy). However, the OCM was associated with different care decisions upstream of those settings, within the confines of the physician's office, especially in prostate cancer.

    Article Information

    Accepted for Publication: March 14, 2020.

    Published: May 18, 2020. doi:10.1001/jamanetworkopen.2020.5165

    Open Access: This is an open access article distributed under the terms of the CC-BY-NC-ND License. © 2020 Walker B et al. JAMA Network Open.

    Corresponding Author: Brigham Walker, PhD, Department of Health Policy and Management, Tulane University, 1440 Canal St, Ste 1937, New Orleans, LA 70112 (bwalker6@tulane.edu).

    Author Contributions: Dr Walker had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

    Concept and design: Walker, Hayes, Neubauer, Robert, Wilfong.

    Acquisition, analysis, or interpretation of data: Walker, Frytak, Wilfong.

    Drafting of the manuscript: Walker, Wilfong.

    Critical revision of the manuscript for important intellectual content: All authors.

    Statistical analysis: Walker.

    Obtained funding: Walker, Robert.

    Administrative, technical, or material support: Walker, Frytak, Hayes, Neubauer, Wilfong.

    Supervision: Walker, Frytak, Hayes, Robert, Wilfong.

    Conflict of Interest Disclosures: Dr Walker reported receiving grants from Blue Cross Blue Shield of Louisiana for separate health insurance reform research outside the submitted work. Dr Frytak reported owning stock in McKesson Specialty Health outside the submitted work. No other disclosures were reported.

    Funding/Support: McKesson Corporation and US Oncology provided access to the data used in this study and provided the funding for publication fees for this article.

    Role of the Funder/Sponsor: McKesson Corporation and US Oncology had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

    Disclaimer: All authors are employees of McKesson Corporation and/or The US Oncology Network. The views expressed in this article are our own.

    Additional Contributions: The US Oncology Health Services Research committee, led by Diana Verrilli, MS, and Beatrice Mautner, MSN (McKesson Corporation), supported this research. Mittsy Mosshart, BBA; Beth Alvarez, MS; and Allen Kowalczyk, BS (McKesson Corporation), extracted the data used in this study. J. Michael McWilliams, MD, PhD (Harvard Medical School), provided helpful perspectives in the early formulation of the study. Kevin Callison, PhD; Mary K. Olson, PhD; Jim Alm, PhD; and Patrick Button, PhD (Tulane University), provided helpful feedback to drafts of this paper. They were not compensated for their work.

    References
    1. Centers for Medicare & Medicaid Services. Oncology Care Model overview. Accessed April 21, 2020. https://innovation.cms.gov/files/slides/ocm-overview-slides.pdf
    2. Colla CH, Lewis VA, Shortell SM, Fisher ES. First national survey of ACOs finds that physicians are playing strong leadership and ownership roles. Health Aff (Millwood). 2014;33(6):964-971. doi:10.1377/hlthaff.2013.1463
    3. Cole AP, Krasnova A, Ramaswamy A, et al. Recommended cancer screening in accountable care organizations: trends in colonoscopy and mammography in the Medicare shared savings program. J Oncol Pract. 2019;15(6):e547-e559. doi:10.1200/JOP.18.00352
    4. Kline RM, Bazell C, Smith E, Schumacher H, Rajkumar R, Conway PH. Centers for Medicare and Medicaid Services: using an episode-based payment model to improve oncology care. J Oncol Pract. 2015;11(2):114-116. doi:10.1200/JOP.2014.002337
    5. Centers for Medicare & Medicaid Services. Oncology Care Model. Accessed April 16, 2020. https://innovation.cms.gov/innovation-models/oncology-care
    6. Kline RM, Muldoon LD, Schumacher HK, et al. Design challenges of an episode-based payment model in oncology: the Centers for Medicare & Medicaid Services Oncology Care Model. J Oncol Pract. 2017;13(7):e632-e645. doi:10.1200/JOP.2016.015834
    7. Schumacher H. Funding cancer quality improvement: payer's perspective. J Oncol Pract. 2015;11(3):180-181. doi:10.1200/JOP.2015.003913
    8. McWilliams JM, Hatfield LA, Chernew ME, Landon BE, Schwartz AL. Early performance of accountable care organizations in Medicare. N Engl J Med. 2016;374(24):2357-2366. doi:10.1056/NEJMsa1600142
    9. McWilliams JM, Hatfield LA, Landon BE, Hamed P, Chernew ME. Medicare spending after 3 years of the Medicare Shared Savings Program. N Engl J Med. 2018;379(12):1139-1149. doi:10.1056/NEJMsa1803388
    10. McWilliams JM, Chernew ME, Landon BE, Schwartz AL. Performance differences in year 1 of Pioneer accountable care organizations. N Engl J Med. 2015;372(20):1927-1936. doi:10.1056/NEJMsa1414929
    11. Trombley MJ, Fout B, Brodsky S, McWilliams JM, Nyweide DJ, Morefield B. Early effects of an accountable care organization model for underserved areas. N Engl J Med. 2019;381(6):543-551. doi:10.1056/NEJMsa1816660
    12. Kaufman BG, Spivack BS, Stearns SC, Song PH, O'Brien EC. Impact of accountable care organizations on utilization, care, and outcomes: a systematic review. Med Care Res Rev. 2019;76(3):255-290. doi:10.1177/1077558717745916
    13. Cole AP, Krasnova A, Ramaswamy A, et al. Prostate cancer in the Medicare shared savings program: are accountable care organizations associated with reduced expenditures for men with prostate cancer? Prostate Cancer Prostatic Dis. 2019;22(4):593-599. doi:10.1038/s41391-019-0138-1
    14. Shenolikar R, Ryan K, Shand B, Kane R. Impact of Oncology Care Model (OCM) on episode costs and performance revenues: considerations for oncology practices. J Clin Oncol. 2018;36(30 suppl):102. doi:10.1200/JCO.2018.36.30_suppl.102
    15. Theroux HH, et al. Influence of pomalidomide and lenalidomide on total cost of care for Medicare beneficiaries with multiple myeloma under the Oncology Care Model (OCM). Blood. 2018;132(suppl 1):2252. doi:10.1182/blood-2018-99-116657
    16. Parikh RB, Bekelman JE, Huang Q, Martinez JR, Emanuel EJ, Navathe AS. Characteristics of physicians participating in Medicare's Oncology Care Model bundled payment program. J Oncol Pract. 2019;15(10):e897-e905. doi:10.1200/JOP.19.00047
    17. Mendenhall MA, Dyehouse K, Hayes J, et al. Practice transformation: early impact of the Oncology Care Model on hospital admissions. J Oncol Pract. 2018;14(12):JOP1800409. doi:10.1200/JOP.18.00409
    18. Schleicher SM, Chaudhry B, Waynick CA, et al. The effect of guideline-concordant novel therapy use on meeting cost targets in OCM: results from a large community oncology network. J Clin Oncol. 2019;37(15 suppl):6635. doi:10.1200/JCO.2019.37.15_suppl.6635
    19. Song A, Csik VP, Leader A, Maio V. The Oncology Care Model: oncology's first foray away from volume and toward value-based care. Am J Med Qual. 2019;34(4):321-323. doi:10.1177/1062860618824016
    20. Ennis RD, Parikh AB, Sanderson M, Liu M, Isola L. Interpreting Oncology Care Model data to drive value-based care: a prostate cancer analysis. J Oncol Pract. 2019;15(3):e238-e246. doi:10.1200/JOP.18.00336
    21. Li S, Peng Y, Raskin L, et al. Variations in hospitalization and emergency department or observation (ED/OB) stays using the Oncology Care Model (OCM) methodology in Medicare data. J Clin Oncol. 2018;36(30 suppl):112. doi:10.1200/JCO.2018.36.30_suppl.112
    22. McInnes S, Carrino CM, Shoemaker L. Frontline oncology care team primary palliative symptom guideline education, the Oncology Care Model, and emergency department visits. J Clin Oncol. 2018;36(34 suppl):143. doi:10.1200/JCO.2018.36.34_suppl.143
    23. Hoverman JR, Taniguchi CB, Hayes J, Eagye K, Mann BB, Neubauer MA. Unraveling the high cost of end-of-life care: an Oncology Care Model experience. J Clin Oncol. 2019;37(15 suppl):11534. doi:10.1200/JCO.2019.37.15_suppl.11534
    24. Perry M, Rudy-Tomczak K, Hines S. A process for improving patient survey scores in the Oncology Care Model (OCM). J Clin Oncol. 2018;36(30 suppl):222. doi:10.1200/JCO.2018.36.30_suppl.222
    25. Abt Associates. Evaluation of the Oncology Care Model: performance period one. Revised December 2018. Accessed April 16, 2020. https://innovation.cms.gov/files/reports/ocm-secondannualeval-pp1.pdf
    26. Clough JD, Kamal AH. Oncology Care Model: short- and long-term considerations in the context of broader payment reform. J Oncol Pract. 2015;11(4):319-321. doi:10.1200/JOP.2015.005777
    27. Gutkin D, Zhao E, Phillips K, Cavaliere K, Brau H, Powell B. Trends related to program participation, implementation, best practices, challenges and resource requests among Oncology Care Model (OCM) participants. J Clin Oncol. 2018;36(15 suppl):6524. doi:10.1200/JCO.2018.36.15_suppl.6524
    28. World Health Organization. International Classification of Diseases, Ninth Revision (ICD-9). World Health Organization; 1977.
    29. Dusetzina SB, Keating NL. Mind the gap: why closing the doughnut hole is insufficient for increasing Medicare beneficiary access to oral chemotherapy. J Clin Oncol. 2016;34(4):375-380. doi:10.1200/JCO.2015.63.7736
    30. McWilliams JM, Hatfield LA, Landon BE, Chernew ME. Spending Reductions in the Medicare Shared Savings Program: Selection or Savings? National Bureau of Economic Research; 2019. doi:10.3386/w26403