Table 1. Characteristics of Oregon and Colorado Reform Models
Table 2. Comparison of Propensity Score–Weighted Oregon and Colorado Medicaid Enrollees
Table 3. Change in Utilization in the Oregon and Colorado Medicaid Populations
Table 4. Change in Utilization in the Oregon and Colorado Child and Adult Medicaid Populations
Table 5. Change in Performance on Measures of Access, Preventable Hospitalizations, and Low-Value Care in Oregon and Colorado Medicaid Groups
Original Investigation
April 2017

Early Performance in Medicaid Accountable Care Organizations: A Comparison of Oregon and Colorado

Author Affiliations
  • 1. Center for Health Systems Effectiveness, Oregon Health & Science University, Portland
  • 2. Department of Family Medicine, Oregon Health & Science University, Portland
  • 3. Jefferson Center for Mental Health, Office of Healthcare Transformation, Wheat Ridge, Colorado
  • 4. OHSU-PSU School of Public Health, Oregon Health & Science University, Portland
  • 5. Hatfield School of Government, Portland State University, Portland
  • 6. Department of Health Systems, Management and Policy, School of Public Health, University of Colorado, Denver
JAMA Intern Med. 2017;177(4):538-545. doi:10.1001/jamainternmed.2016.9098
Key Points

Question  How have expenditures, utilization, and quality changed in Oregon’s Medicaid Accountable Care Organization model in comparison with Colorado’s Medicaid Accountable Care Organization model?

Findings  In this study of 770 000 Medicaid enrollees, standardized expenditures for selected services decreased in both states during the years 2010-2014, with no significant difference between the states, although Oregon’s Medicaid Accountable Care Organization improved in some measures of access and quality compared with Colorado.

Meaning  Two years into implementation, Oregon’s Medicaid Accountable Care Organization, characterized by a large federal investment and movement to global budgets, exhibited improvements in some measures of care but no apparent differences in savings compared with the Colorado Medicaid Accountable Care Organization model, which was more limited in scope and implemented without substantial federal investments.

Abstract

Importance  Several state Medicaid reforms are under way, but the relative performance of different approaches is unclear.

Objective  To compare the performance of Oregon’s and Colorado’s Medicaid Accountable Care Organization (ACO) models.

Design, Setting, and Participants  Oregon initiated its Medicaid transformation in 2012, supported by a $1.9 billion investment from the federal government, moving most Medicaid enrollees into 16 Coordinated Care Organizations, which managed care within a global budget. Colorado initiated its Medicaid Accountable Care Collaborative in 2011, creating 7 Regional Care Collaborative Organizations that received funding to coordinate care with providers and connect Medicaid enrollees with community services. Data spanning July 1, 2010, through December 31, 2014 (18 months before intervention and 24 months after intervention, treating 2012 as a transition year) were analyzed for 452 371 Oregon and 330 511 Colorado Medicaid enrollees, assessing changes in outcomes using difference-in-differences analyses. Both models featured a regional focus, primary care homes, and care coordination, but Oregon’s Coordinated Care Organization model was more comprehensive in its reform goals and in the imposition of downside financial risk.

Exposures  Regional focus, primary care homes, and care coordination in Medicaid ACOs.

Main Outcomes and Measures  Performance on claims-based measures of standardized expenditures and utilization for selected services, access, preventable hospitalizations, and appropriateness of care.

Results  Among a total of 782 882 Medicaid enrollees, 45.0% were male, and mean (SD) age was 16.74 (14.41) years. Standardized expenditures for selected services declined in both states during the 2010-2014 period, but these decreases were not significantly different between the 2 states. Oregon’s model was associated with reductions in emergency department visits (−6.28 per 1000 beneficiary-months; 95% CI, −10.51 to −2.05) and primary care visits (−15.09 visits per 1000 beneficiary-months; 95% CI, −26.57 to −3.61), improvements in acute preventable hospital admissions (−1.01 admissions per 1000 beneficiary-months; 95% CI, −1.61 to −0.42), improvements in 3 of 4 measures of access (well-child visits, ages 3-6 years: 2.69%; 95% CI, 1.20% to 4.19%; adolescent well-care visits: 6.77%; 95% CI, 5.22% to 8.32%; and adult access to preventive ambulatory care: 1.26%; 95% CI, 0.28% to 2.25%), and improvement in 1 of 4 measures of appropriateness of care (avoidance of head imaging for uncomplicated headache: 2.59%; 95% CI, 1.35% to 3.83%).

Conclusions and Relevance  Two years into implementation, Oregon’s and Colorado’s Medicaid ACO models exhibited similar performance on standardized expenditures for selected services. Oregon’s model, marked by a large federal investment and movement to global budgets, was associated with improvements in some measures of utilization, access, and quality, but Colorado’s model paralleled Oregon’s on several other metrics.

Introduction

Medicaid, the federal-state health insurance program for low-income individuals, has grown to cover more than 20% of the population nationally and accounts for a significant and growing portion of state budgets.1 This growth poses a substantial budgetary challenge, even for states choosing not to expand coverage through the Affordable Care Act. States are experimenting with a wide range of policies designed to control spending, including payment reforms that mirror aspects of Accountable Care Organizations (ACOs) in the Medicare and commercial markets.2 As of 2016, a total of 9 states had launched Medicaid ACOs, with 8 more states actively pursuing this model.3

In this study, we compared the performance of 2 early adopters of the Medicaid ACO model: Oregon and Colorado. Oregon’s Medicaid transformation occurred in 2012. Supported in part by a $1.9 billion investment from the federal government, the state moved most (90%) of its Medicaid beneficiaries into 16 Coordinated Care Organizations (CCOs).4-9 Coordinated Care Organizations are community based, with governing boards that include representatives of the health care delivery system and consumers who reflect the community’s needs. Unlike most ACO models, CCOs accept full financial risk for their patient population and must manage all care (including mental health, addiction, and dental services) within a global budget. Oregon’s ambitious model has led some to refer to CCOs as “ACOs on steroids.”10

Colorado’s Medicaid Accountable Care Collaborative (ACC) reform was initiated in 2011, with the state creating 7 Regional Care Collaborative Organizations (RCCOs). The RCCOs receive per member per month funding to provide administrative support to improve connections between Medicaid enrollees, providers, and community services. Approximately 70% of Colorado Medicaid beneficiaries were enrolled in the ACC program by 2014. Unlike the Oregon CCO model, the ACC model maintained fee-for-service payments and did not impose downside financial risk on providers or RCCOs. Furthermore, Colorado did not receive federal investments on the scale of those provided to Oregon.

The objective of this study was to compare performance in Oregon’s CCO model with performance in the Colorado ACC model, using claims-based measures of expenditures, utilization, access, quality, and appropriateness of care. The Oregon and Colorado Medicaid agencies have described positive outcomes associated with their reforms, with both states reporting lower expenditures, reductions in utilization, and improvements in quality.9,11-13 However, a formal comparison allows for an assessment of the relative performance of a Medicaid ACO model focused on enhanced payment for care coordination and case management (Colorado) vs a more comprehensive Medicaid ACO model predicated on a global budget and downside financial risk (Oregon). Assessing these impacts is particularly salient when viewed in the context of a nationwide trend toward ACO models and the need for evidence on their ability to slow utilization and improve access, quality, and outcomes.

Methods

We used a difference-in-differences approach, with the more intensive Oregon CCO intervention serving as the treatment group and the Colorado Medicaid program serving as the comparison group for 2 years after the Medicaid reforms were implemented. This study was approved by the institutional review board at Oregon Health & Science University with waiver of informed consent.
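
To make the design concrete, the policy-effect estimate is the pre-to-post change in Oregon minus the pre-to-post change in Colorado. A minimal sketch of that contrast in R, with purely hypothetical group means:

```r
# Difference-in-differences contrast with purely hypothetical means.
or_pre  <- 100; or_post <- 90    # Oregon mean outcome, pre and post
co_pre  <- 105; co_post <- 98    # Colorado mean outcome, pre and post

(or_post - or_pre) - (co_post - co_pre)
#> -3  (Oregon's change net of the comparison group's change)
```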

Study Populations

We obtained data from each state’s Medicaid agency and analyzed claims for 18 months (July 1, 2010, through December 31, 2011) of preintervention data and 24 months of 2013-2014 postintervention data, treating 2012 as a transition year. Our primary analyses focused on the population of individuals who were enrolled in both the preintervention and postintervention periods for at least 3 months within a 12-month window. We excluded individuals who were dually eligible for both Medicare and Medicaid. In Oregon, we excluded Medicaid enrollees who were not enrolled in CCOs because of special health needs4 and Medicaid enrollees from 1 CCO (Cascade Health Alliance), which did not launch until August 2013.

The Colorado comparison group was restricted to Medicaid beneficiaries who were in the standard (non-ACC) Medicaid program in the 2010-2011 period but were covered by the ACC program for the 2013-2014 period. Children who were eligible for Medicaid through the State Children's Health Insurance Program were excluded because they were not eligible for the ACC. We excluded individuals enrolled in managed care because they were required to “opt out” of managed care into the ACC. Managed care penetration was low (<2%), with the exception of Denver and Mesa counties, which had substantially higher managed care penetration rates. To avoid potential selection bias, our analyses excluded residents of these 2 counties.
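
These inclusion rules can be expressed as enrollment-file filters. The sketch below is a simplified illustration in R, assuming a hypothetical person-month enrollment table; the column names are not the study’s actual variables, and the month-counting rule is a simplification of the 12-month-window criterion described above.

```r
# Sketch of the enrollment-based cohort filters; all names hypothetical.
library(dplyr)

cohort <- enrollment %>%
  filter(!dual_eligible) %>%                  # exclude Medicare-Medicaid duals
  group_by(id) %>%
  summarise(
    pre_months  = sum(month >= as.Date("2010-07-01") &
                      month <= as.Date("2011-12-31")),
    post_months = sum(month >= as.Date("2013-01-01") &
                      month <= as.Date("2014-12-31"))
  ) %>%
  filter(pre_months >= 3, post_months >= 3)   # >= 3 enrolled months per period
```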

Propensity Score Weighting

We used propensity score weighting as a first step in adjusting for observable differences between the Oregon and Colorado groups. The propensity score variables included age, sex, rural residence, and Chronic Illness and Disability Payment System risk indicators.14 Propensity weights were applied across the Oregon and Colorado populations for all study periods, with each individual in each time period given a weight proportional to the probability of being in the Oregon Medicaid program in the fourth quarter of 2011, before the CCO intervention. This weighting approach adjusted for observable differences between the Oregon and Colorado populations as well as changes in the composition of each population over time.15 Additional details are provided in the eMethods in the Supplement.
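
As one way to implement this step, the sketch below fits a logistic model for Oregon membership and converts the fitted probabilities into weights. The covariate names are hypothetical, and the odds-style weighting of the comparison group is a standard choice consistent with, but not confirmed by, the description above.

```r
# Sketch of propensity score weighting (hypothetical variable names).
ps_fit <- glm(oregon ~ age + male + rural + cdps_risk,
              data = df, family = binomial)
df$ps <- predict(ps_fit, type = "response")   # Pr(Oregon | covariates)

# Oregon enrollees keep weight 1; comparison enrollees are weighted by
# the odds of Oregon membership, so weights are proportional to the
# probability of being in the Oregon program.
df$pw <- ifelse(df$oregon == 1, 1, df$ps / (1 - df$ps))
```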

Outcome Variables

In Oregon’s managed care and CCO environment, capitation and other alternative payment mechanisms result in encounter claims that include information on diagnosis and procedure but record paid amounts as zero. We created a composite measure of standardized expenditures that could be compared across states using the following steps. First, we identified the set of procedure codes and services that were common across both states and included as 1 of 4 categories of service in the Berenson-Eggers Type of Service classification.16 These services included evaluation and management, imaging, tests, and procedures. Next, we repriced these claims with standardized prices, using the Oregon 2014 Medicaid fee schedule to attach standardized prices to claims in both states according to procedure and site-of-service codes. We repriced inpatient facility services on a per diem basis. Additional details are provided in the eMethods in the Supplement. This approach creates a measure of standardized expenditures representing typical Medicaid expenditures for selected services across both states.
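
A sketch of the repricing step in R, assuming hypothetical claims and fee-schedule tables keyed by procedure and site-of-service codes (the category labels and column names are illustrative, not the study’s actual data structures):

```r
# Sketch: attach standardized prices to claims, ignoring paid amounts
# (which are zero on capitated encounter claims). All names hypothetical.
library(dplyr)

std_claims <- claims %>%
  # keep the 4 Berenson-Eggers categories common to both states
  filter(betos_group %in% c("eval_mgmt", "imaging", "tests", "procedures")) %>%
  left_join(fee_schedule,                     # Oregon 2014 Medicaid prices
            by = c("procedure_code", "site_of_service")) %>%
  mutate(std_expenditure = coalesce(std_price, 0))  # unmatched -> $0 (simplification)
```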

Utilization measures included emergency department (ED) visits, primary care visits, and acute inpatient days. To further investigate changes in access, we constructed measures from the Healthcare Effectiveness Data and Information Set (HEDIS)17: well-child care visits in the third, fourth, fifth, and sixth years of life; children’s and adolescents’ access to preventive and ambulatory health services (members aged 1-6 years who had an ambulatory or preventive care visit during the year, or aged 7-19 years who had such a visit during the past 2 years); adolescent well-care visits (members aged 12-21 years who had at least 1 comprehensive well-care visit during the year); and the percentage of adults aged 20 to 44 years who had an ambulatory or preventive care visit during the year. We also analyzed performance on 4 measures of appropriateness or low-value care (ie, appropriate medications for individuals with asthma, testing for children with pharyngitis, imaging studies for low back pain, and imaging studies for uncomplicated headache), hypothesizing that these services might be areas of focus for organizations seeking to reduce spending and improve quality.18 Finally, we assessed changes in quality by estimating changes in potentially avoidable ED visits19 and preventable hospitalizations as defined by the Agency for Healthcare Research and Quality Prevention Quality Indicators (PQI).20 Following the methods of Joynt and colleagues,21 we did not include admission source as a variable in our PQI algorithm.
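
For the utilization outcomes, rates per 1000 beneficiary-months can be tabulated directly from claims and enrollment counts. A brief sketch with hypothetical names:

```r
# Sketch: ED visits per 1000 beneficiary-months by state and period.
library(dplyr)

ed_rates <- person_periods %>%     # hypothetical person-period table
  group_by(state, period) %>%
  summarise(rate = 1000 * sum(ed_visits) / sum(member_months),
            .groups = "drop")
```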

Statistical Analysis

We used standardized differences to assess the comparability of the Medicaid population and the propensity-weighted comparison group.22,23 Propensity score–weighted linear models were created to assess changes in expenditures, utilization, access measures, preventable ED visits and hospitalizations, and the provision of low-value care.

The covariates included age; sex; Chronic Illness and Disability Payment System risk indicators; rural residence indicators; an indicator for individuals in Oregon; indicator variables for the second, third, and fourth quarters of the year (to control for seasonality); an indicator for the postintervention period (2013-2014); and the interaction between the Oregon population and postintervention indicators, which produced estimates of the policy effects. We tested the assumption of parallel trends in the treatment and comparison groups for utilization measures in the preintervention period.24,25 Measures of access, low-value care, and PQI required 1-year lookbacks and were restricted to continuously enrolled individuals with annual assessments in 2011, 2013, and 2014. Standard errors were adjusted for clustering at the primary care service area level.26,27
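
A minimal sketch of this estimating equation in R, using the estimatr package for cluster-robust standard errors. The variable names, and the use of lm_robust rather than the authors’ actual Stata code (provided in the eAppendix), are assumptions for illustration.

```r
# Sketch of the propensity score-weighted difference-in-differences
# model; the oregon:post coefficient estimates the policy effect.
library(estimatr)

fit <- lm_robust(
  outcome ~ oregon * post + age + male + cdps_risk + rural +
    q2 + q3 + q4,              # quarter indicators for seasonality
  data     = df,
  weights  = pw,               # propensity score weights
  clusters = pcsa,             # primary care service area clusters
  se_type  = "stata"           # CR1 cluster-robust standard errors
)
summary(fit)
```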

Data management and analyses were conducted using R statistical software, version 3.1.2 (R Core Team) and Stata software, version 14 (StataCorp). Stata code and output for main analyses are provided in the eAppendix in the Supplement.

Sensitivity Analyses

We conducted multiple analyses to assess the sensitivity of propensity score specifications, transition period, study population definitions, and other assumptions. These analyses and results are described in the eMethods and eTables 1-3 in the Supplement.

Results

Table 1 reports the delivery system and reform components of the Oregon and Colorado programs. The reforms were similar in their regional focus and emphasis on primary care, but the Oregon program was more comprehensive in its scope of benefits covered as well as its use of global budgets and downside financial risk as a mechanism for cost control. The substantial Centers for Medicare & Medicaid Services investment in Oregon provided funding for administrative staff, data infrastructure, and resources for implementation, training, and related services, and ensured that the transformation efforts would not be hampered by reductions in reimbursement rates.

There were 452 371 Oregon Medicaid enrollees and 330 511 Colorado Medicaid enrollees included in the analyses. A total of 45.0% were male, with mean (SD) age 16.74 (14.41) years. After propensity score weighting, differences in enrollee clinical and demographic characteristics were small, although the propensity-weighted Colorado group was slightly younger than the Oregon group (Table 2). Investigation of the preintervention parallel trends assumption indicated no significant differences in quarterly trends across most expenditure and utilization measures,24,28 with exceptions for standardized expenditures for procedures and the primary care visit utilization measure (eTable 1 and eTable 2 in the Supplement).

Standardized expenditures decreased in Oregon compared with Colorado, but after adjusting for demographics and health risk, there was no significant difference ($2.00; 95% CI, −$0.79 to $4.78) in per member per month standardized expenditures for Oregon’s Medicaid enrollees; positive values indicate higher growth in standardized expenditures in Oregon compared with Colorado (Table 3). In general, performance on standardized expenditures for most measures was similar in the first and second years, with some exceptions. For example, compared with Colorado, standardized expenditures for inpatient services were significantly higher for Oregon in the second year of implementation ($4.37; 95% CI, $0.01 to $8.73).

Table 4 displays differences in standardized expenditure and utilization measures stratified by adults and children. Compared with Colorado, Oregon’s growth in overall standardized expenditures was lower for adults than for children, but neither group showed statistically significant differences between the states. Point estimates of the pooled analyses in Table 3 do not necessarily match the weighted estimates of Table 4’s stratified analyses, in part because Table 4 excludes individuals who transitioned from the child to the adult group over the study period and because different propensity score weights were used for each analysis. Patterns were generally similar across metrics for both children and adults, with some exceptions. For example, decreases in ED visits for Oregon compared with Colorado were statistically significant for adults but not children.

Table 5 displays measures of access, avoidable ED visits, preventable hospitalizations (PQIs), and measures of low-value care. Although primary care utilization decreased across both states, Oregon maintained or improved care in 3 of 4 measures of access (well-child care visits for children 3-6 years: 2.7%; 95% CI, 1.2% to 4.2%; adolescent well-care visits: 6.8%; 95% CI, 5.2% to 8.3%; adult access to preventive ambulatory care: 1.3%; 95% CI, 0.3% to 2.2%) compared with Colorado. Oregon also improved on measures of avoidable ED visits, decreasing by 1.8 per 1000 member-months (95% CI, −3.1 to −0.4), as well as acute PQI preventable hospitalizations (−1.0 per 1000 member-months; 95% CI, −1.6 to −0.4). Compared with Colorado, Oregon’s CCO transformation was not associated with statistically significant improvements in 3 of 4 measures of low-value care. However, avoidance of imaging for uncomplicated headache improved by 2.6% (95% CI, 1.4% to 3.8%) compared with Colorado.

Our results were robust to sensitivity analyses, with some exceptions. For example, Oregon exhibited statistically significantly higher standardized expenditures and inpatient utilization than Colorado in some specifications, and the reduction in primary care visits observed in Oregon was not statistically significant in other models (eTables 1 and 2 in the Supplement).

Discussion

Oregon’s and Colorado’s reforms represent 2 early efforts to implement Medicaid ACOs. Compared with Colorado, the Oregon CCO model was not associated with reductions in standardized expenditures for selected services in the first 2 years after implementation, although utilization for ED and primary care visits was significantly lower. Trends were similar among adults and children. Our results were generally consistent across a series of sensitivity analyses (eTables 1 and 2 in the Supplement).

Compared with Colorado, Oregon’s CCO transformation was associated with improvements in 3 of 4 HEDIS access measures, reductions in avoidable ED visits, and preventable acute hospital admissions. However, in other areas, Colorado performed as well as or better than Oregon. Inpatient care days, a potentially expensive service area, declined in both states, and in some specifications, reductions in Colorado were significantly greater than those in Oregon (eTable 1 in the Supplement).

Although the Oregon and Colorado Medicaid ACO programs emphasized primary care homes, primary care visits decreased in the study populations for both states and were significantly lower in Oregon than Colorado in 2014. These observed decreases may reflect a lack of primary care capacity attributable to the 2014 Affordable Care Act Medicaid expansion, wherein both states increased their Medicaid enrollment substantially: Colorado increased its Medicaid enrollment by 41% by July 2014, whereas Oregon increased its enrollment by 59%, which was the second largest increase in the country.29 Reductions in primary care visits should be monitored closely and may be a cause for concern if they reflect restricted access. Alternatively, a reduction in primary care visits could represent substitutions toward case management. Oregon’s reduction in primary care visits was accompanied by relative or absolute improvements in most HEDIS access measures, suggesting the potential for a more efficient reconfiguration of primary care resulting in fewer visits but maintaining access.30

More than 2 years into their programs, both states can point to successes. In the 2011-2014 timespan, both states demonstrated reductions in measures of standardized expenditures and utilization. Compared with Colorado, Oregon experienced improvements in some access and quality measures, but did not generate the savings that might be anticipated with its ambitious reform model and the $1.9 billion federal investment to support the CCO transformation.4,9 There are a few possible reasons why greater savings were not achieved. First, CCOs may need more time to fully implement changes that translate to greater savings. Second, spending not only slowed but actually declined in both states during the study period; there may be limits to the extent to which relative savings can be achieved in a period of shrinking (as opposed to growing) health care spending. Furthermore, although Colorado did not have the benefit of a similar investment from the federal government, its ACC model has had apparent success. Its focus on manageable, incremental steps has been followed by growth in enrollment, reductions in utilization, and improvement in some key performance indicators.12 From this vantage point, Colorado’s approach may represent a promising delivery system reform that may be more feasible for other states to adopt than the larger-scope model pursued by Oregon.

Limitations

Our study has important limitations. The main outcome—standardized expenditures—focused on a narrow set of services with common codes across states. Estimates of total per capita Medicaid spending published by the Oregon Health Authority11 suggest that our measure of standardized expenditures accounts for approximately 42% of total spending on medical services. Our analysis did not include expenditures on prescription drugs, which is a growing portion of Medicaid spending. Thus, we did not test for differences in overall expenditures. It is possible, for example, that estimated savings for Oregon could be reversed if Oregon’s expenditures on other services grew at a rate faster than Colorado’s. Furthermore, although our use of standardized expenditures allowed us to attach prices to managed care–type encounter claims and to ensure consistency across states, it may have obscured reductions in spending that could have arisen through changes in overall reimbursement rates or in the intensity of inpatient services.

Our findings should also be interpreted in the light of other large changes occurring in both states. Neither state represents a counterfactual “business as usual.” Nonetheless, our results are still useful in guiding expectations about Medicaid reforms. Furthermore, the lack of differences in preintervention trends between the states for most measures, coupled with a large number of sensitivity analyses, improves the fidelity and reliability of our findings.

This evaluation should also be viewed in terms of broader trends in health care utilization. Our findings of slowed or reduced utilization in the 2010-2014 period may not be entirely attributable to the Medicaid reforms in Colorado and Oregon and may instead correspond to a period of historically low national health spending growth. National per capita Medicaid acute care spending increased at an average of 3.7% per year in the 2008-2011 period but decreased by 1.7% in the 2011-2012 period.31 Nonetheless, recent evidence suggests a resurgence in overall health care spending growth, rising from a growth rate of 2.9% in 2013 to 5.3% in 2014.32 Given these trends, restraining utilization in future years may require additional effort from the Oregon and Colorado Medicaid ACO models.

Conclusions

A wide variety of Medicaid reforms is under way in the United States. Some states have emphasized a greater role for patient responsibility through the imposition of co-payments or health savings accounts, while others have emphasized delivery system reform as a path toward a higher-quality and financially sustainable public insurance program. Our study of 2 years of postintervention data from Medicaid ACO reforms found relative performance improvements in several aspects of care in Oregon compared with Colorado, but identified no significant differences in standardized expenditures for selected services. These results should be considered in the context of overall promising trends in both states. Continued evaluation of Medicaid reforms and payment models can inform the most effective approaches to improving and sustaining the value of this growing public program.

Article Information

Corresponding Author: K. John McConnell, PhD, Center for Health Systems Effectiveness, Oregon Health & Science University, 3181 SW Sam Jackson Park Rd, Mail Code MDYCHSE, Portland, OR 97239 (mcconnjo@ohsu.edu).

Accepted for Publication: September 14, 2016.

Published Online: February 13, 2017. doi:10.1001/jamainternmed.2016.9098

Author Contributions: Dr McConnell had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: McConnell, Renfro, Mendelson, Cohen, McCarty, Wallace, Lindrooth.

Acquisition, analysis, or interpretation of data: McConnell, Renfro, Chan, Meath, Mendelson, Waxmonsky, Wallace, Lindrooth.

Drafting of the manuscript: McConnell, Chan, Mendelson.

Critical revision of the manuscript for important intellectual content: McConnell, Renfro, Meath, Mendelson, Cohen, Waxmonsky, McCarty, Lindrooth, Wallace.

Statistical analysis: McConnell, Renfro, Wallace, Lindrooth.

Obtained funding: McConnell, McCarty, Lindrooth.

Administrative, technical, or material support: McConnell, Renfro, Chan, Meath, Mendelson, Cohen, Waxmonsky, McCarty.

Study supervision: McConnell, Waxmonsky.

Conflict of Interest Disclosures: None reported.

Funding/Support: This research was funded by grant 1R01MH1000001 from the National Institutes of Health (NIH) Common Fund Health Economics Program and the Silver Family Foundation and by NIH grant R33 DA035640.

Role of the Funder/Sponsor: The funding agencies had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

References
1.
National Association of State Budget Officers. State Expenditure Report: examining fiscal 2012-2014 state spending. Washington, DC: National Association of State Budget Officers. https://higherlogicdownload.s3.amazonaws.com/NASBO/9d2d2db1-c943-4f1b-b750-0fca152d64c2/UploadedImages/SER_Archive/State_Expenditure_Report_Fiscal_2012_2014_S.pdf. Published 2014. Accessed January 7, 2016.
2.
Kocot SL, Dang-Vu C, White R, McClellan M. Early experiences with accountable care in Medicaid: special challenges, big opportunities. Popul Health Manag. 2013;16(suppl 1):S4-S11.
3.
Center for Health Care Strategies Inc. Medicaid Accountable Care Organizations: state update. http://www.chcs.org/resource/medicaid-accountable-care-organizations-state-update/. Accessed August 15, 2016.
4.
McConnell KJ, Chang AM, Cohen DJ, et al. Oregon’s Medicaid transformation: an innovative approach to holding a health system accountable for spending growth. Healthc (Amst). 2014;2(3):163-167.
5.
Stecker EC. The Oregon ACO experiment—bold design, challenging execution. N Engl J Med. 2013;368(11):982-985.
6.
Howard SW, Bernell SL, Yoon J, Luck J. Oregon’s coordinated care organizations: a promising and practical reform model. J Health Polit Policy Law. 2014;39(4):933-940.
7.
Pollack HA. Oregon’s coordinated care organizations. J Health Polit Policy Law. 2014;39(4):929-931.
8.
Chang AM, Cohen DJ, McCarty D, Rieckmann T, McConnell KJ. Oregon’s Medicaid transformation—observations on organizational structure and strategy. J Health Polit Policy Law. 2015;40(1):257-264.
9.
McConnell KJ. Oregon’s Medicaid Coordinated Care Organizations. JAMA. 2016;315(9):869-870.
10.
Coughlin TA, Corlette S. ACA implementation—monitoring and tracking; Oregon: site visit report. Urban Institute. http://www.urban.org/UploadedPDF/412498-ACA-Implementation-Monitoring-and-Tracking-Oregon-Site-Visit-Report.pdf. Published 2012. Accessed June 14, 2015.
11.
Oregon Health Authority. Oregon’s health system transformation: 2014 final report. http://www.oregon.gov/oha/Metrics/Documents/2014 Final Report-June 2015.pdf. Published June 24, 2015. Accessed November 1, 2015.
12.
Colorado Department of Health Care Policy and Financing. Creating a culture of change: accountable care collaborative, 2014 annual report. https://www.colorado.gov/pacific/sites/default/files/Accountable Care Collaborative 2014 Annual Report.pdf. Accessed March 13, 2016.
13.
Lindrooth RC, Tung G, Santos T, O’Leary S. Evaluation of the Accountable Care Collaborative: year 1 report. https://www.colorado.gov/pacific/sites/default/files/Supporting a Culture of Coverage Accountable Care Collaborative 2014-15 Annual Report.pdf. Accessed August 15, 2016.
14.
Chronic Illness and Disability Payment System, Version 5.3. San Diego, CA: University of California; 2011.
15.
Stuart EA, Huskamp HA, Duckworth K, et al. Using propensity scores in difference-in-differences models to estimate the effects of a policy change. Health Serv Outcomes Res Methodol. 2014;14(4):166-182.
16.
Centers for Medicare & Medicaid Services. Berenson-Eggers Type of Service. https://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/MedicareFeeforSvcPartsAB/downloads/betosdesccodes.pdf. Published 2015. Accessed March 14, 2015.
17.
National Committee for Quality Assurance. HEDIS Technical Specifications for Health Plans, Volume 2. Washington, DC: National Committee for Quality Assurance; 2014.
18.
Schwartz AL, Chernew ME, Landon BE, McWilliams JM. Changes in low-value services in year 1 of the Medicare Pioneer Accountable Care Organization program. JAMA Intern Med. 2015;175(11):1815-1825.
19.
Medi-Cal Managed Care Division, California Department of Health Care Services. Statewide collaborative quality improvement project: reducing avoidable emergency room visits. http://www.dhcs.ca.gov/dataandstats/reports/Documents/MMCD_Qual_Rpts/EQRO_QIPs/CA2011-12_QIP_Coll_ER_Remeasure_Report.pdf. Published June 2012. Accessed February 10, 2015.
20.
Agency for Healthcare Research and Quality. Prevention Quality Indicators overview. http://www.qualityindicators.ahrq.gov/modules/pqi_overview.aspx. Accessed February 21, 2015.
21.
Joynt KE, Gawande AA, Orav EJ, Jha AK. Contribution of preventable acute care spending to total spending for high-cost Medicare patients. JAMA. 2013;309(24):2572-2578.
22.
Austin PC. A critical appraisal of propensity-score matching in the medical literature between 1996 and 2003. Stat Med. 2008;27(12):2037-2049.
23.
Austin PC. Balance diagnostics for comparing the distribution of baseline covariates between treatment groups in propensity-score matched samples. Stat Med. 2009;28(25):3083-3107.
24.
Ryan AM, Burgess JF Jr, Dimick JB. Why we should not be indifferent to specification choices for difference-in-differences. Health Serv Res. 2015;50(4):1211-1235.
25.
Angrist JD, Pischke J-S. Mostly Harmless Econometrics: An Empiricist’s Companion. Princeton, NJ: Princeton University Press; 2008.
26.
The Dartmouth Atlas of Health Care. Dartmouth Atlas primary care service area data. http://www.dartmouthatlas.org/tools/downloads.aspx?tab=42. Accessed June 10, 2015.
27.
Bertrand M, Duflo E, Mullainathan S. How much should we trust differences-in-differences estimates? Q J Econ. 2004;119(1):249-275.
28.
Osborne NH, Nicholas LH, Ryan AM, Thumma JR, Dimick JB. Association of hospital participation in a quality reporting program with surgical outcomes and expenditures for Medicare beneficiaries. JAMA. 2015;313(5):496-504.
29.
The Henry J. Kaiser Family Foundation. Total monthly Medicaid and CHIP enrollment. http://kff.org/health-reform/state-indicator/total-monthly-medicaid-and-chip-enrollment/. Updated 2017. Accessed March 30, 2016.
30.
Dale SB, Ghosh A, Peikes DN, et al. Two-year costs and quality in the Comprehensive Primary Care initiative. N Engl J Med. 2016;374(24):2345-2356.
31.
Young K, Clemans-Cope L, Lawton E, Holahan J. Medicaid spending growth in the Great Recession and its aftermath, FY 2007-2012. The Henry J. Kaiser Family Foundation. https://kaiserfamilyfoundation.files.wordpress.com/2014/07/8309-03-medicaid-spending-growth-in-the-great-recession-and-its-aftermath-fy-2007-2012.pdf. Accessed July 11, 2014.
32.
Martin AB, Hartman M, Benson J, Catlin A; National Health Expenditure Accounts Team. National health spending in 2014: faster growth driven by coverage expansion and prescription drug spending. Health Aff (Millwood). 2016;35(1):150-160.