Figure 1.  Flow Diagram for Creation of the Maintenance of Certification (MOC)–Required and MOC-Grandfathered Beneficiary Cohorts

aExcludes general internists who subspecialized after their internal medicine certification.
bClaims that were less than or equal to $0.00, had charges that were not allowed by Medicare for processing, or had missing UPINs were excluded from the count.
cBeneficiary counts per MOC-required and MOC-grandfathered general internist groups are not mutually exclusive and so do not sum to the total because one beneficiary may have a contact with a general internist in each group.
dIndividual exclusion criteria are not mutually exclusive.

Figure 2.  Flow Diagram for Creation of the Beneficiary Sample Attributed to a Primary Care Physician

MOC indicates Maintenance of Certification. Relative to Figure 1, 212 physicians were excluded because they had at least 1 ambulatory visit with a beneficiary between 1999 and 2005 but no attributed beneficiaries in 1999, 2000, or 2001. A 3-year window was used in the attribution for constructing the beneficiary panel: for pre-MOC, 1999 to 2001; for post-MOC, 2001 to 2003.
aBeneficiary counts per MOC-required and MOC-grandfathered general internist groups are not mutually exclusive and so do not sum to the total because one beneficiary may have a contact with a general internist in each group.
bTotal beneficiary counts reflect the sample after propensity score matching.

Figure 3.  Ambulatory Care–Sensitive Hospitalizations and Health Care Costs by Maintenance of Certification (MOC) Cohort and Across Years
Table 1.  Beneficiary Cohort Characteristics at Baseline (1999-2000)
Table 2.  Beneficiary Cohort Health Care Measures at Baseline (1999-2000)
Table 3.  Regression-Based Estimates of Associations Between the Maintenance of Certification (MOC) Requirement and Differences in Outcome Growth Across Beneficiary Cohorts and MOC Periods
Table 4.  Regression-Based Estimates of Associations Between the Maintenance of Certification (MOC) Requirement and Changes in Outcome Across Cohorts for Each Post-MOC Year vs the Pre-MOC Period
Audio Author Interview (8:28)

Physicians generally recognize the need for maintenance of certification (MOC), but recent changes in the American Board of Internal Medicine's MOC requirements have led some to question whether the process is relevant to contemporary clinical practice or meaningful as a measure of physician and health care quality. To date, disagreements about MOC have been based more on principle than on evidence. However, two recent landmark research articles in JAMA have shed light on the relationship between MOC and measures relevant to patients and physicians. We invite you to listen as JAMA editors, authors, and distinguished panelists discuss what these new data might mean for patients and physicians, the professionalism and health care quality movements, and the debate over whether it is possible to identify the 'good' physician.

Supplement.

eMethods 1. Methods for creation of the physician-patient sample

eMethods 2. Description of contact specialty type and visit type criteria for attribution

eMethods 3. Description of ambulatory care sensitive hospitalizations

eMethods 4. Median values and incidence for each dependent measure in the pre-MOC (1999-2000) and post-MOC (2002-2005) periods

eMethods 5. Demographic characteristics of the MOC-required and MOC-grandfathered physician cohorts

eMethods 6. Beneficiary attribution methods and sensitivities

eMethods 7. Estimation specification sensitivities

eMethods 8. Inclusion of post-acute care costs

eMethods 9. Test of pre-period trend differences

eMethods 10. Test of differential changes in trend between the MOC-required and MOC-grandfathered cohorts in the post-MOC period across dependent measures

eMethods 11. Unadjusted dependent measures by year and MOC group

eMethods 12. Definition of chronic condition indicators

eMethods 13. List of variables used in the propensity matching models

eMethods 14. Definition of spending categories

eTable 1. Conditions that comprise ambulatory care sensitive hospitalizations

eTable 2. Median values for dependent measures in the pre-MOC (1999-2000) and post-MOC (2002-2005) periods for the MOC-required and MOC-grandfathered beneficiary panels

eTable 3. Number of beneficiary-year observations where we observed any event or cost greater than zero in the pre-MOC (1999-2000) and post-MOC (2002-2005) periods for the MOC-required and MOC-grandfathered beneficiary panels

eTable 4. Demographic characteristics of the MOC-required and MOC-grandfathered physicians in the attributed beneficiary cohorts

eTable 5. Attribution method sensitivity test results

eTable 6. Estimation specification sensitivity test results

eTable 7. Test of pre-period trends across dependent measures

eTable 8. Test of pre-period trends across total costs and ambulatory costs utilizing data compiled quarterly

eTable 9. Test of differential changes in trend between the MOC-required and MOC-grandfathered cohorts in the post-MOC period across dependent measures

eTable 10. Test of differential changes in trend between the MOC-required and MOC-grandfathered cohorts in the post-MOC period across dependent measures utilizing data compiled quarterly

eTable 11. Yearly differential change associated with MOC for each year in the post period vs the pre-period

eTable 12. Individual HCC indicators within each HCC category

eTable 13. Classification of costs based on BETOS codes

eFigure 1. Any ambulatory care sensitive hospitalization unadjusted yearly means

eFigure 2. Acute ambulatory care sensitive hospitalization unadjusted yearly means

eFigure 3. Chronic ambulatory care sensitive hospitalization unadjusted yearly means

eFigure 4. Annual incidence of a hospitalization unadjusted yearly means

eFigure 5. Annual incidence of an emergency department visit unadjusted yearly means

eFigure 6a. Total costs unadjusted yearly means

eFigure 6b. Total costs unadjusted quarterly means

eFigure 7a. Ambulatory costs unadjusted yearly means

eFigure 7b. Ambulatory costs unadjusted quarterly means

eFigure 8. Inpatient costs unadjusted yearly means

eFigure 9. Major procedure costs unadjusted yearly means

eFigure 10. Minor procedures and endoscopy costs unadjusted yearly means

eFigure 11. Imaging costs unadjusted yearly means

eFigure 12. Laboratory testing costs unadjusted yearly means

eFigure 13. Testing aggregate (imaging and laboratory tests) unadjusted yearly means

eFigure 14. Specialty visit costs unadjusted yearly means

eFigure 15. Nonspecialty visit costs unadjusted yearly means

eReferences

3 Comments for this article
The Start of A Conversation About MOC
Howard Bauchner, MD | Journal of the American Medical Association
Since the American Board of Internal Medicine changed its maintenance of certification (MOC) requirements early this year, the forms MOC takes and its meaning for contemporary clinical practice have come under intense scrutiny. Two studies in this issue of JAMA (this article and http://ja.ma/1vLM38u) suggest no association between MOC and quality measures. Despite these findings, Dr. Thomas Lee suggests in an editorial (http://ja.ma/1vLM38) that physicians have a professional responsibility to have an MOC process in place. Do you agree? MOC has changed just as health care has. Is the MOC process any better or worse now than in the past? What should the future of MOC look like? We invite your comments on this and our other articles.
CONFLICT OF INTEREST: JAMA Editor in Chief; not currently participating in maintenance of certification activities.
The grandfathered internists had 20 percent more practice experience
David Louis Keller, MD, FACP | Independent Internist
The grandfathered internists in this study were certified in 1989, while the MOC'ed internists were certified in 1991. The study was conducted in 2001, so the grandfathered internists had 12 years in practice, while the MOC'ed internists had only 10 years in practice. The difference of 20 percent more practice experience could account for much of the difference between the two groups; for example, the fact that the more experienced internists spent more money on patient care might reflect their increased experience in billing Medicare and insurance companies, which is knowledge not taught during the MOC process.
CONFLICT OF INTEREST: None Reported
Study does not mention controls for physician residency training site and health care expenditures
T Tsai, MD | Independent
I am citing the JAMA study "Spending Patterns in Region of Residency Training and Subsequent Expenditures for Care Provided by Practicing Physicians for Medicare Beneficiaries" (http://jama.jamanetwork.com/article.aspx?articleid=2020373), which suggests that physicians who train in areas where health care expenditures are higher tend to carry those higher spending habits with them into practice. I listened to the webcast on 12/17, and this exact question was brought up to the lead author, but unfortunately I think he misunderstood what was being asked, as I did not hear an answer that at all addressed the question. This study does not cite the above article or bring up the relation of residency training to health care spending in post-graduate practice. The lead author's misinterpretation of the question when it was raised on the webcast suggests a fundamental lack of understanding of the myriad variables in patient care and puts this study's foundation on uneven ground. It is not much of a logical leap to conclude that a disproportionate number of physicians from higher-spending residency locations in the grandfathered cohort of this study could account for the small difference in spending between the two groups. Without this issue being addressed, any conclusions taken from this study other than "Null" should be suspect.
CONFLICT OF INTEREST: None Reported
Original Investigation
December 10, 2014

Association Between Imposition of a Maintenance of Certification Requirement and Ambulatory Care–Sensitive Hospitalizations and Health Care Costs

Author Affiliations
  • 1American Board of Internal Medicine, Philadelphia, Pennsylvania
  • 2James Madison University, Harrisonburg, Virginia
  • 3Mathematica Policy Research, Washington, DC
  • 4Accreditation Council for Graduate Medical Education, Chicago, Illinois
  • 5University of Minnesota, Minneapolis
JAMA. 2014;312(22):2348-2357. doi:10.1001/jama.2014.12716
Abstract

Importance  In 1990, the American Board of Internal Medicine (ABIM) ended lifelong certification by initiating a 10-year Maintenance of Certification (MOC) program that first took effect in 2000. Despite the importance of this change, there has been limited research examining associations between the MOC requirement and patient outcomes.

Objective  To measure associations between the original ABIM MOC requirement and outcomes of care.

Design, Setting, and Participants  Quasi-experimental comparison between outcomes for Medicare beneficiaries treated in 2001 by 2 groups of ABIM-certified internal medicine physicians (general internists). One group (n = 956), initially certified in 1991, was required to fulfill the MOC program in 2001 (MOC-required) and treated 84 215 beneficiaries in the sample; the other group (n = 974), initially certified in 1989, was grandfathered out of the MOC requirement (MOC-grandfathered) and treated 69 830 similar beneficiaries in the sample. We compared differences in outcomes for the beneficiary cohort treated by the MOC-required general internists before (1999-2000) and after (2002-2005) they were required to complete MOC, using the beneficiary cohort treated by the MOC-grandfathered general internists as the control.

Main Outcomes and Measures  Quality measures were ambulatory care–sensitive hospitalizations (ACSHs), measured using prevention quality indicators. Ambulatory care–sensitive hospitalizations are hospitalizations triggered by conditions thought to be potentially preventable through better access to and quality of outpatient care. Other outcomes included health care cost measures (adjusted to 2013 dollars).

Results  Annual incidence of ACSHs (per 1000 beneficiaries) increased from the pre-MOC period (37.9 for MOC-required beneficiaries vs 37.0 for MOC-grandfathered beneficiaries) to the post-MOC period (61.8 for MOC-required beneficiaries vs 61.4 for MOC-grandfathered beneficiaries) for both cohorts, as did annual per-beneficiary health care costs (pre-MOC period, $5157 for MOC-required beneficiaries vs $5133 for MOC-grandfathered beneficiaries; post-MOC period, $7633 for MOC-required beneficiaries vs $7793 for MOC-grandfathered beneficiaries). The MOC requirement was not statistically associated with cohort differences in the growth of the annual ACSH rate (per 1000 beneficiaries, 0.1 [95% CI, −1.7 to 1.9]; P = .92), but was associated with a cohort difference in the annual, per-beneficiary cost growth of −$167 (95% CI, −$270.5 to −$63.5; P = .002; 2.5% of overall mean cost).

Conclusions and Relevance  Imposition of the MOC requirement was not associated with a difference in the increase in ACSHs but was associated with a small relative reduction in the growth of costs for a cohort of Medicare beneficiaries.

Introduction

One of the largest changes in physician accreditation policy was the initiation of a 10-year Maintenance of Certification (MOC) requirement in 1990 by the American Board of Internal Medicine (ABIM). This change was also adopted by 24 certifying boards of the American Board of Medical Specialties (ABMS), affecting 85% of all US physicians.1 Despite the breadth of this policy change and considerable efforts by certifying boards to apply these requirements and by physicians to complete the requirements, to our knowledge, no studies have examined the association between this policy change and either the quality or efficiency of care.2 Previous studies showed positive relationships between initial certification and better health outcomes3,4 and between certification examination performance and quality5 and efficiency6 of care measures. Although these studies validated certification as a marker of competence, none have examined the consequences of the MOC requirement itself.

We addressed this gap by using a natural experiment in which 1 group of general internal medicine physicians (general internists) had to fulfill the 10-year MOC requirement and another group did not because they had originally certified 2 years earlier. Using outcomes for 2 cohorts of Medicare beneficiaries treated by these 2 groups 10 years after their original certification, we measured the association between the original MOC requirement and changes in ambulatory care–sensitive hospitalizations (ACSHs) and health care costs. In doing so, we tested the hypotheses that the MOC requirement was associated with higher-quality and more efficient care.

Methods
Physician and Patient Sample

The study protocol was approved by the Essex Institutional Review Board. The physician sample consisted of nonsubspecializing general internists originally certified in 1989 or 1991. The 1989 general internists were grandfathered out of the requirement (MOC-grandfathered) and the 1991 general internists were required to complete MOC by 2001 (MOC-required). We chose general internists who originally certified in 1991 rather than 1990 (when the requirement was first instituted) because the 1990 group—the first to be required to complete MOC—was less likely to enroll and complete MOC and, among those completing MOC, more likely to substantially delay their enrollment and completion. These problems were rectified in the 1991 group. We assumed that the 2-year difference in the initial certification year between the MOC-required and MOC-grandfathered general internists would not cause their delivery of health care services and types of beneficiaries treated to be different 10 years later.

Using Medicare claims, we drew the beneficiary sample from beneficiaries who had any billing contact with the general internist sample in 2001. To follow this cohort over time, we then obtained Medicare claims for these beneficiaries from 1999 to 2005 (the study period). We subsequently excluded beneficiaries who were younger than 65 years on January 1, 1999, enrolled in Medicare Advantage, or resided outside the United States during the study period to provide continuous claims data from 1999 through 2005. We then limited the sample to beneficiaries whose primary physician was either an MOC-required general internist (MOC-required beneficiary) or an MOC-grandfathered general internist (MOC-grandfathered beneficiary).
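
As a reading aid, a minimal sketch of the exclusion step described above; the column names are hypothetical, and the study applied these criteria to Medicare enrollment and claims files rather than to a single flat table:

```python
import pandas as pd

def apply_exclusions(benes: pd.DataFrame) -> pd.DataFrame:
    """Keep beneficiaries aged 65 years or older on January 1, 1999, never enrolled in
    Medicare Advantage, and never residing outside the US during 1999-2005.
    Column names are illustrative placeholders."""
    return benes[
        (benes["age_jan1_1999"] >= 65)
        & (~benes["any_medicare_advantage_1999_2005"])
        & (~benes["any_residence_outside_us_1999_2005"])
    ]
```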

Determination of a potential primary physician was based on specialty types that typically render primary care services or manage chronic conditions (eMethods 2 in the Supplement). Beneficiaries were attributed to general internists who provided the plurality of the patient’s evaluation and management visits occurring in ambulatory care settings (ie, general internist with the most eligible visits). Because we were trying to identify changes in practice due to MOC, our attribution methodology was designed to identify beneficiaries who had an ongoing relationship with an ABIM general internist before and after 2001. For a beneficiary to be attributed to a general internist, the general internist needed to be their plurality general internist for at least 2 individual years in both the 1999-2001 period and the 2001-2003 period and be the plurality provider across each of these 3-year periods (eMethods 6 in the Supplement).
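
To make the attribution rule concrete, a hedged sketch with hypothetical column names; the full contact-specialty, visit-type, and 3-year plurality criteria are in eMethods 2 and eMethods 6 of the Supplement:

```python
import pandas as pd

def yearly_plurality_internist(visits: pd.DataFrame) -> pd.Series:
    """For each beneficiary-year, return the general internist (npi) with the most
    eligible ambulatory evaluation and management visits (ties broken arbitrarily).
    Expected columns: bene_id, year, npi."""
    counts = (
        visits.groupby(["bene_id", "year", "npi"]).size().rename("n_visits").reset_index()
    )
    return counts.sort_values("n_visits").groupby(["bene_id", "year"]).last()["npi"]

def meets_attribution_rule(plurality_by_year: dict, npi: str) -> bool:
    """Simplified check: the internist must be the beneficiary's plurality internist in
    at least 2 individual years of 1999-2001 and of 2001-2003; the additional requirement
    of being the plurality provider across each full 3-year window (eMethods 6) is
    omitted here."""
    pre = sum(plurality_by_year.get(y) == npi for y in (1999, 2000, 2001))
    post = sum(plurality_by_year.get(y) == npi for y in (2001, 2002, 2003))
    return pre >= 2 and post >= 2
```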

We balanced the characteristics of the attributed treatment (MOC-required) and control (MOC-grandfathered) beneficiary cohorts by matching on propensity scores, which were constructed by estimating a logistic regression of the likelihood of being in the treatment cohort on baseline demographic characteristics, chronic conditions, and the characteristics of attributed general internists (eMethods 13 in the Supplement). We used nearest-neighbor matching within a 0.2 caliper without replacement.7,8 We retained all beneficiaries in the MOC-required cohort who matched to at least 1 beneficiary in the MOC-grandfathered cohort to maintain the generalizability of our treated sample.
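
A hedged sketch of the matching step: greedy 1-to-1 nearest-neighbor matching on the propensity score without replacement, keeping only matches within the caliper. The covariate list is illustrative (the full list is in eMethods 13), and the caliper is taken here as 0.2 standard deviations of the logit of the propensity score, following the cited Austin recommendation; the study itself reports only "a 0.2 caliper":

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_cohorts(df: pd.DataFrame, covariates: list[str]) -> pd.DataFrame:
    """Illustrative sketch, not the study code. df needs a 0/1 'treated' column
    (1 = MOC-required beneficiary) plus baseline covariate columns."""
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
    ps = model.predict_proba(df[covariates])[:, 1]
    logit = np.log(ps / (1 - ps))
    caliper = 0.2 * logit.std()      # 0.2 SD of the logit: an assumption, following Austin (ref 7)
    treated_idx = list(df.index[df["treated"] == 1])
    control_idx = list(df.index[df["treated"] == 0])
    pairs = []
    for t in treated_idx:
        if not control_idx:
            break
        d = np.abs(logit[df.index.get_indexer(control_idx)] - logit[df.index.get_loc(t)])
        j = int(d.argmin())
        if d[j] <= caliper:                        # treated beneficiaries without a match are dropped
            pairs.append((t, control_idx.pop(j)))  # without replacement
    return pd.DataFrame(pairs, columns=["treated_id", "control_id"])
```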

Outcome Measures

We used annual incidence of an ACSH as the primary quality measure, reported as the number per 1000 beneficiaries. Ambulatory care–sensitive hospitalizations are hospitalizations triggered by conditions thought to be potentially preventable through better access to and quality of outpatient care.9 For example, patients with diabetes are more likely to be hospitalized for diabetic complications if they are not adequately monitored or fail to receive the patient education needed for self-management. Our ACSH measures were developed by the Agency for Healthcare Research and Quality and are also known as prevention quality indicators.9 We used 3 annual incidence measures of ACSHs: any ACSH, any chronic condition ACSH, and any acute condition ACSH. A list of the conditions and procedures used to construct the ACSH measures is included in eMethods 3 in the Supplement. Other service utilization measures included any hospitalization and any emergency department visit during a year.
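
For clarity on units, a minimal sketch of how an annual incidence per 1000 beneficiaries can be computed from beneficiary-year indicator data (hypothetical column names; the AHRQ prevention quality indicator definitions in eMethods 3 determine the 0/1 indicator itself):

```python
import pandas as pd

def annual_incidence_per_1000(bene_years: pd.DataFrame, indicator: str) -> pd.Series:
    """bene_years holds one row per beneficiary-year with a 0/1 column such as
    'any_acsh'; returns the yearly incidence per 1000 beneficiaries."""
    return bene_years.groupby("year")[indicator].mean() * 1000
```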

The main cost measure was per-beneficiary annual health care costs, excluding post-acute health care costs (see eMethods 8 in the Supplement for a sensitivity test including post-acute costs). Costs were also delineated by ambulatory and inpatient setting and by the following Berenson-Eggers Type of Service (BETOS) code categories: major procedures, minor procedures, specialty office visits, nonspecialty office visits, imaging, and laboratory tests (eMethods 14 in the Supplement).10 All cost measures were adjusted to 2013 dollars using the US Bureau of Labor Statistics' consumer price index for all urban consumers.11
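
A small sketch of the inflation adjustment; the CPI-U index values must be supplied from the BLS series cited in reference 11 (none are hard-coded here):

```python
def to_2013_dollars(cost: float, year: int, cpi_u: dict[int, float]) -> float:
    """Inflate a nominal cost to 2013 dollars using the ratio of annual average
    CPI-U index values (cpi_u maps calendar year -> index value from the BLS series)."""
    return cost * cpi_u[2013] / cpi_u[year]
```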

Statistical Analysis

We used a quasi-experimental design comparing outcomes during the pre-MOC (1999-2000) and post-MOC (2002-2005) periods for the MOC-required and MOC-grandfathered beneficiary cohorts. We estimated associations with the MOC requirement by comparing differences in outcome before vs after 2001 between these cohorts. Because we compared beneficiary outcome measures over time and these measures naturally increase as beneficiaries age,12,13 our estimates captured the association between growth in these outcomes and the imposition of the MOC requirement.

To construct estimates of associations with MOC, we regressed each outcome measure, at the beneficiary-year level, on an MOC-required beneficiary cohort indicator, year indicators, baseline (1999-2000) control variables (beneficiary age, sex, race, and 18 chronic condition indicators14), 10 location indicators (US Health and Human Services Regions), and general internist characteristics (sex, initial certifying examination performance). The regressions included an interaction between an indicator for being an MOC-required beneficiary and a post-MOC period indicator. The coefficient on this interaction measures how the difference in outcomes between the MOC-required and MOC-grandfathered beneficiary cohorts changed from before to after imposition of the MOC requirement in 2001; it is our estimate of the association with the MOC requirement.
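
A minimal sketch of the design variables implied by this specification, using hypothetical column names on the beneficiary-year analysis file; the full control set and the hierarchical error structure follow in the paragraphs below:

```python
import pandas as pd

# bene_year: one row per beneficiary per year with the outcome, moc_required (0/1),
# year, and baseline controls (illustrative names).
d = bene_year[bene_year["year"] != 2001].copy()   # 2001 excluded (rationale in the next paragraph)
d["post"] = (d["year"] >= 2002).astype(int)       # post-MOC period indicator
d["moc_x_post"] = d["moc_required"] * d["post"]   # coefficient of interest: differential pre-to-post change
```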

We excluded 2001 from regressions because we assumed MOC-required general internists studied for and took the MOC examination and participated in other MOC requirements (eg, self-assessment of medical knowledge modules) and therefore may not have fully benefited from these activities throughout that year.

To examine the degree to which estimates of association with the MOC requirement changed over the post-MOC period, we applied separate interaction terms between each post-MOC year and the MOC-required indicator. These interactions measured the association between the differences in outcomes between the MOC-required and MOC-grandfathered cohorts before 2001 compared with each post-MOC year and tested whether the MOC associations differed over time. Analyses in the Supplement investigated the application of cohort interaction trends (eMethods 9 and eMethods 10 in the Supplement).
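
The year-specific variant simply replaces the single post-period interaction with one interaction per post-MOC year, for example:

```python
# One interaction term per post-MOC year (the pre-MOC years serve as the reference period).
for y in (2002, 2003, 2004, 2005):
    d[f"moc_x_{y}"] = d["moc_required"] * (d["year"] == y).astype(int)
```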

Considering the 3-level data structure with repeated beneficiary measures nested within general internists, we applied a hierarchical linear multivariable regression model (linear mixed model, StataCorp) with general internist and beneficiary random effects to estimate these relationships (eg, subject-specific effects among beneficiaries).15 This methodology is consistent with that applied in similar recent studies.10,16-18 All statistical tests were 2-sided and an α of .05 denoted statistical significance.
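
A hedged sketch of this hierarchical specification using statsmodels (the study used Stata's linear mixed model); only the physician-level random intercept is shown, and the beneficiary-level random intercept nested within physician, plus most covariates, are omitted for brevity:

```python
import statsmodels.formula.api as smf

# Linear mixed model: fixed effects for cohort, year, and abbreviated baseline controls;
# random intercept for the attributed general internist. The study's model also included
# a beneficiary-level random intercept nested within physician.
m = smf.mixedlm(
    "outcome ~ moc_required + moc_x_post + C(year) + age + C(sex) + C(race)",
    data=d,
    groups="physician_id",        # hypothetical identifier for the attributed general internist
).fit()
print(m.params["moc_x_post"])      # estimated differential pre-to-post change (the MOC association)
```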

We conducted a specification sensitivity analysis (eMethods 7 in the Supplement). Because this model does not account for the dichotomous nature of the incidence measures or the skewed nature of the cost measures, we applied a nonlinear hierarchical model using logit models for dichotomous outcome measures and a generalized linear model with log link and γ distribution for cost models (multilevel mixed-effects generalized linear model, StataCorp) as a sensitivity analysis.15 To consider bias associated with unobserved non–time-varying beneficiary and general internist characteristics not captured in our main model, we also used a panel fixed-effects model that applies the equivalent of beneficiary and general internist indicator variables as controls (XTREG, StataCorp) with beneficiary fixed-effects.15 Other specification sensitivities included adding 2001 as a post-MOC period to consider the possibility that changes in practice affected outcomes during 2001 and, for total costs and ambulatory costs, changing the observation unit from yearly to quarterly.
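
For the cost-model sensitivity, a minimal sketch of a generalized linear model with a log link and gamma family; statsmodels is used purely for illustration (the study used Stata's multilevel mixed-effects GLM), and the random effects are again omitted:

```python
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Gamma GLM with log link for the skewed cost outcomes (sensitivity specification).
glm = smf.glm(
    "total_cost ~ moc_required + moc_x_post + C(year) + age + C(sex) + C(race)",
    data=d[d["total_cost"] > 0],   # the gamma family requires strictly positive outcomes
    family=sm.families.Gamma(link=sm.families.links.Log()),
).fit()
```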

Results

As depicted in Figure 1 and Figure 2, the original sample included 4419 general internists. After exclusions, there were 543 801 beneficiaries who had an ambulatory visit with 2699 ABIM general internists between 1999 and 2005 (the study period). The attribution criteria were met by 192 923 beneficiaries uniquely attributed to 1960 general internists. After propensity score matching, the MOC-required cohort included 84 215 beneficiaries attributed to 956 MOC-required general internists and the MOC-grandfathered cohort included 69 830 beneficiaries attributed to 974 MOC-grandfathered general internists. In the MOC-required beneficiary cohort, 88% were treated by a general internist who completed MOC compared with less than 1% in the MOC-grandfathered cohort.

As shown in Table 1 and Table 2, cohorts of beneficiaries had similar baseline demographic characteristics, prevalence of chronic conditions, hospitalizations, and costs. All differences were statistically insignificant (P values >.05) and small (standardized differences <.10) with the exception of general internist age, which was expected considering their initial certification year. For instance, mean baseline cost difference between MOC-required and MOC-grandfathered beneficiary cohorts was $24 (0.5%; P = .78).

As would be expected due to beneficiary aging during the study period, all outcome measures displayed in Table 3 increased during the pre-MOC to post-MOC period. Table 3 also shows that the MOC requirement was not statistically associated with growth differences between the MOC-required and MOC-grandfathered beneficiary cohorts in the annual incidence of ACSHs per 1000 beneficiaries: total ACSH, 0.1 (95% CI, −1.7 to 1.9), P = .92; acute ACSH, 0.3 (95% CI, −1.1 to 1.6), P = .70; or chronic ACSH, −0.2 (95% CI, −1.5 to 1.2), P = .80. Moreover, the differences were small (eg, the total ACSHs association was 0.2% of the overall mean [0.1/53]). These associations were also nonsignificant for annual incidence of a hospital admission or emergency department visit (P values >.07). The MOC requirement was associated with a small statistically significant annual cost cohort growth difference of −$167 during the post-MOC period (95% CI, −$271 to −$64, P = .002). This corresponds to 2.5% of mean overall cost (−$167/$6814). This estimate was −$84 for ambulatory costs (95% CI, −$122 to −$47, P < .001; 2.3% of mean costs [$84/$3677]) and −$82 for inpatient costs (95% CI, −$162 to −$2, P = .05; 2.6% of mean costs [$82/$3137]). We also found that the MOC requirement was negatively associated with specialty office visits, nonspecialty office visits, laboratory testing, and imaging costs (P values <.01). Major or minor procedure costs were not statistically significantly associated with the MOC requirement (P values >.47).

Yearly unadjusted mean estimates of ACSHs and total costs across cohorts are presented in Figure 3. This Figure illustrates that before 2001, ACSHs and costs were nearly identical across cohorts. This pattern continued for ACSHs in the post-2001 period. However, after 2001, levels of total costs for the MOC-required cohort, although continuing to increase, became consistently lower than costs for the MOC-grandfathered cohort.

As shown in Table 4, MOC was not significantly associated with ACSH cohort growth differences in any particular year post-MOC. For costs, these associations were generally statistically significant and ranged from −$121 in 2005 (95% CI, −$274 to $31, P = .12) to −$191 in 2002 (95% CI, −$335 to −$47, P = .01) per beneficiary year. These yearly associations displayed no trajectory, and individual year associations with the MOC requirement were not statistically different from one another (P values >.10). As shown in the Supplement, trend cohort interactions were not significant.

As shown in sensitivity analyses reported in the Supplement, associations between the MOC requirement and inpatient or nonspecialty office visit costs were statistically insignificant across several attribution and specification sensitivities. Estimates for total, ambulatory, specialty office visit, laboratory testing, and imaging costs remained statistically significant across various sensitivity tests, with varying, but still meaningful, magnitudes. For instance, the association between total costs and the MOC requirement varied from −$172 (95% CI, −$280 to −$64, P = .002) to −$85 (95% CI, −$139 to −$30, P = .002) per beneficiary year across specification and attribution sensitivities.

Discussion

A number of editorials have called for rigorous evaluations of the ABIM MOC requirement.19-21 The goal of this study was to address these calls by taking advantage of a natural experiment involving implementation of the original MOC requirement that allowed us to compare outcomes between 2 beneficiary cohorts treated by similar general internists who were and were not required to complete the MOC program in 2001. Our investigation did not find a statistically significant association between imposition of the MOC requirement and differences in growth in the annual incidence of ACSHs between these cohorts, but did find a small negative statistically significant association with growth differences for health care costs.

Presumably, differences in cost growth could accrue from improvements in patient health or more judicious use of resources associated with MOC.22-24 We were unable to find evidence of improvements in health as the MOC requirement was not associated with ACSHs, which are triggered by poor health. Supporting the contention that MOC led to efficiency gains, we found that the MOC requirement was associated with a decreased growth in costs related to laboratory tests, imaging, and specialty visits.

What aspects of the MOC process might have produced these efficiencies? During the study period, MOC consisted of passing a secure examination and completing medical knowledge modules. These components consisted of vignette-based multiple-choice questions that provided general internists with feedback on their performance and were generally designed to help them use resources appropriately, particularly imaging and other diagnostic tests. Studying for the examination and completing medical knowledge modules may have led to more efficient care by reducing the need for referrals and consultations or reducing the ordering of low-value tests.6,25

The implied per-beneficiary cost savings from our estimates of the reduction in cost growth between cohorts associated with the MOC requirement were around 2.5%, a change that may not have been noticeable to a typical general internist. Although our results are drawn from a small subset of Medicare beneficiaries, general internal medicine physicians represent about 45% of adult primary care physicians.26 Moreover, MOC was not targeted at any particular insurance or patient age group. Therefore, these small individual savings may be large at the population level because there are nearly 50 million Medicare beneficiaries27 with over half a trillion dollars in annual Medicare health care expenditures, exclusive of costs borne by beneficiaries. Furthermore, Medicare expenditures constitute nearly a quarter of total health expenditures.28 Therefore, although more research is needed, even if the true association with MOC is a small fraction of what we report, our estimates suggest that the savings likely far exceed the direct costs of administering the MOC program (for instance, fees associated with the current enhanced MOC program for general internists and subspecialists in fiscal year 2013 were $19 million29) as well as the indirect costs borne by physicians (eg, time spent studying for the MOC examination and completing medical knowledge modules).
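
As a back-of-envelope illustration only, and under the strong, unverified assumption that the estimated per-beneficiary growth difference generalized across the roughly 50 million Medicare beneficiaries cited above, the implied aggregate figure dwarfs the program's reported fees:

```python
# Back-of-envelope only; not a study result, and the true association could be far smaller.
per_beneficiary_year = 167      # estimated cost-growth difference, 2013 dollars
beneficiaries = 50_000_000      # approximate Medicare enrollment (ref 27)
moc_fees_2013 = 19_000_000      # reported fiscal year 2013 MOC fees (ref 29)
implied_total = per_beneficiary_year * beneficiaries
print(f"Implied aggregate: ${implied_total / 1e9:.1f} billion per year vs ${moc_fees_2013 / 1e6:.0f} million in fees")
```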

There are several limitations to our study. First, our attribution method resulted in a beneficiary population with a strong connection with their general internist and so our results may not generalize to other populations. However, sensitivity tests suggest that our results are relatively robust across looser and tighter attributions. Second, our results are only generalizable to fee-for-service Medicare beneficiaries 65 years and older. Third, the data did not allow us to control for characteristics of the general internists’ practice that changed over our study period. Fourth, we were unable to assess a number of important outcomes including patient satisfaction and quality of life. Fifth, the validity of ACSHs as an indicator of quality has been questioned in recent studies.30,31 In particular, the ACSH measures we applied have been shown to be more related to a patient’s access to care than quality at the point of care.32,33

A more general study limitation is the applicability to the current MOC requirement. Before 2006, the MOC requirement consisted of medical knowledge self-evaluation modules combined with an examination of knowledge, diagnostic reasoning, and clinical judgment skills. Although these components remain in place, the MOC process has been enhanced to include a greater emphasis on evaluating outcomes from clinical practice, quality improvement, physician-patient communication, care coordination skills, and patient satisfaction. These new elements may influence any of the effects of MOC, especially the self-evaluation of clinical practice requirement that targets appropriate and inappropriate uses of procedures and tests as well as outcomes of care. Another change to MOC that is not accounted for in this analysis is the recent change to require more continuous participation in MOC activities.34 In addition, secular changes in health care delivery since the early 2000s, such as the expansion of electronic health records and clinical technologies, may narrow the gap between physicians of varying quality.

Considering these limitations, the research should be replicated for the current MOC program using more robust measures of care quality, across other patient populations as well as across internal medicine subspecialties, and for other certification boards’ MOC requirements. With respect to the study findings, more research is needed to determine whether the negative associations we report between MOC and growth in costs were due to improvements in care quality not captured by our quality measure, reductions in wasteful practices unrelated to patient outcomes, or negative consequences not captured by our outcome measures.

Conclusions

Imposition of the MOC requirement was not associated with a difference in the increase in ACSHs but was associated with a small relative reduction in the growth of costs for a cohort of Medicare beneficiaries.

Article Information

Corresponding Author: Bradley M. Gray, PhD, 510 Walnut St, Ste 1700, Philadelphia, PA 19106-3699 (bradleygraypub2014@gmail.com).

Author Contributions: Dr Gray had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Gray, Reschovsky, Lynn, Holmboe, Lipner.

Acquisition, analysis, or interpretation of data: Gray, Vandergrift, Johnston, Reschovsky, Holmboe, McCullough, Lipner.

Drafting of the manuscript: Gray, Vandergrift, Johnston, Reschovsky, Lynn, Holmboe, Lipner.

Critical revision of the manuscript for important intellectual content: Gray, Vandergrift, Reschovsky, Lynn, Holmboe, McCullough, Lipner.

Statistical analysis: Gray, Vandergrift, Johnston, Reschovsky, McCullough, Lipner.

Administrative, technical, or material support: Vandergrift, Johnston, Reschovsky, Lynn, Lipner.

Study supervision: Gray, Lipner.

Conflict of Interest Disclosures: All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Dr Holmboe reports being an employee of the Accreditation Council for Graduate Medical Education, but was an employee of the American Board of Internal Medicine at the study’s inception; and receiving royalties from Mosby-Elsevier for a textbook on assessment of clinical competence. No other disclosures were reported.

Funder/Sponsor: Financial and material support was provided by the American Board of Internal Medicine (ABIM).

Role of the Funder/Sponsor: The ABIM had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

References
1. Miller SH. American Board of Medical Specialties and repositioning for excellence in lifelong learning: Maintenance of Certification. J Contin Educ Health Prof. 2005;25(3):151-156.
2. Levinson W, King TE Jr, Goldman L, Goroll AH, Kessler B. Clinical decisions: American Board of Internal Medicine Maintenance of Certification program. N Engl J Med. 2010;362(10):948-952.
3. Norcini JJ, Lipner RS, Kimball HR. Certifying examination performance and patient outcomes following acute myocardial infarction. Med Educ. 2002;36(9):853-859.
4. Pham HH, Schrag D, Hargraves JL, Bach PB. Delivery of preventive services to older adults by primary care physicians. JAMA. 2005;294(4):473-481.
5. Holmboe ES, Wang Y, Meehan TP, et al. Association between Maintenance of Certification examination scores and quality of care for Medicare beneficiaries. Arch Intern Med. 2008;168(13):1396-1403.
6. Sirovich BE, Lipner RS, Johnston M, Holmboe ES. The association between residency training and internists' ability to practice conservatively [published online September 1, 2014]. JAMA Intern Med. 2014;174(10):1640-1648.
7. Austin PC. Optimal caliper widths for propensity-score matching when estimating differences in means and differences in proportions in observational studies. Pharm Stat. 2011;10(2):150-161.
8. Austin PC. An introduction to propensity score methods for reducing the effects of confounding in observational studies. Multivariate Behav Res. 2011;46(3):399-424.
9. Agency for Healthcare Research and Quality. Quality indicator user guide: prevention quality indicators (PQI) composite measures. http://www.qualityindicators.ahrq.gov/Downloads/Modules/PQI/V44/Composite_User_Technical_Specification_PQI_V4.4.pdf. Accessed August 12, 2014.
10. McWilliams JM, Landon BE, Chernew ME. Changes in health care spending and quality for Medicare beneficiaries associated with a commercial ACO contract. JAMA. 2013;310(8):829-836.
11. US Bureau of Labor Statistics. Consumer price index, all urban consumers. http://www.bls.gov/cpi/#tables. Accessed July 1, 2013.
12. Alemayehu B, Warner KE. The lifetime distribution of health care costs. Health Serv Res. 2004;39(3):627-642.
13. Meerding WJ, Bonneux L, Polder JJ, Koopmanschap MA, van der Maas PJ. Demographic and epidemiological determinants of healthcare costs in Netherlands: cost of illness study. BMJ. 1998;317(7151):111-115.
14. Pope G, Kautter J, Ingber M, Freeman S, Sekar R, Newhart C. Evaluation of the CMS-HCC risk adjustment model: final report. https://www.cms.gov/Medicare/Health-Plans/MedicareAdvtgSpecRateStats/downloads/Evaluation_Risk_Adj_Model_2011.pdf. Accessed May 12, 2014.
15. StataCorp. Stata Statistical Software: Release 13. http://www.stata.com/manuals13/me.pdf. Accessed October 16, 2014.
16. Friedberg MW, Schneider EC, Rosenthal MB, Volpp KG, Werner RM. Association between participation in a multipayer medical home intervention and changes in quality, utilization, and costs of care. JAMA. 2014;311(8):815-825.
17. Colla CH, Wennberg DE, Meara E, et al. Spending differences associated with the Medicare Physician Group Practice Demonstration. JAMA. 2012;308(10):1015-1023.
18. Song Z, Safran DG, Landon BE, et al. Health care spending and quality in year 1 of the alternative quality contract. N Engl J Med. 2011;365(10):909-918.
19. Kritek PA, Drazen JM. Clinical decisions: American Board of Internal Medicine Maintenance of Certification program—polling results. N Engl J Med. 2010;362(15):e54.
20. Goldman L, Goroll A, Kessler B. Do not enroll in the current MOC program. N Engl J Med. 2010;362:950-952. doi:10.1056/NEJMclde0911205.
21. Iglehart JK, Baron RB. Ensuring physicians' competence—is Maintenance of Certification the answer? N Engl J Med. 2012;367(26):2543-2549.
22. Hawkins RE, Lipner RS, Ham HP, Wagner R, Holmboe ES. American Board of Medical Specialties Maintenance of Certification: theory and evidence regarding the current framework. J Contin Educ Health Prof. 2013;33(suppl 1):S7-S19.
23. Pham HH, Landon BE, Reschovsky JD, Wu B, Schrag D. Rapidity and modality of imaging for acute low back pain in elderly patients. Arch Intern Med. 2009;169(10):972-981.
24. Starfield B, Powe NR, Weiner JR, et al. Costs vs quality in different types of primary care settings. JAMA. 1994;272(24):1903-1908.
25. Norcini J, Anderson B, Bollela V, et al. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33(3):206-214.
26. Petterson SM, Liaw WR, Phillips RL Jr, Rabin DL, Meyers DS, Bazemore AW. Projecting US primary care physician workforce needs: 2010-2025. Ann Fam Med. 2012;10(6):503-509.
27. The Kaiser Family Foundation's State Health Facts. Data source: CMS state/county penetration file: total number of Medicare beneficiaries 2012. http://kff.org/medicare/state-indicator/total-medicare-beneficiaries/. Accessed August 12, 2014.
28. Medicare Payment Advisory Commission. A data book: health care spending and the Medicare program. http://www.medpac.gov/-documents-/data-book. Accessed August 12, 2014.
29. American Board of Internal Medicine. Where does the money go? http://www.abim.org/pdf/revenue-expenses.pdf. Accessed August 14, 2014.
30. Davies S, McDonald KM, Schmidt E, Schultz E, Geppert J, Romano PS. Expanding the uses of AHRQ's prevention quality indicators: validity from the clinician perspective. Med Care. 2011;49(8):679-685.
31. Landon BE, O'Malley AJ, McKellar MR, Hadley J, Reschovsky JD. Higher practice intensity is associated with higher quality of care but more avoidable admissions for Medicare beneficiaries. J Gen Intern Med. 2014;29(8):1188-1194.
32. Zhan C, Miller MR, Wong H, Meyer GS. The effects of HMO penetration on preventable hospitalizations. Health Serv Res. 2004;39(2):345-361.
33. Roos LL, Walld R, Uhanova J, Bond R. Physician visits, hospitalizations, and socioeconomic status: ambulatory care–sensitive conditions in a Canadian setting. Health Serv Res. 2005;40(4):1167-1185.
34. American Board of Internal Medicine. What changed in 2014? http://www.abim.org/maintenance-of-certification/faq-2014-program-changes/what-changed-in-2014.aspx. Accessed August 12, 2014.