Figure. Time for Practices to Report Different Electronic Clinical Quality Measures (eCQMs)

The median (interquartile range) time to report any measure was 8.2 (4.6-11.9) months, ranging from 7.8 months for the blood pressure measure to 10.5 months for the cholesterol measure.

Table 1. Practice Characteristics

Table 2. Time to Report Aspirin Prescribing, Blood Pressure Control, Cholesterol Management, and Smoking Cessation Electronic Clinical Quality Measures

Table 3. Univariable Analyses of Practice Characteristics Associated With Less Time to Report Certain Electronic Clinical Quality Measures

Table 4. Final Multivariable Models of Select Practice Characteristics Associated With Ability to Report Electronic Clinical Quality Measures
References

1. Jha AK. Meaningful use of electronic health records: the road ahead. JAMA. 2010;304(15):1709-1710. doi:10.1001/jama.2010.1497
2. Blumenthal D, Tavenner M. The "meaningful use" regulation for electronic health records. N Engl J Med. 2010;363(6):501-504. doi:10.1056/NEJMp1006114
3. Centers for Disease Control and Prevention. Public health and promoting interoperability programs (formerly known as electronic health records meaningful use). https://www.cdc.gov/ehrmeaningfuluse/introduction.html. Accessed January 21, 2019.
4. Panzer RJ, Gitomer RS, Greene WH, Webster PR, Landry KR, Riccobono CA. Increasing demands for quality measurement. JAMA. 2013;310(18):1971-1980. doi:10.1001/jama.2013.282047
5. Petersen LA, Woodard LD, Urech T, Daw C, Sookanan S. Does pay-for-performance improve the quality of health care? Ann Intern Med. 2006;145(4):265-272. doi:10.7326/0003-4819-145-4-200608150-00006
6. Epstein AM, Lee TH, Hamel MB. Paying physicians for high-quality care. N Engl J Med. 2004;350(4):406-410. doi:10.1056/NEJMsb035374
7. Casalino LP, Gans D, Weber R, et al. US physician practices spend more than $15.4 billion annually to report quality measures. Health Aff (Millwood). 2016;35(3):401-406. doi:10.1377/hlthaff.2015.1258
8. Blumenthal D, McGinnis JM. Measuring Vital Signs: an IOM report on core metrics for health and health care progress. JAMA. 2015;313(19):1901-1902. doi:10.1001/jama.2015.4862
9. Kern LM, Malhotra S, Barrón Y, et al. Accuracy of electronically reported "meaningful use" clinical quality measures: a cross-sectional study. Ann Intern Med. 2013;158(2):77-83. doi:10.7326/0003-4819-158-2-201301150-00001
10. Chan KS, Fowles JB, Weiner JP. Review: electronic health records and the reliability and validity of quality measures: a review of the literature. Med Care Res Rev. 2010;67(5):503-527. doi:10.1177/1077558709359007
11. Heisey-Grove DM, Wall HK, Wright JS. Electronic clinical quality measure reporting challenges: findings from the Medicare EHR Incentive Program's Controlling High Blood Pressure measure. J Am Med Inform Assoc. 2018;25(2):127-134. doi:10.1093/jamia/ocx049
12. Fernald DH, Wearner R, Dickinson WP. The journey of primary care practices to meaningful use: a Colorado Beacon Consortium study. J Am Board Fam Med. 2013;26(5):603-611. doi:10.3122/jabfm.2013.05.120344
13. Roth CP, Lim Y-W, Pevnick JM, Asch SM, McGlynn EA. The challenge of measuring quality of care from the electronic health record. Am J Med Qual. 2009;24(5):385-394. doi:10.1177/1062860609336627
14. Cohen DJ, Dorr DA, Knierim K, et al. Primary care practices' abilities and challenges in using electronic health record data for quality improvement. Health Aff (Millwood). 2018;37(4):635-643. doi:10.1377/hlthaff.2017.1254
15. Rao SR, Desroches CM, Donelan K, Campbell EG, Miralles PD, Jha AK. Electronic health records in small physician practices: availability, use, and perceived benefits. J Am Med Inform Assoc. 2011;18(3):271-275. doi:10.1136/amiajnl-2010-000010
16. Meyer GS, Nelson EC, Pryor DB, et al. More quality measures versus measuring what matters: a call for balance and parsimony. BMJ Qual Saf. 2012;21(11):964-968. doi:10.1136/bmjqs-2012-001081
17. Conway PH, Mostashari F, Clancy C. The future of quality measurement for improvement and accountability. JAMA. 2013;309(21):2215-2216. doi:10.1001/jama.2013.4929
18. Phillips R. The PRIME Registry helps thousands of primary care clinicians liberate EHR data and prepare for MIPS. J Am Board Fam Med. 2017;30(4):559. doi:10.3122/jabfm.2017.04.170193
19. Jortberg BT, Fernald DH, Dickinson LM, et al. Curriculum redesign for teaching the PCMH in Colorado Family Medicine Residency programs. Fam Med. 2014;46(1):11-18.
20. Sessums LL, McHugh SJ, Rajkumar R. Medicare's vision for advanced primary care: new directions for care delivery and payment. JAMA. 2016;315(24):2665-2666. doi:10.1001/jama.2016.4472
21. Centers for Medicare & Medicaid Services. Transforming Clinical Practice Initiative. https://innovation.cms.gov/initiatives/Transforming-Clinical-Practices. Accessed January 21, 2019.
22. L&M Policy Research. Innovation Center State-Based Initiatives: A Systematic Review of Lessons Learned. Baltimore, MD: Centers for Medicare & Medicaid Services; 2018.
23. English AF, Dickinson LM, Zittleman L, et al. A community engagement method to design patient engagement materials for cardiovascular health. Ann Fam Med. 2018;16(suppl 1):S58-S64. doi:10.1370/afm.2173
24. Ogrinc G, Davies L, Goodman D, Batalden P, Davidoff F, Stevens D. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): revised publication guidelines from a detailed consensus process. BMJ Qual Saf. 2016;25(12):986-992. doi:10.1136/bmjqs-2015-004411
25. eCQI Resource Center. Ischemic vascular disease (IVD): use of aspirin or another antiplatelet. https://ecqi.healthit.gov/ecqm/measures/cms164v5. Updated May 16, 2019. Accessed August 16, 2018.
26. eCQI Resource Center. Controlling high blood pressure. https://ecqi.healthit.gov/ecqm/measures/cms165v6. Accessed June 26, 2019.
27. eCQI Resource Center. Statin therapy for the prevention and treatment of cardiovascular disease. https://ecqi.healthit.gov/ep/ecqms-2018-performance-period/statin-therapy-prevention-and-treatment-cardiovascular-disease. Updated May 16, 2019. Accessed August 16, 2018.
28. eCQI Resource Center. Preventive care and screening: tobacco use: screening and cessation intervention. https://ecqi.healthit.gov/ecqm/measures/cms138v6. Accessed August 16, 2018.
29. Wright JS, Wall HK, Briss PA, Schooley M. Million hearts—where population health and clinical practice intersect. Circ Cardiovasc Qual Outcomes. 2012;5(4):589-591. doi:10.1161/CIRCOUTCOMES.112.966978
30. Cohen DJ, Balasubramanian BA, Gordon L, et al. A national evaluation of a dissemination and implementation initiative to enhance primary care practice capacity and improve cardiovascular disease care: the ESCALATES study protocol. Implement Sci. 2016;11(1):86. doi:10.1186/s13012-016-0449-8
31. DARTNet Institute. DARTNet website. http://www.dartnet.info/. Accessed August 16, 2018.
32. United States Department of Agriculture. Rural-Urban Commuting Area Codes. https://www.ers.usda.gov/data-products/rural-urban-commuting-area-codes/. Accessed August 16, 2018.
33. Hosmer D, Lemeshow S, Sturdivant R. Model-building strategies and methods for logistic regression. In: Shewhart WA, Wilks SS, eds. Applied Logistic Regression. Hoboken, NJ: John Wiley & Sons; 2000. doi:10.1002/0471722146
34. Fleming NS, Culler SD, McCorkle R, Becker ER, Ballard DJ. The financial and nonfinancial costs of implementing electronic health records in primary care practices. Health Aff (Millwood). 2011;30(3):481-489. doi:10.1377/hlthaff.2010.0768
35. Menachemi N, Collum TH. Benefits and drawbacks of electronic health record systems. Risk Manag Healthc Policy. 2011;4:47-55. doi:10.2147/RMHP.S12985
36. Terry AL, Thorpe CF, Giles G, et al. Implementing electronic health records: key factors in primary care. Can Fam Physician. 2008;54(5):730-736.
37. Buscaj E, Hall T, Montgomery L, et al. Practice facilitation for PCMH implementation in residency practices. Fam Med. 2016;48(10):795-800.
38. Friedberg MW, Coltin KL, Safran DG, Dresser M, Zaslavsky AM, Schneider EC. Associations between structural capabilities of primary care practices and performance on selected quality measures. Ann Intern Med. 2009;151(7):456-463. doi:10.7326/0003-4819-151-7-200910060-00006
39. DesRoches CM, Audet AM, Painter M, Donelan K. Meeting meaningful use criteria and managing patient populations: a national survey of practicing physicians. Ann Intern Med. 2013;158(11):791-799. doi:10.7326/0003-4819-158-11-201306040-00003
40. Miller RH, Sim I. Physicians' use of electronic medical records: barriers and solutions. Health Aff (Millwood). 2004;23(2):116-126. doi:10.1377/hlthaff.23.2.116
41. Ryan AM, Bishop TF, Shih S, Casalino LP. Small physician practices in New York needed sustained help to realize gains in quality from use of electronic health records. Health Aff (Millwood). 2013;32(1):53-62. doi:10.1377/hlthaff.2012.0742
42. Jones SS, Rudin RS, Perry T, Shekelle PG. Health information technology: an updated systematic review with a focus on meaningful use. Ann Intern Med. 2014;160(1):48-54. doi:10.7326/M13-1531
43. Goetz Goldberg D, Kuzel AJ, Feng LB, DeShazo JP, Love LE. EHRs in primary care practices: benefits, challenges, and successful strategies. Am J Manag Care. 2012;18(2):e48-e54.
44. Kanger C, Brown L, Mukherjee S, Xin H, Diana ML, Khurshid A. Evaluating the reliability of EHR-generated clinical outcomes reports: a case study. EGEMS (Wash DC). 2014;2(3):1102.
45. Berg M. Implementing information systems in health care organizations: myths and challenges. Int J Med Inform. 2001;64(2-3):143-156. doi:10.1016/S1386-5056(01)00200-3
46. Dickinson LM, Dickinson WP, Nutting PA, et al. Practice context affects efforts to improve diabetes care for primary care patients: a pragmatic cluster randomized trial. J Gen Intern Med. 2015;30(4):476-482. doi:10.1007/s11606-014-3131-3
47. Crabtree BF, Nutting PA, Miller WL, et al. Primary care practice transformation is hard work: insights from a 15-year developmental program of research. Med Care. 2011;49(suppl):S28-S35. doi:10.1097/MLR.0b013e3181cad65c
48. Hemler JR, Hall JD, Cholan RA, et al. Practice facilitator strategies for addressing electronic health record data challenges for quality improvement: EvidenceNOW. J Am Board Fam Med. 2018;31(3):398-409. doi:10.3122/jabfm.2018.03.170274
49. Heisey-Grove D, King JA. Physician and practice-level drivers and disparities around meaningful use progress. Health Serv Res. 2017;52(1):244-267. doi:10.1111/1475-6773.12481
50. Rittenhouse DR, Ramsay PP, Casalino LP, McClellan S, Kandel ZK, Shortell SM. Increased health information technology adoption and use among small primary care physician practices over time: a national cohort study. Ann Fam Med. 2017;15(1):56-62. doi:10.1370/afm.1992
51. Hsiao CJ, Hing E. Use and characteristics of electronic health record systems among office-based physician practices: United States, 2001-2013. NCHS Data Brief. 2014;(143):1-8.
52. Kruse CS, DeShazo J, Kim F, Fulton L. Factors associated with adoption of health information technology: a conceptual model based on a systematic review. JMIR Med Inform. 2014;2(1):e9. doi:10.2196/medinform.3106
    Original Investigation
    Health Informatics
    August 7, 2019

    Primary Care Practices’ Ability to Report Electronic Clinical Quality Measures in the EvidenceNOW Southwest Initiative to Improve Heart Health

    Author Affiliations
    • 1University of Colorado School of Medicine, Department of Family Medicine, Aurora
    • 2University of New Mexico School of Medicine, Department of Family and Community Medicine, Albuquerque
    JAMA Netw Open. 2019;2(8):e198569. doi:10.1001/jamanetworkopen.2019.8569
    Key Points

    Question  How quickly can primary care practices report electronic clinical quality measures based on evidence-based guidelines for cardiac care?

    Findings  In this quality improvement study of 211 primary care practices, the median time to report any baseline electronic clinical quality measure was 8.2 months. Time to report varied by measure type and practice characteristics.

    Meaning  This study suggests that clinical quality measure reporting still takes a great deal of time and effort, and as the health care system increasingly moves to value-based structures that require electronic clinical quality measures, some practices may be left behind without better incentives and support.

    Abstract

    Importance  The capability and capacity of primary care practices to report electronic clinical quality measures (eCQMs) are questionable.

    Objective  To determine how quickly primary care practices can report eCQMs and the practice characteristics associated with faster reporting.

    Design, Setting, and Participants  This quality improvement study examined an initiative (EvidenceNOW Southwest) to enhance primary care practices’ ability to adopt evidence-based cardiovascular care approaches: aspirin prescribing, blood pressure control, cholesterol management, and smoking cessation (ABCS). A total of 211 primary care practices in Colorado and New Mexico participating in EvidenceNOW Southwest between February 2015 and December 2017 were included.

    Interventions  Practices were instructed on eCQM specifications that could be produced by an electronic health record, a registry, or a third-party platform. Practices received 9 months of support from a practice facilitator, a clinical health information technology advisor, and the research team. Practices were instructed to report their baseline ABCS eCQMs as soon as possible.

    Main Outcomes and Measures  The main outcome was time to report the ABCS eCQMs. Cox proportional hazards models were used to examine practice characteristics associated with time to reporting.

    Results  Practices were predominantly clinician owned (48%) and in urban or suburban areas (71%). Practices required a median (interquartile range) of 8.2 (4.6-11.9) months to report any ABCS eCQM. Time to report differed by eCQM: practices reported blood pressure management the fastest (median [interquartile range], 7.8 [3.5-10.4] months) and cholesterol management the slowest (median [interquartile range], 10.5 [6.6 to >12] months) (log-rank P < .001). In multivariable models, the blood pressure eCQM was reported more quickly by practices that participated in accountable care organizations (hazard ratio [HR], 1.88; 95% CI, 1.40-2.53; P < .001) or participated in a quality demonstration program (HR, 1.58; 95% CI, 1.14-2.18; P = .006). The cholesterol eCQM was reported more quickly by practices that used clinical guidelines for cardiovascular disease management (HR, 1.35; 95% CI, 1.18-1.53; P < .001). Compared with Federally Qualified Health Centers, hospital-owned practices had greater ability to report blood pressure eCQMs (HR, 2.66; 95% CI, 1.73-4.09; P < .001), and clinician-owned practices had less ability to report cholesterol eCQMs (HR, 0.52; 95% CI, 0.35-0.76; P < .001).

    Conclusions and Relevance  In this study, time to report eCQMs varied by measure and practice type, with very few practices reporting quickly. Practices took longer to report a new cholesterol measure than other measures. Programs that require eCQM reporting should consider the time and effort practices must exert to produce reports. Practices may benefit from additional support to succeed in new programs that require eCQM reporting.

    Introduction

    The Health Information Technology for Economic and Clinical Health (HITECH) Act, passed in 2009 as a part of the American Recovery and Reinvestment Act, specified general guidelines for the development and implementation of a “nationwide health information technology infrastructure.”1 Through HITECH Act initiatives, the federal government has spent significant time and money to promote widespread adoption of electronic health records (EHRs) that were intended to improve the quality, safety, efficiency, coordination, and equity of health care in the United States.2,3 Among other purposes, EHRs were to offer a standardized platform to better demonstrate gains in these domains. A key feature of the infrastructure was promotion of clinical quality with reporting measures that would be collected and reported using certified EHR systems.

    The increasing prevalence of EHRs has prompted the electronic extraction of clinical quality measures (eCQMs) to become the standard for quality reporting programs. Reporting burden has grown over time, with increasing requirements to report eCQMs for a variety of quality initiatives4 and value-based reimbursement structures.5,6 This growing burden has major implications for resource allocation: estimates suggest that the time primary care team members spend on eCQM reporting equates to billions of dollars per year.7 Thousands of eCQMs have been developed by independent groups, with different ones used in various governmental or payer initiatives, leading to confusion and fatigue on the part of health care practices.8

    Numerous barriers influence primary care practice teams’ ability to efficiently and accurately report eCQMs, including questionable data accuracy and variation in validity across measures and physicians.9-12 Furthermore, the extent to which eCQMs correspond to quality care and improved outcomes has been questioned.13 Variable data documentation practices greatly affect data completeness and reliability.10,12 Barriers to eCQM reporting and meaningful use of data include the time and effort required to implement reporting processes, resistance to change, limited EHR reporting functionality, costs, inflexible reporting criteria, inconsistency between measures and clinical guidelines, and vendors who were unreceptive to requests for flexible EHR configuration.12,14 Small practices may be more likely to experience financial barriers related to EHR adoption and use.15 Variation in definition of measures, data sources, and data formats may limit the comparability and utility of quality measures across practices.13

    A number of efforts have aimed to reduce the burden of measure reporting on practices by increasing the adoption and meaningful use of health information technology, identifying and addressing gaps in primary care teams’ data skills, focusing on measures that matter,16 improving clarity of measure specifications, and aligning measures across settings and outcomes.17 Professional societies have supported the use of data analytics platforms like PRIME Registry.18 Other known facilitators of eCQM reporting,12,19 such as onsite training, local technical support, and opportunities for harmonization and shared learning, have been advanced by federal and state programs.20-22

    The EvidenceNOW Southwest (ENSW) project offered an opportunity to see whether primary care practices have developed capacity to produce eCQMs. The ENSW project is a collaborative effort between Colorado and New Mexico covering the diverse geographic and cultural regions of both states. It is 1 of 7 regional cooperatives funded by the Agency for Healthcare Research and Quality’s (AHRQ) EvidenceNOW research study that started in 2015 to help small- and medium-sized primary care practices use the latest evidence to improve cardiovascular health. The ENSW project built upon the efforts described to reduce eCQM reporting burden by choosing a minimum number of measures known to prolong life and improve health, matching reporting specifications as closely as possible to these clinical standards, accepting a variety of data sources (eg, EHRs, patient-level extracts, third-party registries), and offering robust technical assistance through onsite clinical health information technology advisors, access to regional experts, and linkages to AHRQ’s national technical assistance contractor.

    In this study, we sought to take advantage of the opportunity presented by the ENSW project’s use of 4 common and standardized eCQMs to examine how quickly primary care practices could report on these eCQMs. Our hypothesis was that, nearly 10 years following the HITECH Act, many primary care practices still do not possess the skills and tools to easily meet basic eCQM reporting requirements and that practices with certain characteristics experience greater delays than others when reporting eCQMs.

    Methods

    Practice recruitment and selection for participation in ENSW has been described elsewhere.23 The ENSW project and the study described in this article were approved by the Colorado Multiple Institutional Review Board and the University of New Mexico Human Research Protections Office. The ENSW project is registered on ClinicalTrials.gov (NCT02515578). Participants completing surveys were provided written information about the study. The need to document consent was waived by the human subjects review boards because they determined that the research presented no more than minimal risk of harm to participants and involved no procedures for which written consent was required outside of the research context. All participants were provided with an informed consent document in the form of an information sheet explaining the research aims, patient rights, and potential risks. This report follows the Standards for Quality Improvement Reporting Excellence (SQUIRE) reporting guideline.24

    Measure Selection and Practice Support

    The AHRQ selected the measures of aspirin use,25 blood pressure control,26 cholesterol management,27 and smoking cessation28 (ABCS) to advance heart health in alignment with the Million Hearts Campaign,29 the National Quality Forum, and the Centers for Medicare & Medicaid Services. The AHRQ selected standard eCQM specifications for use by all practices participating in the 7 cooperatives.30 These specifications included a 12-month measurement period for each quarterly report.

    Recognizing potential challenges to eCQM reporting, in addition to receiving 9 months of ongoing practice transformation support from a trained practice facilitator, ENSW provided practices with support from a clinical health information technology advisor and resources and support from the research team, which had experience collecting eCQMs. This support team assisted practices with developing and managing workflows for data collection, reporting, and analysis; helped with the entry of eCQMs into the reporting website; and linked practices with other technical assistance resources as needed and available. Practices were instructed to report their baseline ABCS eCQMs as soon as possible once their practice transformation support began.

    eCQM Reporting Mechanisms

    The ENSW project offered practices several options to report eCQM data to a centralized repository. The first option allowed practices to calculate eCQM numerators and denominators using an internal EHR or registry. These data were manually entered through an online portal. The other option allowed practices to securely transfer patient-level information to the DARTNet Institute31 through structured flat files or direct EHR data extraction. The DARTNet Institute then normalized the clinical data, calculated the eCQMs, and reported results on the practice’s behalf.
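For practices using the first option, producing a report reduces to computing a numerator and denominator over patient-level records. The sketch below is purely illustrative of that idea: the record fields, eligibility ages, and the 140/90 mm Hg threshold are our assumptions for a controlling-high-blood-pressure-style measure, not the official CMS165 specification.

```python
# Hypothetical sketch of an eCQM numerator/denominator calculation over
# patient-level records; field names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class PatientRecord:
    age: int
    has_hypertension_dx: bool  # places the patient in the denominator population
    last_systolic: int         # most recent reading in the measurement period
    last_diastolic: int

def blood_pressure_ecqm(records):
    """Return (numerator, denominator): denominator = patients aged 18-85
    with a hypertension diagnosis; numerator = those whose most recent
    reading was below 140/90 mm Hg."""
    denom = [r for r in records
             if 18 <= r.age <= 85 and r.has_hypertension_dx]
    numer = [r for r in denom
             if r.last_systolic < 140 and r.last_diastolic < 90]
    return len(numer), len(denom)

patients = [
    PatientRecord(54, True, 132, 84),   # eligible, controlled
    PatientRecord(67, True, 150, 92),   # eligible, not controlled
    PatientRecord(45, False, 118, 76),  # no hypertension dx: excluded
]
num, den = blood_pressure_ecqm(patients)
print(num, den)  # 1 2
```

The resulting pair is what a practice would type into the online portal; in the second pathway, DARTNet performs the equivalent computation on the practice's behalf from the transferred patient-level data.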

    Practice Characteristics and Context

    Practice characteristics were obtained from the baseline ENSW Practice Survey. At least 1 staff member, typically a practice administrator or lead clinician, completed the Practice Survey for each participating practice. The baseline Practice Survey gathered descriptive information on participating practices, including a series of questions surrounding use of strategies for improving patient care, such as quality improvement processes and patient self-management support.

    Practice ownership was consolidated for these analyses to 3 categories: (1) clinician (including solo or group practices); (2) hospitals and academic centers (including academic health centers, faculty practices, hospital or health system practices, or health maintenance organizations); and (3) Federally Qualified Health Centers (FQHCs) (including FQHCs, FQHC lookalike clinics, and Rural Health Clinics). Practice size was defined as the number of clinicians working at that site. Practice zip code aligned to Rural-Urban Commuting Area codes was used to determine geographic area. We assigned practices with zip codes corresponding with Rural-Urban Commuting Area codes 1 to 4 as urban or suburban and 5 to 10 as rural.32 Cardiovascular disease (CVD) registries included a count of the following registries in use at the practice: ischemic vascular disease, hypertension, high cholesterol, diabetes, prevention services, and registries for high-risk patients. Total number of registries was translated into an ordered categorical variable (0, 1-2, 3-4, and 5-6). A score for adoption of CVD guidelines for prevention and management was created by counting the following activities reported by a practice: guidelines posted or distributed, clinicians' agreed-on guidelines, standing orders created, or EHR prompts for each type of guideline. Accountable care organization (ACO) member options included Medicaid, Medicare, private or commercial, and other.
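The geographic and registry-count codings described above can be sketched as small classification functions. This is our own illustration (function names are invented); the RUCA split follows the standard USDA convention of codes 1 to 4 covering metropolitan and suburban commuting areas and 5 to 10 covering rural ones.

```python
# Sketch of the practice-characteristic coding; function names are ours.
def geographic_area(ruca_code: int) -> str:
    """Classify a practice's RUCA code (USDA convention: 1-4 metro/suburban,
    5-10 rural)."""
    if not 1 <= ruca_code <= 10:
        raise ValueError("RUCA codes run from 1 to 10")
    return "urban/suburban" if ruca_code <= 4 else "rural"

def registry_category(n_registries: int) -> str:
    """Collapse the 0-6 count of CVD registries into the ordered
    categories used in the analysis (0, 1-2, 3-4, 5-6)."""
    if n_registries == 0:
        return "0"
    if n_registries <= 2:
        return "1-2"
    if n_registries <= 4:
        return "3-4"
    return "5-6"

print(geographic_area(1), registry_category(3))  # urban/suburban 3-4
```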

    The survey also asked about major practice changes, including using a new or different EHR, moving to a new location, losing 1 or more clinicians, losing the office manager or head nurse, being purchased by or affiliating with a larger organization, or implementing a new billing system. Practices were asked if they previously participated in payment or quality demonstration programs including a State Innovation Model initiative, Comprehensive Primary Care Initiative, Transforming Clinical Practice Initiative, Community Health Worker training program, Blue Cross/Blue Shield Patient-Centered Medical Home program, Million Hearts State Learning Collaborative, Million Hearts Cardiovascular Disease Risk Reduction Model, or other program. Previous quality reporting support options included receiving help from any health information exchange, practice-based research network, clinical data warehouse, external consulting group, health system practice network, hospital network, primary care association, or regional extension center. Practice incentive or bonus payment options included the Medicare primary care incentive payment or the Medicare care coordination payment.

    Time to Report

    The primary outcome measure for our analyses was time to reporting. We calculated time to report as a measurement of time in days (converted to months to aid interpretability) from the date of the practice’s kickoff meeting with ENSW transformation support staff to submission of baseline eCQM data for each of the ABCS measures. Using the ENSW kickoff date ensured a discretely recorded, objective time zero uniformly available for all practices in the study.
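In code, the outcome is a simple date difference. The sketch below assumes an average month length of 30.44 days for the conversion; the paper does not state the exact divisor it used.

```python
# Minimal sketch: days from the ENSW kickoff meeting to first submission
# of a measure, converted to months (30.44 days/month is our assumption).
from datetime import date

def months_to_report(kickoff: date, submitted: date) -> float:
    days = (submitted - kickoff).days
    return round(days / 30.44, 1)

print(months_to_report(date(2015, 6, 1), date(2016, 2, 10)))  # 8.3
```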

    Statistical Analysis

    Descriptive statistics were generated for practice characteristics (eg, frequencies, proportions, mean, standard deviation). The outcome variables for all analyses were time to reporting for each eCQM and time to reporting for the first eCQM reported. Practices that had not reported by the end of the assessment period for this analysis (November 1, 2017) were censored as of that date. Practices had a minimum of 7.7 months from the time the practice first received transformation support to the end of the assessment period. The log-rank test was used to generate product-limit curves and compare survival distributions across the measures. For blood pressure and cholesterol, Cox proportional hazards regression models were used to examine practice characteristics that were associated with time to reporting in univariable and multivariable models. Practices that dropped out immediately after the kickoff meeting were excluded from analysis (n = 6); practices that dropped out more than 1 month after kickoff and had not reported measures (n = 3) were censored at the time of dropout. Backward elimination was used to arrive at the final multivariable models, initially including all variables that were significant at P < .10 and eliminating variables 1 at a time until all remaining variables were significant at P < .05.33 The threshold for statistical significance of results was P < .05 using 2-sided tests. Because of the variable length of assessment periods for practices, sensitivity analyses were performed limiting the observation period to a maximum of 12 months to determine whether there was bias associated with longer observation time for some practices enrolled earlier. All analyses were performed using SAS statistical software version 9.4 (SAS Institute Inc).
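The product-limit (Kaplan-Meier) machinery and right-censoring described above can be illustrated with a from-scratch sketch. Here an "event" is a practice submitting its eCQM, practices that never reported are censored at the cutoff, and the survival curve S(t) is the proportion of practices that have not yet reported by time t (the Figure in the paper plots 1 - S(t)). The data are invented for illustration.

```python
# From-scratch product-limit (Kaplan-Meier) estimator with right censoring.
def kaplan_meier(times, reported):
    """times: months to report or to censoring, per practice.
    reported: True if the practice reported (event), False if censored.
    Returns [(t, S(t))] at each event time."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        events = sum(1 for ti, ev in zip(times, reported)
                     if ti == t and ev)
        if events:
            surv *= (at_risk - events) / at_risk   # product-limit step
            curve.append((t, round(surv, 3)))
        # everyone with time t (events and censored) leaves the risk set
        at_risk -= sum(1 for ti in times if ti == t)
    return curve

times    = [3.5, 5.0, 5.0, 8.2, 12.0, 12.0]            # invented data
reported = [True, True, False, True, False, False]      # False = censored
print(kaplan_meier(times, reported))
```

The univariable and multivariable Cox models in the paper extend this idea by relating the hazard of reporting to practice characteristics; in practice these were fit in SAS (PROC PHREG-style modeling) rather than by hand.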

    Results

    Data represent 211 enrolled practices that provided survey and eCQM data between January 1, 2015, and November 1, 2017. Table 1 details the characteristics of participating practices. Most practices (75%) were in Colorado. Practices were predominantly clinician owned (48%), located in urban or suburban areas (71%), and used at least 1 patient registry (68%) at baseline. The mean (SD) practice size was 3.5 (2.6) clinicians. Approximately 47% of practices reported participating in some type of ACO. A substantial majority (85%) calculated eCQMs using their EHR or internal registry.

    Time to Report

    The median (interquartile range [IQR]) time to report any measure was 8.2 (4.6-11.9) months. The median (IQR) time to report varied across measures from a minimum of 7.8 (3.5-10.4) months for the blood pressure measure to a maximum of 10.5 (6.6 to >12) months for the cholesterol measure (Table 2). Few practices reported measures within 6 months, ranging from a minimum of 22.8% of practices for the cholesterol measure to a maximum of 34.6% of practices for the blood pressure measure.

    The Figure plots the proportion of practices reporting each eCQM over time; sensitivity analyses limited the maximum observation time frame to 12 months. Practices demonstrated a lower probability of reporting the cholesterol measure within the 12-month observation period (log-rank test for equality over strata: χ²₃ = 41.42; P < .001).

    Practices that used the DARTNet Institute reported the cholesterol measure faster (median [IQR] time to report, 7.0 [4.5-9.7] months) than practices using their own EHR (median [IQR] time to report, 8.9 [5.7-14.8] months) (log-rank P = .004). The times to report the 3 other measures were not significantly different.

    Practices in the study used more than 13 different EHRs encompassing 48 different EHR versions, with the most common EHR, NextGen, used by 28% of practices. Because of the relatively small number of practices using any particular EHR, we did not assess these in Cox regression models. We provide median time to report for EHRs used by more than 5 practices in the eTable in the Supplement.

    Practice Characteristics Associated With Ability to Report Certain eCQMs

    Hazard ratios (HRs) from the univariable Cox proportional hazards models are shown in Table 3 for the blood pressure and cholesterol measures. Results for aspirin and smoking cessation measures are not presented because the patterns were similar to blood pressure results.

    Practice characteristics associated with greater ability to report eCQMs varied between the blood pressure and cholesterol measures. Earlier ability to report the blood pressure measure was associated with ownership by clinicians (HR, 1.42; 95% CI, 1.04-1.93) or hospitals (HR, 2.41; 95% CI, 1.58-3.66) vs FQHC, larger size (HR, 1.06; 95% CI, 1.01-1.12), ACO participation (HR, 1.94; 95% CI, 1.44-2.61), greater use of clinical guidelines for CVD management (HR, 1.12; 95% CI, 1.01-1.26), previous quality reporting support (HR, 1.35; 95% CI, 1.001-1.82), and participation in a payment or quality demonstration program (HR, 1.46; 95% CI, 1.07-2.00). Patient-Centered Medical Home recognition was associated with less ability to report the blood pressure eCQM (HR, 0.72; 95% CI, 0.54-0.96). For the cholesterol measure, practice characteristics associated with greater ability to report included being an FQHC vs clinician owned (HR for clinician owned, 0.41; 95% CI, 0.29-0.60), greater use of patient registries (HR, 1.23; 95% CI, 1.08-1.39), greater use of clinical guidelines for CVD prevention and management (HR for prevention, 1.40; 95% CI, 1.24-1.58 and HR for management, 1.41; 95% CI, 1.25-1.59), and participation in a payment or quality demonstration program (HR, 1.46; 95% CI, 1.07-2.00). Receiving incentive payments for Medicare primary care was associated with less ability to report cholesterol eCQMs (HR, 0.75; 95% CI, 0.33-0.76).

    Multivariable models indicated that ACO participation (HR, 1.88; 95% CI, 1.40-2.53; P < .001), hospital ownership vs FQHC (HR, 2.66; 95% CI, 1.73-4.09; P < .001), and participation in a payment or quality demonstration (HR, 1.58; 95% CI, 1.14-2.18; P = .006) were associated with greater ability to report blood pressure management (Table 4). For cholesterol measure reporting, FQHC (vs clinician-owned) practices (HR for clinician ownership, 0.52; 95% CI, 0.35-0.76; P < .001) and greater use of clinical guidelines for CVD management (HR, 1.35; 95% CI, 1.18-1.53; P < .001) were associated with greater ability to report. Results were very similar in sensitivity analyses limiting the maximum time to 12 months (but with slightly less power).
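The backward-elimination procedure described in the statistical analysis (start from all candidate variables, refit, and drop the least significant variable until every remaining variable reaches P < .05) can be sketched generically. This is an illustrative Python sketch, not the study's SAS code; the variable names and fixed toy P values are invented, and a real implementation would refit the Cox model at each step, so the P values would change as variables are dropped.

```python
def backward_eliminate(variables, pvalue_fn, keep=0.05):
    """Drop the least significant variable, one at a time, until
    every remaining variable has P < keep (the paper's P < .05)."""
    vars_ = list(variables)
    while vars_:
        pvals = pvalue_fn(vars_)  # in practice: refit the Cox model here
        worst = max(vars_, key=lambda v: pvals[v])
        if pvals[worst] < keep:
            break  # all remaining variables are significant
        vars_.remove(worst)
    return vars_

# Toy P values (invented; a real refit would recompute them each round).
toy = {"aco": 0.001, "ownership": 0.0008, "size": 0.21, "pcmh": 0.06}
print(backward_eliminate(list(toy), lambda vs: {v: toy[v] for v in vs}))
```

Here "size" (P = .21) is removed first, then "pcmh" (P = .06), leaving only variables below the .05 threshold.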

    Discussion

    Our study sought to examine the current capacity of primary care practices to report 4 evidence-based eCQMs. Despite nearly all participating practices using Meaningful Use–certified EHRs and the provision of dedicated health information technology support by the ENSW project, primary care practices still required a substantial amount of time and support to report even well-established eCQMs. The ability to report ABCS eCQMs varied by measure type and practice characteristics.

    Our results highlight how introducing new measures increases the reporting burden on practices. All 4 measures reflected current clinical care guidelines, but the aspirin, blood pressure management, and tobacco cessation measures were more established. Their specifications had been relatively stable, and they have been used for Centers for Medicare & Medicaid Services and other quality and research programs for many years. On the other hand, the cholesterol measure was chosen to reflect a very recent update to clinical guidelines. At the start of the ENSW project, there were no nationally recognized eCQM specifications for calculating the measure. Meaningful Use certification standards did not require the measure, and no payment program used the metric. Compared with blood pressure management, the new cholesterol measure took nearly 3 months longer for the typical practice to report (7.8 months vs 10.5 months, respectively).

    Our results also show that certain types of practices are more capable of prompt eCQM reporting. Practices that participate in ACOs and systematically use clinical guidelines seem better prepared to report established measures like the blood pressure measure and new measures like the cholesterol measure. Hospital-owned practices more quickly reported the established measures, and FQHCs more quickly reported the new cholesterol measure. While these associations do not imply causation, it would stand to reason that some combination of a practice’s skill, previous activity, payment structures, and internal and external resources are leading to the variation in the time to reporting we observed.

    Initial implementation of an EHR system has high costs in terms of time, training, finances, and lost productivity.34-36 Our findings indicate that these barriers do not end once EHR implementation is complete. For the practices we studied, substantial delays often continued well after the initial attempt to report measures. We found that it can take several months for a practice to produce any 1 of 4 standard measures. Implementing a new measure not previously adopted by federal programs like Meaningful Use takes practices even more time. We found a considerable time burden that health care teams face in reporting clinical quality measures, which builds on the previously reported estimate that physicians and their staff spend an average of 15 hours per week developing, collecting, and reporting external quality measures.7

    Our findings agree with others’ conclusions that programs should try to align the amount and forms of health information technology support to best match practices’ needs.37 This article complements the work of Cohen et al,14 which looked at 1492 practices across the national EvidenceNOW project. Those practices reported on their ability to produce eCQMs at the outset of the project and the potential barriers to using EHR data for quality purposes. Our article adds to their findings by detailing the actual time to reporting for more than 200 participating practices. Our results are consistent with previous research demonstrating modest but inconsistent associations between select structural elements of primary care practices and performance on various quality measures.38 Practice size was positively associated with ability to report the blood pressure eCQM, which aligns with evidence that smaller practices may experience greater barriers and delays in EHR use than larger practices,39,40 perhaps suggesting that this disparity extends to the ability to report certain measures. Our findings also complement evidence that small practices need sustained and extensive EHR support to achieve improvement in quality measures.41 Beyond these studies, existing literature contains little information regarding the influence of contextual factors on the use of health information technology.42

    Barriers to meaningfully implementing EHRs and using EHR data are manifold: costs, lack of knowledge of EHR functions, problems transforming office operations, lack of standardization, vendor system upgrades, lack of dedicated data coordinators, staff and clinician resistance, and fatigue.12,43,44 Reliably reporting individual measures may be further influenced by the interplay—and unpredictability—of multiple factors: that is, the “complexity of the sociotechnical networks at stake.”45 Primary care practices are complex adaptive systems,46,47 and while our findings help identify specific practice characteristics that may be associated with quality measure reporting and performance, how these characteristics interact in any given practice is affected by the local landscape and factors beyond our ability to measure in this study. Practices using EHR data to inform quality improvement need ongoing and tailored support that can assist with addressing these complex factors.48

    Limitations

    This study has several important limitations. Health information technology adoption can vary across regions and practice types,49-52 so generalizability beyond these small- to medium-sized primary care practices in the Southwest United States may be limited. Numerous unmeasured factors may have influenced time to report, including degree of leadership engagement in ENSW, the costs to create reports in the different eCQM production tools, competing demands, and actual time spent trying to produce eCQMs. Measures produced with internal EHRs or registries were not independently verified for accuracy beyond basic validation checks (eg, numerator must be less than or equal to denominator), and many practices further refined data collection workflows and eCQM calculation processes after reporting baseline eCQMs. Reporting valid, trusted, and actionable eCQMs takes even more time and effort. The ENSW project provided practices with a significant amount of technical support to facilitate eCQM reporting, including individualized help from a clinical health information technology advisor, peer learning networks, online measurement guides, and access to technical assistance. Programs that provide less support would likely encounter greater delays in eCQM reporting.

    Conclusions

    Nearly a decade has passed since the HITECH Act was enacted, and our project that focused on small- to medium-sized practices highlights a success and a failure of that policy. Nearly all of the practices used Meaningful Use–certified EHRs. That is a major success. However, the inability to use those EHRs to quickly track and report on quality is a major failure. The ability to readily access and report trustworthy eCQM data has become an essential competency of primary care practice teams. Beyond the external reporting requirements, practices’ ability to use quality data to monitor and improve their performance is essential. Despite years of on-the-ground and systems-level work, our experience shows that eCQM reporting still takes a great deal of time and effort. As the health care system increasingly moves to value-based structures that require eCQMs, some practices may be left behind without better incentives and support. Health care leaders, policy makers, EHR vendors, and technical assistance providers should continue their efforts to reduce the burden of eCQM reporting and improve data capacity in primary care practices.

    Article Information

    Accepted for Publication: June 17, 2019.

    Published: August 7, 2019. doi:10.1001/jamanetworkopen.2019.8569

    Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2019 Knierim KE et al. JAMA Network Open.

    Corresponding Author: Kyle E. Knierim, MD, Department of Family Medicine, University of Colorado Anschutz Medical Campus, 12631 E 17th Ave, Aurora, CO 80045 (kyle.knierim@ucdenver.edu).

    Author Contributions: Drs Knierim and L. M. Dickinson had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

    Concept and design: Knierim, L. M. Dickinson, Nease, de la Cerda, W. P. Dickinson.

    Acquisition, analysis, or interpretation of data: Knierim, Hall, L. Miriam Dickinson, Nease, Fernald, Bleecker, Rhyne, W. P. Dickinson.

    Drafting of the manuscript: Knierim, Hall, L. M. Dickinson, Nease, de la Cerda, Fernald, W. P. Dickinson.

    Critical revision of the manuscript for important intellectual content: Knierim, L. M. Dickinson, Nease, Bleecker, Rhyne, W. P. Dickinson.

    Statistical analysis: L. M. Dickinson.

    Obtained funding: W. P. Dickinson.

    Administrative, technical, or material support: Knierim, Hall, de la Cerda, Fernald, Bleecker, Rhyne.

    Supervision: Knierim, Nease, Rhyne, W. P. Dickinson.

    Conflict of Interest Disclosures: Dr L. M. Dickinson reported grants from the Agency for Healthcare Research and Quality (AHRQ) during the conduct of the study; and grants from the National Institutes of Health, AHRQ, and the Patient-Centered Outcomes Research Institute outside the submitted work. Dr Nease reported grants from AHRQ during the conduct of the study. Mr Fernald reported grants from AHRQ during the conduct of the study. Dr Rhyne reported grants from AHRQ during the conduct of the study. Dr W. P. Dickinson reported grants from AHRQ during the conduct of the study. No other disclosures were reported.

    Funding/Support: Funding for this work was provided by AHRQ grant R18 HS023904.

    Role of the Funder/Sponsor: The funder had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

    Disclaimer: This work represents the opinions of the authors and should not be interpreted as official positions of the AHRQ or the US Department of Health and Human Services.

    Additional Contributions: We thank Stephanie Kirchner, MSPH, RD (University of Colorado), Andrew Bienstock, MHA (University of Colorado), Daniel Pacheco, MBA (University of Colorado), and Wilson Pace, MD (DARTNet Institute), for their contributions to project management, review of study design, and support for practice entry of quality measures. Elizabeth W. Staton, MSTC (University of Colorado), assisted with copy editing. These individuals were compensated by study funding from AHRQ.

    References
    1.
    Jha AK. Meaningful use of electronic health records: the road ahead. JAMA. 2010;304(15):1709-1710. doi:10.1001/jama.2010.1497
    2.
    Blumenthal D, Tavenner M. The “meaningful use” regulation for electronic health records. N Engl J Med. 2010;363(6):501-504. doi:10.1056/NEJMp1006114
    3.
    Centers for Disease Control and Prevention. Public health and promoting interoperability programs (formerly known as electronic health records meaningful use). https://www.cdc.gov/ehrmeaningfuluse/introduction.html. Accessed January 21, 2019.
    4.
    Panzer RJ, Gitomer RS, Greene WH, Webster PR, Landry KR, Riccobono CA. Increasing demands for quality measurement. JAMA. 2013;310(18):1971-1980. doi:10.1001/jama.2013.282047
    5.
    Petersen LA, Woodard LD, Urech T, Daw C, Sookanan S. Does pay-for-performance improve the quality of health care? Ann Intern Med. 2006;145(4):265-272. doi:10.7326/0003-4819-145-4-200608150-00006
    6.
    Epstein AM, Lee TH, Hamel MB. Paying physicians for high-quality care. N Engl J Med. 2004;350(4):406-410. doi:10.1056/NEJMsb035374
    7.
    Casalino LP, Gans D, Weber R, et al. US physician practices spend more than $15.4 billion annually to report quality measures. Health Aff (Millwood). 2016;35(3):401-406. doi:10.1377/hlthaff.2015.1258
    8.
    Blumenthal D, McGinnis JM. Measuring Vital Signs: an IOM report on core metrics for health and health care progress. JAMA. 2015;313(19):1901-1902. doi:10.1001/jama.2015.4862
    9.
    Kern LM, Malhotra S, Barrón Y, et al. Accuracy of electronically reported “meaningful use” clinical quality measures: a cross-sectional study. Ann Intern Med. 2013;158(2):77-83. doi:10.7326/0003-4819-158-2-201301150-00001
    10.
    Chan KS, Fowles JB, Weiner JP. Review: electronic health records and the reliability and validity of quality measures: a review of the literature. Med Care Res Rev. 2010;67(5):503-527. doi:10.1177/1077558709359007
    11.
    Heisey-Grove DM, Wall HK, Wright JS. Electronic clinical quality measure reporting challenges: findings from the Medicare EHR Incentive Program’s Controlling High Blood Pressure measure. J Am Med Inform Assoc. 2018;25(2):127-134. doi:10.1093/jamia/ocx049
    12.
    Fernald DH, Wearner R, Dickinson WP. The journey of primary care practices to meaningful use: a Colorado Beacon Consortium study. J Am Board Fam Med. 2013;26(5):603-611. doi:10.3122/jabfm.2013.05.120344
    13.
    Roth CP, Lim Y-W, Pevnick JM, Asch SM, McGlynn EA. The challenge of measuring quality of care from the electronic health record. Am J Med Qual. 2009;24(5):385-394. doi:10.1177/1062860609336627
    14.
    Cohen DJ, Dorr DA, Knierim K, et al. Primary care practices’ abilities and challenges in using electronic health record data for quality improvement. Health Aff (Millwood). 2018;37(4):635-643. doi:10.1377/hlthaff.2017.1254
    15.
    Rao SR, Desroches CM, Donelan K, Campbell EG, Miralles PD, Jha AK. Electronic health records in small physician practices: availability, use, and perceived benefits. J Am Med Inform Assoc. 2011;18(3):271-275. doi:10.1136/amiajnl-2010-000010
    16.
    Meyer GS, Nelson EC, Pryor DB, et al. More quality measures versus measuring what matters: a call for balance and parsimony. BMJ Qual Saf. 2012;21(11):964-968. doi:10.1136/bmjqs-2012-001081
    17.
    Conway PH, Mostashari F, Clancy C. The future of quality measurement for improvement and accountability. JAMA. 2013;309(21):2215-2216. doi:10.1001/jama.2013.4929
    18.
    Phillips R. The PRIME Registry helps thousands of primary care clinicians liberate EHR data and prepare for MIPS. J Am Board Fam Med. 2017;30(4):559. doi:10.3122/jabfm.2017.04.170193
    19.
    Jortberg BT, Fernald DH, Dickinson LM, et al. Curriculum redesign for teaching the PCMH in Colorado Family Medicine Residency programs. Fam Med. 2014;46(1):11-18.
    20.
    Sessums LL, McHugh SJ, Rajkumar R. Medicare’s vision for advanced primary care: new directions for care delivery and payment. JAMA. 2016;315(24):2665-2666. doi:10.1001/jama.2016.4472
    21.
    Centers for Medicare & Medicaid Services. Transforming Clinical Practice Initiative. https://innovation.cms.gov/initiatives/Transforming-Clinical-Practices. Accessed January 21, 2019.
    22.
    L&M Policy Research. Innovation Center State-Based Initiatives: A Systematic Review of Lessons Learned. Baltimore, MD: Centers for Medicare & Medicaid Services; 2018.
    23.
    English AF, Dickinson LM, Zittleman L, et al. A community engagement method to design patient engagement materials for cardiovascular health. Ann Fam Med. 2018;16(suppl 1):S58-S64. doi:10.1370/afm.2173
    24.
    Ogrinc G, Davies L, Goodman D, Batalden P, Davidoff F, Stevens D. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): revised publication guidelines from a detailed consensus process. BMJ Qual Saf. 2016;25(12):986-992. doi:10.1136/bmjqs-2015-004411
    25.
    eCQI Resource Center. Ischemic vascular disease (IVD): use of aspirin or another antiplatelet. https://ecqi.healthit.gov/ecqm/measures/cms164v5. Updated May 16, 2019. Accessed August 16, 2018.
    26.
    eCQI Resource Center. Controlling high blood pressure. https://ecqi.healthit.gov/ecqm/measures/cms165v6. Accessed June 26, 2019.
    27.
    eCQI Resource Center. Statin therapy for the prevention and treatment of cardiovascular disease. https://ecqi.healthit.gov/ep/ecqms-2018-performance-period/statin-therapy-prevention-and-treatment-cardiovascular-disease. Updated May 16, 2019. Accessed August 16, 2018.
    28.
    eCQI Resource Center. Preventive care and screening: tobacco use: screening and cessation intervention. https://ecqi.healthit.gov/ecqm/measures/cms138v6. Accessed August 16, 2018.
    29.
    Wright JS, Wall HK, Briss PA, Schooley M. Million hearts—where population health and clinical practice intersect. Circ Cardiovasc Qual Outcomes. 2012;5(4):589-591. doi:10.1161/CIRCOUTCOMES.112.966978
    30.
    Cohen DJ, Balasubramanian BA, Gordon L, et al. A national evaluation of a dissemination and implementation initiative to enhance primary care practice capacity and improve cardiovascular disease care: the ESCALATES study protocol. Implement Sci. 2016;11(1):86. doi:10.1186/s13012-016-0449-8
    31.
    DARTNet Institute. DARTNet website. http://www.dartnet.info/. Accessed August 16, 2018.
    32.
    United States Department of Agriculture. Rural-Urban Commuting Area Codes. https://www.ers.usda.gov/data-products/rural-urban-commuting-area-codes/. Accessed August 16, 2018.
    33.
    Hosmer D, Lemeshow S, Sturdivant R. Model-building strategies and methods for logistic regression. In: Shewhart WA, Wilks SS, eds. Applied Logistic Regression. Hoboken, NJ: John Wiley & Sons; 2000. doi:10.1002/0471722146
    34.
    Fleming NS, Culler SD, McCorkle R, Becker ER, Ballard DJ. The financial and nonfinancial costs of implementing electronic health records in primary care practices. Health Aff (Millwood). 2011;30(3):481-489. doi:10.1377/hlthaff.2010.0768
    35.
    Menachemi N, Collum TH. Benefits and drawbacks of electronic health record systems. Risk Manag Healthc Policy. 2011;4:47-55. doi:10.2147/RMHP.S12985
    36.
    Terry AL, Thorpe CF, Giles G, et al. Implementing electronic health records: key factors in primary care. Can Fam Physician. 2008;54(5):730-736.
    37.
    Buscaj E, Hall T, Montgomery L, et al. Practice facilitation for PCMH implementation in residency practices. Fam Med. 2016;48(10):795-800.
    38.
    Friedberg MW, Coltin KL, Safran DG, Dresser M, Zaslavsky AM, Schneider EC. Associations between structural capabilities of primary care practices and performance on selected quality measures. Ann Intern Med. 2009;151(7):456-463. doi:10.7326/0003-4819-151-7-200910060-00006
    39.
    DesRoches CM, Audet AM, Painter M, Donelan K. Meeting meaningful use criteria and managing patient populations: a national survey of practicing physicians. Ann Intern Med. 2013;158(11):791-799. doi:10.7326/0003-4819-158-11-201306040-00003
    40.
    Miller RH, Sim I. Physicians’ use of electronic medical records: barriers and solutions. Health Aff (Millwood). 2004;23(2):116-126. doi:10.1377/hlthaff.23.2.116
    41.
    Ryan AM, Bishop TF, Shih S, Casalino LP. Small physician practices in New York needed sustained help to realize gains in quality from use of electronic health records. Health Aff (Millwood). 2013;32(1):53-62. doi:10.1377/hlthaff.2012.0742
    42.
    Jones SS, Rudin RS, Perry T, Shekelle PG. Health information technology: an updated systematic review with a focus on meaningful use. Ann Intern Med. 2014;160(1):48-54. doi:10.7326/M13-1531
    43.
    Goetz Goldberg D, Kuzel AJ, Feng LB, DeShazo JP, Love LE. EHRs in primary care practices: benefits, challenges, and successful strategies. Am J Manag Care. 2012;18(2):e48-e54.
    44.
    Kanger C, Brown L, Mukherjee S, Xin H, Diana ML, Khurshid A. Evaluating the reliability of EHR-generated clinical outcomes reports: a case study. EGEMS (Wash DC). 2014;2(3):1102.
    45.
    Berg M. Implementing information systems in health care organizations: myths and challenges. Int J Med Inform. 2001;64(2-3):143-156. doi:10.1016/S1386-5056(01)00200-3
    46.
    Dickinson LM, Dickinson WP, Nutting PA, et al. Practice context affects efforts to improve diabetes care for primary care patients: a pragmatic cluster randomized trial. J Gen Intern Med. 2015;30(4):476-482. doi:10.1007/s11606-014-3131-3
    47.
    Crabtree BF, Nutting PA, Miller WL, et al. Primary care practice transformation is hard work: insights from a 15-year developmental program of research. Med Care. 2011;49(suppl):S28-S35. doi:10.1097/MLR.0b013e3181cad65c
    48.
    Hemler JR, Hall JD, Cholan RA, et al. Practice facilitator strategies for addressing electronic health record data challenges for quality improvement: EvidenceNOW. J Am Board Fam Med. 2018;31(3):398-409. doi:10.3122/jabfm.2018.03.170274
    49.
    Heisey-Grove D, King JA. Physician and practice-level drivers and disparities around meaningful use progress. Health Serv Res. 2017;52(1):244-267. doi:10.1111/1475-6773.12481
    50.
    Rittenhouse DR, Ramsay PP, Casalino LP, McClellan S, Kandel ZK, Shortell SM. Increased health information technology adoption and use among small primary care physician practices over time: a national cohort study. Ann Fam Med. 2017;15(1):56-62. doi:10.1370/afm.1992
    51.
    Hsiao CJ, Hing E. Use and characteristics of electronic health record systems among office-based physician practices: United States, 2001-2013. NCHS Data Brief. 2014;(143):1-8.
    52.
    Kruse CS, DeShazo J, Kim F, Fulton L. Factors associated with adoption of health information technology: a conceptual model based on a systematic review. JMIR Med Inform. 2014;2(1):e9. doi:10.2196/medinform.3106