Primary Care Practices’ Ability to Report Electronic Clinical Quality Measures in the EvidenceNOW Southwest Initiative to Improve Heart Health

Key Points

Question: How quickly can primary care practices report electronic clinical quality measures based on evidence-based guidelines for cardiac care?

Findings: In this quality improvement study of 211 primary care practices, the median time to report any baseline electronic clinical quality measure was 8.2 months. Time to report varied by measure type and practice characteristics.

Meaning: This study suggests that clinical quality measure reporting still takes a great deal of time and effort, and as the health care system increasingly moves to value-based structures that require electronic clinical quality measures, some practices may be left behind without better incentives and support.


Introduction
The Health Information Technology for Economic and Clinical Health (HITECH) Act, passed in 2009 as a part of the American Recovery and Reinvestment Act, specified general guidelines for the development and implementation of a "nationwide health information technology infrastructure." 1 Through HITECH Act initiatives, the federal government has spent significant time and money to promote widespread adoption of electronic health records (EHRs), which were intended to improve the quality, safety, efficiency, coordination, and equity of health care in the United States. 2,3 Among other purposes, EHRs were to offer a standardized platform to better demonstrate gains in these domains. A key feature of the infrastructure was the promotion of clinical quality reporting, with measures that would be collected and reported using certified EHR systems.
The increasing prevalence of EHRs has prompted the electronic extraction of clinical quality measures (eCQMs) to become the standard for quality reporting programs. Reporting burden has grown over time, with increasing requirements to report eCQMs for a variety of quality initiatives 4 and value-based reimbursement structures. 5,6 This growing burden has major implications for resource allocation: estimates suggest that the time primary care team members spend on eCQM reporting equates to billions of dollars per year. 7 Thousands of eCQMs have been developed by independent groups, with different ones used in various governmental or payer initiatives, leading to confusion and fatigue on the part of health care practices. 8-12 Furthermore, the extent to which eCQMs correspond to quality care and improved outcomes has been questioned. 13 Variable data documentation practices greatly affect data completeness and reliability. 10,12 Barriers to eCQM reporting and meaningful use of data include the time and effort required to implement reporting processes, resistance to change, limited EHR reporting functionality, costs, inflexible reporting criteria, inconsistency between measures and clinical guidelines, and vendors who were unreceptive to requests for flexible EHR configuration. 12,14 Small practices may be more likely to experience financial barriers related to EHR adoption and use. 15 Variation in the definition of measures, data sources, and data formats may limit the comparability and utility of quality measures across practices. 13 A number of efforts have aimed to reduce the burden of measure reporting on practices by increasing the adoption and meaningful use of health information technology, identifying and addressing gaps in primary care teams' data skills, focusing on measures that matter, 16 improving clarity of measure specifications, and aligning measures across settings and outcomes. 17
Professional societies have supported the use of data analytics platforms like PRIME Registry. 18-22 The EvidenceNOW Southwest (ENSW) project offered an opportunity to see whether primary care practices have developed the capacity to produce eCQMs. The ENSW project is a collaborative effort between Colorado and New Mexico covering the diverse geographic and cultural regions of both states. It is 1 of 7 regional cooperatives funded by the Agency for Healthcare Research and Quality (AHRQ) EvidenceNOW research study, which started in 2015 to help small- and medium-sized primary care practices implement the latest evidence to improve cardiovascular health. In this study, we sought to take advantage of the opportunity presented by the ENSW project's use of 4 common and standardized eCQMs to examine how quickly primary care practices could report on these eCQMs. Our hypothesis was that, nearly 10 years following the HITECH Act, many primary care practices still do not possess the skills and tools to easily meet basic eCQM reporting requirements and that practices with certain characteristics experience greater delays than others when reporting eCQMs.

Methods
Practice recruitment and selection for participation in ENSW has been described elsewhere. 23 The ENSW project and the study described in this article were approved by the Colorado Multiple Institutional Review Board and the University of New Mexico Human Research Protections Office.
The ENSW project is registered on ClinicalTrials.gov (NCT02515578). Participants completing surveys were provided written information about the study. The need to document consent was waived by the human subjects review boards because they determined that the research presented no more than minimal risk of harm to participants and involved no procedures for which written consent was required outside of the research context. All participants were provided with an informed consent document in the form of an information sheet explaining the research aims, patient rights, and potential risks. This report follows the Standards for Quality Improvement Reporting Excellence (SQUIRE) reporting guideline. 24

Measure Selection and Practice Support
The AHRQ selected the measures of aspirin use, 25 blood pressure control, 26 cholesterol management, 27 and smoking cessation 28 (ABCS) to advance heart health in alignment with the Million Hearts Campaign, 29 the National Quality Forum, and the Centers for Medicare & Medicaid Services. The AHRQ selected standard eCQM specifications for use by all practices participating in the 7 cooperatives. 30 These specifications included a 12-month measurement period for each quarterly report.
Recognizing potential challenges to eCQM reporting, ENSW provided practices, in addition to 9 months of ongoing practice transformation support from a trained practice facilitator, with support from a clinical health information technology advisor and with resources and support from the research team, which had experience collecting eCQMs. This support team assisted practices with developing and managing workflows for data collection, reporting, and analysis; helped with the entry of eCQMs into the reporting website; and linked practices with other technical assistance resources as needed and available. Practices were instructed to report their baseline ABCS eCQMs as soon as possible once their practice transformation support began.

eCQM Reporting Mechanisms
The ENSW project offered practices several options to report eCQM data to a centralized repository.
The first option allowed practices to calculate eCQM numerators and denominators using an internal EHR or registry; these data were then manually entered through an online portal. The other option allowed practices to securely transfer patient-level information to the DARTNet Institute, which calculated the eCQMs on the practices' behalf.

Time to Report
The primary outcome measure for our analyses was time to reporting. We calculated time to report as a measurement of time in days (converted to months to aid interpretability) from the date of the practice's kickoff meeting with ENSW transformation support staff to the submission of baseline eCQM data for each of the ABCS measures. Using the ENSW kickoff date ensured a discretely recorded, objective time zero uniformly available for all practices in the study.
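The day-to-month conversion described above can be sketched as follows. This is a minimal illustrative sketch, not the study's actual code; the article does not state the exact conversion factor, so an average month length of 365.25/12 days is assumed here.

```python
from datetime import date

# Assumed conversion factor: average month length in days (365.25 / 12).
DAYS_PER_MONTH = 365.25 / 12

def time_to_report_months(kickoff: date, submission: date) -> float:
    """Time from the ENSW kickoff meeting to baseline eCQM submission,
    measured in days and converted to months to aid interpretability."""
    days = (submission - kickoff).days
    return days / DAYS_PER_MONTH

# Hypothetical example: kickoff March 1, 2016; baseline eCQM submitted November 10, 2016.
months = time_to_report_months(date(2016, 3, 1), date(2016, 11, 10))
```

Using the kickoff date as time zero, as the study does, makes this computation uniform across practices regardless of when each enrolled.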

Statistical Analysis
Descriptive statistics were generated for practice characteristics (eg, frequencies, proportions, means, standard deviations). The outcome variables for all analyses were time to reporting for each eCQM and time to reporting for the first eCQM reported. Practices that had not reported by the end of the assessment period for this analysis (November 1, 2017) were censored as of that date. Practices had a minimum of 7.7 months from the time the practice first received transformation support to the end of the assessment period. Product-limit curves were generated, and the log-rank test was used to compare survival distributions across the measures. For blood pressure and cholesterol, Cox proportional hazards regression models were used to examine practice characteristics that were associated with time to reporting. Practices that dropped out within 1 month after the kickoff meeting were excluded from analysis (n = 6); practices that dropped out more than 1 month after kickoff and had not reported measures (n = 3) were censored at the time of dropout.
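The exclusion and censoring rules above can be encoded as a small helper. This is an illustrative sketch (the study's analysis was performed in SAS); the function name and the 30-day approximation of the 1-month exclusion window are assumptions for the example.

```python
from datetime import date
from typing import Optional, Tuple

# End of the assessment period for this analysis.
ASSESSMENT_END = date(2017, 11, 1)

def observation(kickoff: date,
                reported: Optional[date],
                dropout: Optional[date]) -> Optional[Tuple[int, bool]]:
    """Return (follow-up days, event flag) for one practice, or None if excluded.

    Encodes the rules stated in the Statistical Analysis section:
    - a submitted baseline eCQM is the event of interest;
    - dropout within ~1 month of kickoff -> excluded from analysis;
    - later dropout without reporting -> censored at dropout;
    - otherwise censored at the end of the assessment period.
    """
    if reported is not None:
        return ((reported - kickoff).days, True)        # event observed
    if dropout is not None:
        if (dropout - kickoff).days <= 30:
            return None                                 # excluded (early dropout)
        return ((dropout - kickoff).days, False)        # censored at dropout
    return ((ASSESSMENT_END - kickoff).days, False)     # censored at assessment end
```

Each practice then contributes one (time, event) pair to the product-limit curves and Cox models.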
Backward elimination was used to arrive at the final multivariable models, initially including all variables that were significant at P < .10 and eliminating variables 1 at a time until all were significant at P < .05. 33 The threshold for statistical significance of results was P < .05 using 2-sided tests. Because of the variable length of assessment periods across practices, sensitivity analyses were performed limiting the observation period to a maximum of 12 months to determine whether there was bias associated with the longer observation time of practices enrolled earlier. All analyses were performed using SAS statistical software version 9.4 (SAS Institute Inc).
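For readers unfamiliar with the product-limit method referenced above, a minimal pure-Python sketch of the estimator is shown here for illustration only (the study used SAS, not this code). The survival probability is the running product of (1 - d/n) over event times, where d is the number of practices reporting at that time and n is the number still at risk.

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the probability of
    *not yet reporting* an eCQM.

    times:  follow-up times (eg, months from kickoff)
    events: True if the eCQM was reported at that time, False if censored
    Returns a list of (time, survival probability) pairs at event times.
    """
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk, surv, curve = n, 1.0, []
    i = 0
    while i < n:
        t = times[order[i]]
        d = c = 0
        # Group all practices with the same follow-up time.
        while i < n and times[order[i]] == t:
            if events[order[i]]:
                d += 1      # reported at time t
            else:
                c += 1      # censored at time t
            i += 1
        if d:
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= d + c
    return curve
```

The resulting step function, computed per measure, is what the log-rank test compares across the 4 ABCS measures.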

Results
Data represent 211 enrolled practices that provided survey and eCQM data between January 1, 2015, and November 1, 2017. Table 1 details the characteristics of participating practices. Most practices (75%) were in Colorado. Practices were predominantly clinician owned (48%), located in urban or suburban areas (71%), and used at least 1 patient registry (68%) at baseline. The mean (SD) practice size was 3.5 (2.6) clinicians. Approximately 47% of practices reported participating in some type of accountable care organization (ACO). A substantial majority (85%) calculated eCQMs using their EHR or internal registry.

Time to Report
The median (interquartile range [IQR]) time to report any measure was 8.2 (4.6-11.9) months. The median (IQR) time to report varied across measures from a minimum of 7.8 (3.5-10.4) months for the blood pressure measure to a maximum of 10.5 (6.6 to >12) months for the cholesterol measure (Table 2). Few practices reported measures within 6 months, ranging from a minimum of 22.8% of practices for the cholesterol measure to a maximum of 34.6% of practices for the blood pressure measure.
Practices in the study used more than 13 different EHRs encompassing 48 different EHR versions, with the most common EHR, NextGen, used by 28% of practices. Because of the relatively small number of practices using any particular EHR, we did not assess these in Cox regression models. We provide the median time to report for EHRs used by more than 5 practices in the eTable in the Supplement.

Practice Characteristics Associated With Ability to Report Certain eCQMs
Hazard ratios (HRs) from the univariable Cox proportional hazards models are shown in Table 3 for the blood pressure and cholesterol measures. Results for the aspirin and smoking cessation measures are not presented because the patterns were similar to the blood pressure results.
Practice characteristics associated with greater ability to report eCQMs varied between the blood pressure and cholesterol measures. Earlier ability to report the blood pressure measure was associated with ownership by clinicians (HR, 1.42; 95% CI, 1.04-1.93) or hospitals (HR, 2.41; 95% CI, 1.58-3.66) vs FQHC, larger size (HR, 1.06; 95% CI, 1.01-1.12), and ACO participation (HR, 1.94).
a Unadjusted HRs are shown for any variable with P < .25, with 95% confidence limits.
b Statistically significant at P < .05.
c Cardiovascular disease registries included a count of the following registries in use at the practice: ischemic vascular disease, hypertension, high cholesterol, diabetes, prevention services, and registries for high-risk patients.A score for adoption of cardiovascular disease guidelines for prevention and management was created by counting the following activities reported by a practice: guidelines posted or distributed, clinicians agreed on guidelines, standing orders created, or electronic health record prompts for each type of guideline.

Discussion
Our study sought to examine the current capacity of primary care practices to report 4 evidence-based eCQMs. Despite nearly all participating practices using Meaningful Use-certified EHRs and the provision of dedicated health information technology support by the ENSW project, primary care practices still required a substantial amount of time and support to report even well-established eCQMs. The ability to report ABCS eCQMs varied by measure type and practice characteristics.
Our results highlight how introducing new measures increases the reporting burden on practices. All 4 measures reflected current clinical care guidelines, but the aspirin, blood pressure management, and tobacco cessation measures were more established. Their specifications had been relatively stable, and they have been used for Centers for Medicare & Medicaid Services and other quality and research programs for many years. On the other hand, the cholesterol measure was chosen to reflect a very recent update to clinical guidelines. At the start of the ENSW project, there were no nationally recognized eCQM specifications for calculating the measure. Meaningful Use certification standards did not require the measure, and no payment program used the metric.
Compared with blood pressure management, the new cholesterol measure took nearly 3 months longer for the typical practice to report (median, 7.8 vs 10.5 months).
Our results also show that certain types of practices are more capable of prompt eCQM reporting.Practices that participate in ACOs and systematically use clinical guidelines seem better prepared to report established measures like the blood pressure measure and new measures like the cholesterol measure.Hospital-owned practices more quickly reported the established measures, and FQHCs more quickly reported the new cholesterol measure.While these associations do not imply causation, it would stand to reason that some combination of a practice's skill, previous activity, payment structures, and internal and external resources are leading to the variation in the time to reporting we observed.
Previous work has described the barriers practices encounter when implementing EHRs. 35,36 Our findings indicate that these barriers do not end once EHR implementation is complete. For the practices we studied, substantial delays often continued well after the initial attempt to report measures. We found that it can take several months for a practice to produce any 1 of 4 standard measures. Implementing a new measure not previously adopted by federal programs like Meaningful Use takes practices even more time. We found a considerable time burden that health care teams face in reporting clinical quality measures, which builds on the previously reported estimate that physicians and their staff spend an average of 15 hours per week developing, collecting, and reporting external quality measures. 7 Our findings agree with others' conclusions that programs should try to align the amount and forms of health information technology support to best match practices' needs. 37 This article complements the work of Cohen et al, 14 which examined 1492 practices across the national EvidenceNOW project. Those practices reported at the outset of the project on their ability to report eCQMs and the potential barriers to their use of EHR data for quality purposes. Our article adds to their findings by detailing the actual time to reporting for more than 200 participating practices.
Our results are consistent with previous research demonstrating modest but inconsistent associations between select structural elements of primary care practices and performance on various quality measures. 38 Practice size was positively associated with the ability to report the blood pressure eCQM, which aligns with evidence that smaller practices may experience greater barriers and delays in EHR use than larger practices, 39,40 perhaps suggesting that this disparity extends to the ability to report certain measures. Our findings also complement evidence that small practices need sustained and extensive EHR support to achieve improvement in quality measures. 41 Beyond these studies, the existing literature contains little information regarding the influence of contextual details on the use of health information technology. 42 Barriers to meaningfully implementing EHRs and using EHR data are manifold: costs, lack of knowledge of EHR functions, problems transforming office operations, lack of standardization, vendor system upgrades, lack of dedicated data coordinators, staff and clinician resistance, and fatigue. 12,43,44 Reliably reporting individual measures may be further influenced by the interplay, and unpredictability, of multiple factors: that is, the "complexity of the sociotechnical networks at stake." 45 Primary care practices are complex adaptive systems, 46,47 and while our findings help identify specific practice characteristics that may be associated with quality measure reporting and performance, how these characteristics interact in any given practice is affected by the local landscape and factors beyond our ability to measure in this study. Practices using EHR data to inform quality improvement need ongoing and tailored support that can assist with addressing these complex factors. 48

Limitations
This study has several important limitations. Health information technology adoption can vary across regions and practice types, 49-52 so generalizability beyond these small- to medium-sized primary care practices in the Southwest United States may be limited. Numerous unmeasured factors may have influenced time to report, including the degree of leadership engagement in ENSW, the costs to create reports in the different eCQM production tools, competing demands, and the actual time spent trying to produce eCQMs. Measures produced with internal EHRs or registries were not independently verified for accuracy beyond basic validation checks (eg, the numerator must be less than or equal to the denominator), and many practices further refined data collection workflows and eCQM calculation processes after reporting baseline eCQMs. Reporting valid, trusted, and actionable eCQMs takes even more time and effort. The ENSW project provided practices with a significant amount of technical support to facilitate eCQM reporting, including individualized help from a clinical health information technology advisor, peer learning networks, online measurement guides, and access to technical assistance. Programs that provide less support would likely encounter greater delays in eCQM reporting.

Conclusions
Nearly a decade has passed since the HITECH Act was enacted, and our project, which focused on small- to medium-sized practices, highlights a success and a failure of that policy. Nearly all of the practices used Meaningful Use-certified EHRs. That is a major success. However, the inability to use those EHRs to quickly track and report on quality is a major failure. The ability to readily access and report trustworthy eCQM data has become an essential competency of primary care practice teams.
Beyond the external reporting requirements, practices' ability to use quality data to monitor and improve their performance is essential. Despite years of on-the-ground and systems-level work, our experience shows that eCQM reporting still takes a great deal of time and effort. As the health care system increasingly moves to value-based structures that require eCQMs, some practices may be left behind without better incentives and support. Health care leaders, policy makers, EHR vendors, and technical assistance providers should continue their efforts to reduce the burden of eCQM reporting and improve data capacity in primary care practices.

Figure
The figure plots the proportion of practices reporting each eCQM over time. Sensitivity analyses limited the maximum observation time frame to 12 months. Practices demonstrated a lower probability of reporting the cholesterol measure within the 12-month observation period (log-rank test for equality over strata: χ2 = 41.42 with 3 df; P < .001). Practices that used the DARTNet Institute reported the cholesterol measure faster (median [IQR] time to report, 7.0 [4.5-9.7] months) than practices using their own EHR (median [IQR] time to report, 8.9 [5.7-14.8] months) (log-rank P = .004). The times to report the 3 other measures were not significantly different.
The ENSW project built upon the efforts described above to reduce eCQM reporting burden by choosing a minimum number of measures known to prolong life and improve health, matching reporting specifications as closely as possible to these clinical standards, accepting a variety of data sources (eg, EHRs, patient-level extracts, third-party registries), and offering robust technical assistance through onsite clinical health information technology advisors, access to regional experts, and linkages to AHRQ's national technical assistance contractor.

JAMA Network Open | Health Informatics

Practice Characteristics and Context
Practice characteristics were obtained from the baseline ENSW Practice Survey. 32 At least 1 staff member, typically a practice administrator or lead clinician, completed the Practice Survey for each participating practice. The baseline Practice Survey gathered descriptive information on participating practices, including a series of questions about the use of strategies for improving patient care, such as quality improvement processes and patient self-management support. Practice size was defined as the number of clinicians working at the site. Practice zip code aligned to Rural-Urban Commuting Area (RUCA) codes was used to determine geographic area; we assigned practices with zip codes corresponding to RUCA codes 1 to 4 as urban or suburban and 5 to 10 as rural. 32

Practice ownership was consolidated for these analyses into 3 categories: (1) clinician (including solo or group practices); (2) hospitals and academic centers (including academic health centers, faculty practices, hospital or health system practices, or health maintenance organizations); and (3) Federally Qualified Health Centers (FQHCs) (including FQHCs, FQHC look-alike clinics, and Rural Health Clinics). Cardiovascular disease (CVD) registries included a count of the following registries in use at the practice: ischemic vascular disease, hypertension, high cholesterol, diabetes, prevention services, and registries for high-risk patients. The total number of registries was translated into an ordered categorical variable (0, 1-2, 3-4, and 5-6). A score for adoption of CVD guidelines for prevention and management was created by counting the following activities reported by a practice: guidelines posted or distributed, clinicians agreed on guidelines, standing orders created, or EHR prompts for each type of guideline. Accountable care organization (ACO) membership options included Medicaid, Medicare, private or commercial, and other. The survey also asked about major practice changes, including using a new or different EHR, moving to a new location, losing 1 or more clinicians, losing the office manager or head nurse, being purchased by or affiliating with a larger organization, or implementing a new billing system. Practices were asked if they previously participated in payment or quality demonstration programs, including a State Innovation Model initiative, Comprehensive Primary Care Initiative, Transforming Clinical Practice Initiative, Community Health Worker training program, Blue Cross/Blue Shield Patient-Centered Medical Home program, Million Hearts State Learning Collaborative, Million Hearts Cardiovascular Disease Risk Reduction Model, or other program. Previous quality reporting support options included receiving help from any health information exchange, practice-based research network, clinical data warehouse, external consulting group, health system practice network, hospital network, primary care association, or regional extension center. Practice incentive or bonus payment options included the Medicare primary care incentive payment or the Medicare care coordination payment.
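The geographic and registry codings described above can be sketched as follows. This is an illustrative sketch, not study code; the urban/suburban cut point follows the conventional RUCA mapping in which low codes (metropolitan and adjacent areas) correspond to urban or suburban settings.

```python
def geography(ruca_code: int) -> str:
    """Classify a practice's ZIP-code-level RUCA code into the two
    geographic categories used in the analysis (assumed mapping:
    codes 1-4 urban/suburban, 5-10 rural)."""
    return "urban/suburban" if 1 <= ruca_code <= 4 else "rural"

def registry_category(n_registries: int) -> str:
    """Collapse the count of CVD-related registries in use (0-6) into
    the ordered categories used in the analysis."""
    if n_registries == 0:
        return "0"
    if n_registries <= 2:
        return "1-2"
    if n_registries <= 4:
        return "3-4"
    return "5-6"
```

Collapsing the registry count into ordered categories keeps sparse cells (few practices run 5 or 6 registries) from destabilizing the regression models.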

Table 2 .
Time to Report Aspirin Prescribing, Blood Pressure Control, Cholesterol Management, and Smoking Cessation Electronic Clinical Quality Measures

Table 3 .
Univariable Analyses of Practice Characteristics Associated With Less Time to Report Certain Electronic Clinical Quality Measures

Table 4 .
Final Multivariable Models of Select Practice Characteristics Associated With Ability to Report Electronic Clinical Quality Measures
Abbreviations: HR, hazard ratio; NA, not applicable.
a Final multivariable models show Cox proportional hazards regression of select practice characteristics associated with ability to report electronic clinical quality measures.