Assessment of Accuracy and Usability of a Fee Estimator for Ambulatory Care in an Integrated Health Care Delivery Network

IMPORTANCE Given increased enrollment in high-deductible health insurance plans and mandates from the Patient Protection and Affordable Care Act, individualized price transparency tools are needed. OBJECTIVE To assess accuracy and initial user experience of a cost estimation tool for ambulatory procedures delivered via an online patient portal and informed by real-time data feeds from third-party payers. DESIGN, SETTING, AND PARTICIPANTS This quality improvement study included patients aged 18 years and older at an integrated health care system in Northern California. Data from patients who used the cost estimator tool from August 21, 2018, to April 9, 2019, and who had matching explanation of benefits statements were used to assess accuracy of the tool. User experience was assessed with a brief survey completed online or via postal mail. Data were analyzed from April 15, 2019, to October 11, 2019.


Introduction
Health care has been slow to embrace price transparency. The Patient Protection and Affordable Care Act mandates price transparency, and a new Centers for Medicare & Medicaid Services rule requires hospitals to publish their fee schedule on their websites. 1 This rule has led to widespread comment that it fails to achieve the objective of genuine price transparency because of wide variation in patients' yearly deductibles, coinsurance, and contracted rates with health insurers relative to list prices. 2 Recent estimates indicate that 46% of individuals in the United States younger than 65 years are covered through high-deductible health plans. 3 Continued growth of these plans underscores the importance of effective tools that enable patients to compare prices and quality of services from different health care networks. Tools are available from commercial companies (eg, FairHealth), 4 public websites (eg, CMS Hospital Compare), and employers and health insurance plans, 5 but overall uptake remains low. 6,7 Barriers to use include lack of knowledge, poor user interfaces, and inaccurate estimations. 7 Easy-to-find, user-friendly, personalized cost estimation tools that account for patients' insurance coverage are essential to address unmet needs.
To our knowledge, this is the first study to report on the development, accuracy, and initial user experiences of a cost estimator tool for ambulatory procedures delivered via an online patient portal and informed by real-time data feeds from third-party payers.

Methods
The Sutter Health institutional review board approved this study. Survey respondents provided written informed consent. A waiver of consent was obtained to use the data from cost estimator tool users who did not respond to the survey owing to the minimal risk of the study. This study is reported following the Standards for Quality Improvement Reporting Excellence (SQUIRE), American Association for Public Opinion Research (AAPOR), and Standards for Reporting Qualitative Research (SRQR) reporting guidelines.
Sutter Health is an integrated health care system in Northern California serving more than 3 million patients. An early adopter of patient-centered approaches to health care delivery, Sutter Health was the first health care system in the nation to implement MyChart, Epic Systems' vendor-based patient portal, branded as My Health Online (MHO). 8,9 As of 2018, approximately 79% of Sutter Health ambulatory care patients were enrolled in MHO. 10 Sutter Health created the Consumer Accessible Fee Estimation initiative to provide price transparency for patients to use on their own initiative. The initial cost estimator tool implementation includes 220 common services from Sutter Health's top 10 insurance payers by volume. The patient can search for services by key words, by Current Procedural Terminology code, or by choosing from a list within categories. These categories include immunizations and vaccines, laboratory tests, heart or lung tests, office visits, specialist consultations, and imaging services (eg, magnetic resonance imaging, computed tomography scans, mammography, radiographic imaging, ultrasonographic imaging). When a patient initiates a query about a Current Procedural Terminology code or selects from an established estimate template, the service cost is calculated from the fee schedule in the electronic health record system. The contracted rate is calculated from historical data or contracts held by the system. Final patient responsibility is calculated leveraging service cost, contracted rate, and a personalized out-of-pocket cost calculation inclusive of copay, coinsurance, and deductibles, derived from a real-time query sent to the payer. Development of this tool required a total of 7000 hours during an 18-month period by 7 full-time employees.
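The calculation described above can be sketched in simplified form. This is an illustrative sketch only: the function name, parameter names, and the ordering of deductible, coinsurance, and copay steps are assumptions for one common benefit design, not Sutter Health's actual implementation.

```python
# Illustrative sketch of combining a contracted rate with a personalized
# out-of-pocket calculation (copay, coinsurance, remaining deductible).
# All names and the ordering of steps are assumptions, not the tool's
# actual logic.

def estimate_patient_responsibility(
    contracted_rate: float,       # payer-negotiated allowed amount for the service
    remaining_deductible: float,  # from the real-time payer eligibility query
    coinsurance_rate: float,      # e.g., 0.20 for 20%
    copay: float,                 # fixed per-service amount, if the plan uses one
) -> float:
    """Estimate out-of-pocket cost for one service under one common benefit design."""
    # Portion of the allowed amount that first goes toward the deductible.
    deductible_portion = min(contracted_rate, remaining_deductible)
    # Coinsurance applies to the remainder after the deductible is met.
    coinsurance_portion = (contracted_rate - deductible_portion) * coinsurance_rate
    return round(deductible_portion + coinsurance_portion + copay, 2)

# Example: $400 contracted rate, $150 of deductible left, 20% coinsurance,
# no copay -> 150 + (250 * 0.20) = $200.00
print(estimate_patient_responsibility(400.0, 150.0, 0.20, 0.0))
```

Real benefit designs vary widely (eg, copays that replace coinsurance, out-of-pocket maximums), which is part of why the mapping work described below was so labor-intensive.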
The primary challenge in development was data mapping. Eligibility inquiries are exchanged among health care practitioners, insurers, and other health care adjudication processors using the 270 Transaction Set, while the 271 Transaction Set is the response mechanism for these inquiries. 11 In addition to current enrollment status, these transaction sets also contain fields that include deductibles, copays, and coinsurance.
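To make the 271 fields concrete, the toy parser below pulls deductible, copay, and coinsurance values out of EB (eligibility/benefit) segments. The sample string and field positions follow the common EB-segment layout (EB01 = benefit code, EB07 = monetary amount, EB08 = percentage), but real X12 271 handling is far more involved; this is a simplified sketch, not production EDI code.

```python
# Toy illustration of extracting the 271 response fields the article mentions
# (deductible, copay, coinsurance) from EB segments. Simplified sketch only.

# EB01 benefit-information codes relevant to cost estimation
EB_CODES = {"B": "copay", "A": "coinsurance", "C": "deductible"}

def parse_eb_segments(raw_271: str) -> dict:
    """Pull copay/coinsurance/deductible values out of EB segments."""
    benefits = {}
    for segment in raw_271.split("~"):           # "~" terminates X12 segments
        fields = segment.split("*")              # "*" separates elements
        if fields[0] == "EB" and len(fields) > 1 and fields[1] in EB_CODES:
            name = EB_CODES[fields[1]]
            if name == "coinsurance" and len(fields) > 8 and fields[8]:
                benefits[name] = float(fields[8])  # EB08: percent as a decimal
            elif len(fields) > 7 and fields[7]:
                benefits[name] = float(fields[7])  # EB07: monetary amount
    return benefits

sample = "EB*C*IND*30***23*500~EB*A*IND*30*****.2~EB*B*IND*30***27*25"
print(parse_eb_segments(sample))
# -> {'deductible': 500.0, 'coinsurance': 0.2, 'copay': 25.0}
```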
However, some of these values (eg, copays and deductibles) are service-dependent and can thus be applied to encounters differently based on payer, plan, visit type, or service type. An additional source of variation is that while Electronic Data Interchange standards do require discrete data fields within a defined record format, they do not mandate the standard values for those fields. For example, one organization may categorize a procedure as "radiology services" while a different receiving organization may classify that same procedure as "imaging services." To ensure correct data mapping, Sutter Health information analysts needed to analyze each transaction set for all 220 services for all payers, plans, and benefits to determine various irregularities and to appropriately account for these within the cost estimator.
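The normalization work this implies can be sketched as a payer-specific lookup table mapping each payer's labels onto one internal vocabulary. The payer names and category labels below are invented for illustration; the actual mapping built by the analysts covered all 220 services across every payer, plan, and benefit.

```python
# Sketch of a normalization table: EDI record formats are standard, but the
# values inside them are not, so each payer's labels must be mapped onto one
# internal vocabulary. Payer names and labels here are hypothetical.

PAYER_CATEGORY_MAP = {
    ("payer_a", "radiology services"): "imaging",
    ("payer_b", "imaging services"): "imaging",
    ("payer_a", "office visit"): "office_visit",
    ("payer_b", "professional visit"): "office_visit",
}

def normalize_category(payer: str, label: str) -> str:
    """Map a payer-specific benefit label to the internal category."""
    key = (payer.lower(), label.lower())
    if key not in PAYER_CATEGORY_MAP:
        # Unmapped values are flagged for analyst review rather than guessed.
        raise KeyError(f"unmapped category {label!r} for {payer!r}")
    return PAYER_CATEGORY_MAP[key]

print(normalize_category("Payer_A", "Radiology Services"))  # -> imaging
```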
An additional challenge in the cost estimator tool development concerns data aggregation. Health care transactions are typically processed by myriad local, regional, and national clearinghouses. As transactions are transferred from vendor to vendor, they are repeatedly collected, reformatted, and transmitted, which increases costs, delays, and errors. Senior leaders at Sutter Health invested in building collaborative relationships and data sharing arrangements with industry partners. Sutter Health partnered with several third-party insurance payers and an eligibility vendor to facilitate the information exchange necessary to develop fee estimates. These partnerships enabled the data flows necessary to ensure accurate data mapping for the cost estimator tool.
The patient cost estimator tool launched on June 1, 2018. We present the overall number of estimates from initiation through April 30, 2019, and we present accuracy rates through April 9, 2019.
We designed a short, 16-question, self-administered patient survey to understand patient experience with the cost estimator regarding overall usability, satisfaction, loyalty, and suggestions for improvements. We did not pilot test the survey because several questions came from existing validated measures. [12][13][14] Our nonprobability sample included a subset of users: current Sutter Health patients aged 18 years and older who had matching explanation of benefits (EOB) statements from August 21, 2018, to April 9, 2019. Our sample included only those with an EOB because we wanted to verify the accuracy of the tool estimates. Claims can be submitted to the payer for reimbursement only if the service is performed and completed. Once the claim is processed by the payer, an EOB is received and loaded into the Sutter Health electronic health record system, where it can be compared with the estimate. We sent the study invitation and a link to an online REDCap survey (Vanderbilt University) through MHO to eligible patients. After 2 weeks, we mailed nonrespondents a paper version of the survey. Respondents who completed the survey received a $20 gift card. Data collection occurred from April 16, 2019, to June 10, 2019.

Statistical Analysis
Descriptive statistics are presented for survey data. Survey respondents reported their race/ethnicity so that potential differences in estimate accuracy across groups could be assessed. We defined accuracy as a difference between the estimate and the billed amount of less than $10 or 5%. The Fisher exact test was used to compare categorical variables, with 2-tailed α set at .05. Data were analyzed using SAS version 9.

A small minority of respondents contacted their clinician about the estimate to discuss potential options and verify cost (Table 2).
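The accuracy criterion can be expressed directly. One reading choice below is an assumption: the text does not state what the 5% is relative to, and this sketch treats it as relative to the billed amount.

```python
# The stated accuracy criterion: an estimate counts as accurate when it is
# within $10, or within 5%, of the billed amount. Interpreting the 5% as
# relative to the billed amount is our reading, not spelled out in the text.

def is_accurate(estimate: float, billed: float) -> bool:
    diff = abs(estimate - billed)
    return diff < 10.0 or (billed > 0 and diff / billed < 0.05)

print(is_accurate(205.0, 200.0))    # $5 off            -> True
print(is_accurate(1030.0, 1000.0))  # 3% off            -> True
print(is_accurate(150.0, 100.0))    # $50 and 50% off   -> False
```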

JAMA Network Open | Health Policy
In response to what patients liked about the cost estimator tool, 53 participants (42.4%) reported being able to anticipate costs (Table 3). One patient wrote, "It [the cost estimator tool]…gave a quick estimate of the cost of a service which I had never had before. It's a relief knowing in advance how much you will be spending on healthcare." Thirty-five respondents (28.0%), regardless of whether their estimates were accurate, responded that the tool was easy to use and had a straightforward and intuitive interface.
The most frequently received improvement suggestion, given by 35 respondents (28.0%), was to expand the procedures and services available for estimates. However, 26 respondents (20.8%) expressed that no changes were needed because the tool worked as promised. One patient indicated that "At this time, I feel there are no improvements needed. The estimator gave me the approximate cost of my ultrasound which helped me in my financial planning." Other suggestions for improvement included providing additional financial information, such as coinsurance and deductibles.

Discussion
In this quality improvement study, the patient cost estimator tool delivered more than 3500 online cost estimates in a 10-month period. For individuals with an EOB, the tool provided estimates with 83.9% accuracy. Initial survey results found that most respondents felt favorable about the overall experience, would use the tool again, would recommend it to others, and had an improved perception of their care at Sutter Health. A small minority of individuals contacted their clinician about estimates, suggesting that many consumers understand that their clinician does not have access to complex insurance information. 15 While more than half of participants indicated the importance of knowing potential costs, few ultimately changed their decision because of the cost estimate. This is similar to the finding of a 2016 study 6 that there was no association between price transparency tool use and outpatient cost savings. These findings suggest that price transparency will not discourage patients from receiving health care services and that federal policies to increase price transparency alone may not be sufficient to deliver significant cost savings. However, while respondents did not report changing their health care decisions, they did report benefiting from the tool in financial planning. Thus, the overall value of the cost estimator to patients could be more difficult to measure, as they may be using other strategies, such as delaying care, placing money into a health savings account, or obtaining services elsewhere.

Developing a useful and accurate fee estimation tool is not trivial. Traditional fee estimation tools allow patients to query fee schedules but are not informed by the patient's individualized health care benefit design.
Despite Sutter Health collaborating with industry partners, our findings suggest that opportunities to further improve the quality of fee estimates exist, particularly with reduction of the variation of data sent by payers, access to a wider number of procedures, and improved timeliness of data updates to inform estimates.

Limitations
This study has several limitations. This study included a small subset of the overall tool users (<10% of the total estimates created) who had an EOB. Of individuals invited to participate, a modest sample of tool users responded to the survey, and we do not know about the experience of nonrespondents or individuals who were not invited to complete the survey because we did not have an EOB for them. Additionally, participant responses are subject to recall bias because it may have been months since they used the cost estimator. The degree to which our experiences are generalizable to other health systems is not clear.

Conclusions
While others have reported on the development and use of cost estimator tools, 7,15,16 to our knowledge, this is the first study to report on the accuracy and initial user experience of a cost estimator tool for ambulatory procedures delivered via an online patient portal and informed by real-time data feeds from third-party payers.