Effect of a Documentation Improvement Program for an Academic Otolaryngology Practice | Head and Neck Cancer | JAMA Otolaryngology–Head & Neck Surgery | JAMA Network
Figure.  Documentation Flash Card

Flash cards were developed to help residents document qualifying comorbidities accurately. Contact information for our department’s Clinical Documentation Improvement specialist is obscured for privacy reasons. Comorbidities were written in black if they affected the All-Patient Refined Diagnosis Related Group, blue if they qualified as a comorbid diagnosis or complication (CC), and red if they qualified as a major comorbid diagnosis or complication (MCC). A/P indicates assessment and plan; CAD, coronary artery disease; CHF, congestive heart failure; CVA, cerebrovascular accident; D/C, discharge; DX, diagnosis; Fib, fibrillation; HNI-DRG, Head and Neck Institute diagnosis related group; H&P, history and physical; MI, myocardial infarction; POA, present on admission; TIA, transient ischemic attack.

Table 1.  Language Used in Day-to-Day Documentation
Table 2.  Common Medicare Severity Diagnosis Related Groups
Original Investigation
June 2016

Effect of a Documentation Improvement Program for an Academic Otolaryngology Practice

Author Affiliations
  • 1Head and Neck Institute, Cleveland Clinic Foundation, Cleveland, Ohio
JAMA Otolaryngol Head Neck Surg. 2016;142(6):533-537. doi:10.1001/jamaoto.2016.0194
Abstract

Importance  Physicians recognize the value of accurate documentation to facilitate patient care, communication, and the distribution of professional fees. However, the association between inpatient documentation, hospital billing, and quality metrics is less clear.

Objectives  To identify areas of deficiency in inpatient documentation and to instruct health care professionals on how to improve the quality and accuracy of clinical records.

Design, Setting, and Participants  A single-arm pre-post study was conducted from January 1, 2013, to December 31, 2014, among 17 attending and 12 resident physicians treating 1188 patients at an academic medical center. Data from 1 year prior to the intervention were compared with data for 10 months following the intervention. All increases were analyzed as a percentage increase after the intervention relative to before the intervention.

Interventions  Areas for improvement were identified, and all physicians in the department received education on inpatient coding and documentation.

Main Outcomes and Measures  The capture rate for complications or comorbidities and major complications or comorbidities, the case mix index (the average diagnosis related group relative weight for a hospital or department), and severity of illness and risk of mortality scores.

Results  A total of 1188 inpatients were included in the analysis: 743 in the preintervention period and 445 in the postintervention period. Review of our documentation identified major areas of comorbidity that were frequently underreported. Inadequate nutrition diagnoses (moderate malnutrition, severe protein-calorie malnutrition) were most often underreported. In addition, we found inadequate documentation supporting the presence of neck metastases. Among 1188 patients, the case mix index increased 5.3% (from 2.81 to 2.96) after the intervention, but this was not a statistically significant difference (P = .21). The normalized case mix index increased 21.7% (from 37.3 to 45.4; P < .01). The percentage of patients with a documented complication or comorbidity or major complication or comorbidity increased 27.1% (from 50.2% to 63.8%; P < .01). The percentage of patients assigned a severity of illness score of 3 or 4 increased 24.3% (from 34.7% to 43.0%; P < .01). The percentage of patients assigned a risk of mortality score of 3 or 4 increased 32.1% (from 18.7% to 24.7%; P = .01).

Conclusions and Relevance  After educational sessions, multiple measures of patient acuity increased significantly owing to improved documentation of common comorbid conditions. Although physicians intuitively appreciate the importance of good documentation, education on the technical aspects of coding can significantly improve the quality and accuracy of clinical records.

Introduction

Although clinically accurate documentation has always been critical to patient care, increased attention to quality and value-based care has made improvements in clinical documentation a rising priority for physicians and hospitals. Studies have demonstrated that physicians have a poor understanding of the technical aspects of documentation1 and how it affects measures of patient acuity,2 hospital quality metrics, and hospital reimbursement.1,2 This study focuses primarily on factors affecting Medicare Severity Diagnosis Related Group (MS-DRG) coding and All-Patient Refined Diagnosis Related Group (APR-DRG) coding.

Since 2008, Medicare’s Inpatient Prospective Payment System has used MS-DRG grouping to determine a hospital’s reimbursement for an inpatient encounter. Similar diagnoses and procedures are grouped under a single MS-DRG, which is then given a relative weight that reflects the “average relative-costliness of cases in that group.”3(p5) However, each primary MS-DRG is then subdivided based on the presence or absence of comorbid diagnoses and complications (CCs) and/or major comorbid diagnoses and complications (MCCs). Thus, for each base MS-DRG, relative weights within an MS-DRG doublet or triplet can be assigned, with higher relative weights reflecting the presence of diagnoses that include CCs or MCCs. The MS-DRG relative weight for each encounter is used by the Centers for Medicare & Medicaid Services to calculate hospital reimbursement. It is also used to assign a mean expected length of stay for each encounter. The case mix index (CMI) for a hospital or department is the average relative weight for that hospital or department’s inpatient encounters.
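As a minimal sketch of the definition above (using hypothetical relative weights, not values from this study), the CMI is simply the mean MS-DRG relative weight across a department's inpatient encounters:

```python
# Case mix index (CMI): the mean MS-DRG relative weight across
# inpatient encounters. All weights below are hypothetical.
def case_mix_index(relative_weights):
    """Mean relative weight over all encounters."""
    return sum(relative_weights) / len(relative_weights)

# Hypothetical encounters: each value is the relative weight of the
# MS-DRG assigned to one inpatient stay.
weights = [1.2, 2.8, 3.5, 0.9, 2.6]
print(round(case_mix_index(weights), 2))  # -> 2.2
```

A higher-weight MS-DRG within a doublet or triplet (ie, one capturing a CC or MCC) raises this mean, which is why CC/MCC capture rates feed directly into a department's CMI.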

The APR-DRG system was developed by 3M Health Information Systems and is used by some non-Medicare payers. While the MS-DRG system assigns relative weights based on the presence of secondary diagnoses that include CCs or MCCs, the APR-DRG system assigns each patient a severity of illness (SOI) and risk of mortality (ROM) score based on qualifying secondary diagnoses. Both are scored on a scale of 1 to 4, with 1 corresponding to the lowest SOI or ROM and 4 the highest.

To generate administrative and billing records, coding professionals review a patient’s medical records to identify and code primary and secondary diagnoses and assign a DRG. However, coding staff must follow strict guidelines on how clinical data can be applied and are not allowed to interpret medical data. That is, while a diagnosis may seem abundantly clear based on the results of a laboratory or imaging study, it cannot be coded as a qualifying diagnosis unless it has been interpreted and the diagnosis has been explicitly documented (Table 1).

Multiple studies have confirmed that physicians tend to underdocument comorbid conditions.1,4,5 In addition, when pressed to improve documentation, physicians often resist. Zalatimo et al state that a significant barrier to accurate documentation is a lack of interest from physicians and that physicians view coding guidelines as “clinically irrelevant and overly complex.”4(p757) Consequently, although a patient may have serious comorbidities, the diagnoses may not be reflected in his or her MS-DRG relative weight and SOI and ROM scores. Conversely, inaccurate documentation of comorbid conditions can lead to overcoding and overestimation of the severity of a patient’s diagnosis.6 Although these documentation failures may not necessarily affect the care a patient receives, they can influence the institution’s payment and quality metrics.

Our objective was to educate physicians within our department about the technical aspects of coding and billing, with the goal of improving documentation. The use of documentation improvement specialists to identify deficiencies and query health care professionals is not new. This effort is different, however, in that it was designed to identify deficiencies and proactively address them. In conjunction with colleagues across our institution, we developed and implemented a clinical documentation improvement program directed at identifying and correcting errors affecting MS-DRG and APR-DRG coding.

Methods
Opportunity Identification

The University HealthSystem Consortium is a voluntary collaboration of academic medical centers that have chosen to pool quality and accountability data. We used the CareFX tool (Harris Healthcare Solutions) to identify MS-DRG groupings related to otolaryngology in which our capture rate for CCs and MCCs was lower than the average reported by comparable institutions within the University HealthSystem Consortium. Because this was a quality assurance/improvement project, under the guidelines stipulated by the Institutional Review Board of the Cleveland Clinic Foundation, formal approval and participant consent were not requested.

After identifying MS-DRG groups in which our capture rate for CCs and MCCs was low compared with cohort institutions, medical records for patients admitted with the selected MS-DRGs were reviewed to identify undocumented or inadequately documented secondary diagnoses. The most common MS-DRGs for the preintervention and postintervention periods are listed in Table 2.

Education

After identifying initial areas of deficiency, residents and attending physicians were educated on the basics of the MS-DRG and APR-DRG systems and their relevance to billing and quality metrics. Educational sessions, led by a physician (resident or attending depending on the audience) and representatives from the coding and billing department, were conducted separately for attending and resident physicians. All health care professionals were educated on the specific inadequacies identified earlier, as well as strategies for avoiding them. Extended education was given to residents on how to report comorbidities and their management, given the level of their involvement in inpatient documentation. Two formal educational sessions for residents were held 6 months apart, in addition to brief informal educational sessions about specific improvement goals as needed.

Flash cards were made listing common comorbidities within our patient population that affect MS-DRG and APR-DRG coding. Specific phrases to describe interventions (eg, “assessed,” “monitored,” or “treated”) and necessary qualifying characteristics (eg, acute vs chronic, diastolic vs systolic) were also included (Figure).

In addition, resident leaders modified the department’s standard inpatient documentation templates to include technically appropriate terminology and to prompt accurate documentation. Specifically, selectable text was added to the assessment and plan section of the documentation template for adult patients—using the phrases and wording specified in the Figure—to facilitate accurate documentation of common comorbidities.

Monitoring and Analysis

Educational sessions were provided on February 11, 2014, establishing a cutoff point between the preintervention period (January 1, 2013, through February 11, 2014) and the postintervention period (February 12 through December 31, 2014). Ongoing monitoring was performed using the Documentation Analysis Website tool, internally developed software that automatically collates data relevant to the accuracy of documentation: patient demographics, admitting diagnosis or procedure, medical vs surgical DRG classification, and details about acuity of care, such as actual and expected length of stay, MS-DRG, SOI, and ROM.1 Because surgical DRGs have, on average, higher relative weights than medical DRGs, a normalized CMI was used to cancel out the effects that a fluctuating census of medical vs surgical patients would have when analyzing the CMI for a cohort of physicians over time. Details regarding the normalized CMI and operation of the Documentation Analysis Website tool have been published separately.1

The tool was used to identify patients for whom the actual length of stay was significantly higher than the expected length of stay (which is derived from the MS-DRG). These medical records were then examined in detail to identify opportunities to improve documentation. These opportunities were discussed at bimonthly meetings. Residents and attending physicians were given information on areas of improvement opportunity as they were identified.

To assess the efficacy of our intervention, the capture rates of CCs and/or MCCs, SOI, and ROM before the intervention were compared with the same data for the period following the intervention. Primary outcomes tracked were the percentage of patients with a secondary diagnosis including a CC or MCC and the percentage of patients with a level 3 or 4 SOI or ROM score. Patients discharged under an MS-DRG with only 2 possible values (base vs CC or MCC) were treated as CC captures.

t Tests and χ2 tests were used to compare differences in the preintervention and postintervention periods. P < .05 was considered statistically significant. Statistical testing was performed using JMP (SAS Institute Inc).
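As an illustrative sketch of the χ2 comparison (pure Python, no statistics package; the cell counts are reconstructed from the reported percentages and cohort sizes, so they are approximate):

```python
# Pearson chi-square statistic for a 2x2 contingency table,
# comparing CC/MCC capture before vs after the intervention.
def chi_square_2x2(table):
    """Return the chi-square statistic (1 df) for a 2x2 table."""
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [table[0][j] + table[1][j] for j in range(2)]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Counts reconstructed from the reported figures: 50.2% of 743
# preintervention patients and 63.8% of 445 postintervention
# patients had a documented CC or MCC (rounded to whole patients).
pre = [373, 370]   # [with CC/MCC, without]
post = [284, 161]  # [with CC/MCC, without]
stat = chi_square_2x2([pre, post])
# 6.63 is the 1-df critical value for P = .01, consistent with the
# reported P < .01 for this comparison.
print(stat > 6.63)  # -> True
```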

Results

A total of 1188 inpatients were included in the analysis: 743 in the preintervention period and 445 in the postintervention period (approximate annual rate of 510 patients). Review of our documentation identified major areas of comorbidity that were frequently underreported. Inadequate nutrition diagnoses (moderate malnutrition, severe protein-calorie malnutrition) were most often underreported. In addition, we found inadequate documentation supporting the presence of neck metastases, which is considered a CC for most MS-DRGs. This finding was of particular importance, as it reinforced the fact that documentation that was adequate for physicians—listing TNM stage, for example—is not sufficient to justify a secondary diagnosis for coding purposes.

The CMI increased from 2.81 before the intervention to 2.96 after the intervention (5.3%; P = .21). The normalized CMI increased from 37.3 to 45.4 (21.7%; P < .01). The actual length of stay increased 0.7 days (12.9%), although this increase was not statistically significant (P = .14). The expected length of stay increased 0.5 days between periods (8.6%; P = .06).

Using χ2 analysis, the percentage of patients with a documented CC or MCC increased from 50.2% before the intervention to 63.8% after the intervention (27.1% relative increase; P < .01). The percentage of patients with a diagnosis including an MCC increased from 15.2% to 20.0% (31.5% relative increase; P = .03). The percentage of patients assigned an SOI score of 3 or 4 increased from 34.7% to 43.0% (24.3% relative increase; P < .01); the percentage of those assigned a score of 4 increased from 9.8% to 15.7% (59.7% relative increase; P < .01). The percentage of patients assigned an ROM score of 3 or 4 increased from 18.7% to 24.7% (32.1% relative increase; P = .01); the percentage of those assigned a score of 4 increased from 5.3% to 8.3% (58.2% relative increase; P = .04).
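Each relative increase above is computed as (post − pre)/pre. A quick check of the arithmetic (using the rounded values reported here, so the last digit can differ slightly for some comparisons):

```python
# Relative (percentage) increase between the preintervention and
# postintervention values, as used throughout the Results section.
def relative_increase(pre, post):
    return (post - pre) / pre * 100

# Reported pre/post pairs from this study.
print(round(relative_increase(2.81, 2.96), 1))   # CMI -> 5.3
print(round(relative_increase(37.3, 45.4), 1))   # normalized CMI -> 21.7
print(round(relative_increase(50.2, 63.8), 1))   # CC/MCC capture -> 27.1
print(round(relative_increase(18.7, 24.7), 1))   # ROM 3 or 4 -> 32.1
```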

Discussion

After educational sessions on how to improve documentation, multiple measures of patient acuity increased significantly. However, the overall CMI did not change significantly between periods. The lack of a statistically significant increase in CMI likely reflects the disparate mix of MS-DRG base weights in the preintervention and postintervention periods.1 Thus, even though capture of CCs and MCCs increased in the postintervention period, because the base weights in the 2 periods differed, the CMI did not change significantly.

To assess for changes directly related to improvement of documentation (as opposed to the changing mix of base relative weights), the normalized CMI was used.1 The normalized case weight for an individual encounter is calculated by dividing the difference between the actual and minimum MS-DRG weights by the difference between the maximum and minimum weights within an MS-DRG doublet or triplet. The mean of all normalized case weights is the normalized CMI, which normalizes all base encounters to a weight of 0 and all MCC encounters to a weight of 100. The MS-DRGs with only 1 relative weight regardless of documentation of CCs and MCCs (eg, MS-DRGs 3 and 4) were excluded, as no difference can be detected despite increased accuracy of documentation.
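The calculation described above can be sketched as follows (the weight triplet is hypothetical, not taken from an actual MS-DRG table):

```python
# Normalized case weight: the position of the actual MS-DRG weight
# within its doublet/triplet, scaled so the base weight maps to 0
# and the highest (MCC) weight maps to 100.
def normalized_case_weight(actual, w_min, w_max):
    return (actual - w_min) / (w_max - w_min) * 100

def normalized_cmi(encounters):
    """Mean normalized case weight over encounters given as
    (actual, min, max) tuples; assumes min < max for each
    (MS-DRGs with a single weight are excluded, as in the text)."""
    values = [normalized_case_weight(a, lo, hi) for a, lo, hi in encounters]
    return sum(values) / len(values)

# Hypothetical triplet: weights 1.0 (base), 1.4 (CC), 2.0 (MCC).
print(normalized_case_weight(1.0, 1.0, 2.0))           # base -> 0.0
print(normalized_case_weight(2.0, 1.0, 2.0))           # MCC -> 100.0
print(round(normalized_case_weight(1.4, 1.0, 2.0), 1)) # CC -> 40.0
```

Because every base encounter scores 0 and every MCC encounter scores 100 regardless of the underlying MS-DRG, the normalized CMI isolates CC/MCC capture from shifts in the case mix itself.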

When the normalized CMI is examined, there is a significant increase between the preintervention and postintervention periods. This finding suggests that our intervention did improve the quality of our documentation.

Overall, we believe that our data reflect successful identification of, and intervention on, areas of inadequate documentation. It is possible that the observed increase in acuity markers could reflect an underlying increase in the severity of our patients’ conditions. However, the average length of stay in the postintervention period did not increase significantly. This finding suggests that the actual acuity of the patients’ conditions—judged by time spent in the hospital, a previously reported proxy for patient severity1—did not change. The increase in measures of acuity without a corresponding increase in actual length of stay suggests that the increased acuity reflects improved documentation rather than a change in the complexity of conditions among the patient population.

Our results are not an isolated finding. Resident education and use of documentation templates were shown to increase documentation of comorbid conditions, average APR-DRG severity scores, mortality risk, and the CMI in a prospective cohort of general surgery residents.7 Similar results were found with internal medicine residents for whom documentation templates increased the capture rate of CCs, MCCs, and expected mortality.8 Two recent series have shown the effectiveness of resident education, notecards, and ongoing monitoring in academic neurosurgery practices.1,4

The findings of this study have significant consequences for hospital billing and, perhaps more important, hospital quality reporting. With the rollout of value-based care and the increased emphasis on outcome-based payments, accurate documentation is necessary to ensure that hospitals are judged based on outcomes relative to underlying severity of illness. Failure to accurately document comorbid conditions will negatively affect hospital benchmarks, including the publicly reported mortality index.

Early in the DRG era, it was noted that billing data could easily be used to compare institutional quality and mortality.9 Today, derivatives of MS-DRG and APR-DRG data are used by US News & World Report to compare hospital quality,10 and the Centers for Medicare & Medicaid Services’ publicly available hospital comparison tool uses billing data to calculate and adjust parameters, such as expected mortality, for patient acuity.11

Madan12 also points out that inadequate documentation of patient severity may disqualify a patient from inpatient admission status. Admission under observation status—or delayed conversion to full inpatient status—can increase the financial burden on individual patients.

While the possibility of maximizing documentation to enhance payments is real, our study does not advocate such a practice. Although an effective documentation improvement program can increase margins per patient4 and hospital revenue, an explicit focus on maximizing documentation to enhance payment risks charges of medical fraud8 and should be avoided.2,13 The emphasis must be on medically accurate and technically sufficient documentation of patient comorbidities that reflects each patient’s acuity of care.

Fundamental to the success of our strategy was the up-front identification of high-value areas of inadequate documentation, which allowed for detailed education on a handful of correctable problems instead of general education on all possible areas for improvement. In addition, involvement and education of the residents providing day-to-day care was critical to the success of this program. Convincing residents to participate fully was achieved through education not only by coding specialists but also by trusted attending physicians (E.D.L. and R.R.L.). Education emphasized the long-term implications of documentation inadequacies for hospital and individual quality scores.

To ensure continued success of our education efforts, ongoing review of billing data using the Documentation Analysis Website tool is performed on a bimonthly basis. More important, because of resident turnover, we plan to repeat educational sessions on an annual basis as new residents join the program.

This study is limited by the single-institution, single-department nature of the intervention. In addition, we cannot definitively exclude confounding effects, such as an isolated increase in patient severity. Last, we cannot project whether this improvement will be sustained over time.

Conclusions

Although physicians intuitively appreciate the importance of good documentation, the details of technical coding are anything but intuitive. This study has implications not only for hospital reimbursement but also hospital quality and outcome measures. With an increased focus on value-based and outcome-based payments, developing and implementing a comprehensive program for improvement of documentation can substantially enhance capture of comorbidities, DRG accuracy, and measures of patient acuity.

Article Information

Accepted for Publication: January 24, 2016.

Corresponding Author: Suhael R. Momin, MD, Head and Neck Institute, Cleveland Clinic Foundation, 9500 Euclid Ave, Desk A71, Cleveland, OH 44195 (momins@ccf.org).

Published Online: April 7, 2016. doi:10.1001/jamaoto.2016.0194.

Author Contributions: Dr Momin had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: All authors.

Acquisition, analysis, or interpretation of data: Momin.

Drafting of the manuscript: Momin, Lamarre.

Critical revision of the manuscript for important intellectual content: All authors.

Statistical analysis: Momin, Lorenz.

Administrative, technical, or material support: Lorenz, Lamarre.

Study supervision: Lorenz.

Conflict of Interest Disclosures: None reported.

Previous Presentation: This study was presented at the American Head & Neck Society 2015 Annual Meeting; April 23, 2015; Boston, Massachusetts.

Additional Contributions: Telena Owens, RN, RHIT, Health Information Management, Cleveland Clinic Foundation, assisted with acquisition, analysis, or interpretation of data; critical revision of the manuscript for important intellectual content; and administrative, technical, or material support. She was not compensated for her contribution.

References
1.
Rosenbaum  BP, Lorenz  RR, Luther  RB, Knowles-Ward  L, Kelly  DL, Weil  RJ.  Improving and measuring inpatient documentation of medical care within the MS-DRG system: education, monitoring, and normalized case mix index.  Perspect Health Inf Manag. 2014;11:1c.PubMedGoogle Scholar
2.
Mendez  CM, Harrington  DW, Christenson  P, Spellberg  B.  Impact of hospital variables on case mix index as a marker of disease severity.  Popul Health Manag. 2014;17(1):28-34.PubMedGoogle ScholarCrossref
3.
Centers for Medicare & Medicaid Services, Department of Health and Human Services. Acute care hospital inpatient prospective payment system: payment system fact sheet series. https://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNProducts/downloads/AcutePaymtSysfctsht.pdf. Published April 2013. Accessed February 28, 2016.
4.
Zalatimo  O, Ranasinghe  M, Harbaugh  RE, Iantosca  M.  Impact of improved documentation on an academic neurosurgical practice.  J Neurosurg. 2014;120(3):756-763.PubMedGoogle ScholarCrossref
5.
Koshy  S.  Documentation tips for pulmonary medicine: implications for the inpatient setting.  Chest. 2012;142(4):1035-1038.PubMedGoogle ScholarCrossref
6.
Hsia  DC, Krushat  WM, Fagan  AB, Tebbutt  JA, Kusserow  RP.  Accuracy of diagnostic coding for Medicare patients under the prospective-payment system.  N Engl J Med. 1988;318(6):352-355.PubMedGoogle ScholarCrossref
7.
Grogan  EL, Speroff  T, Deppen  SA,  et al.  Improving documentation of patient acuity level using a progress note template.  J Am Coll Surg. 2004;199(3):468-475.PubMedGoogle ScholarCrossref
8.
Spellberg  B, Harrington  D, Black  S, Sue  D, Stringer  W, Witt  M.  Capturing the diagnosis: an internal medicine education program to improve documentation.  Am J Med. 2013;126(8):739-743.e1. doi:10.1016/j.amjmed.2012.11.035.PubMedGoogle ScholarCrossref
9.
Zuidema  GD, Dans  PE, Dunlap  ED.  Documentation of care and prospective payment: one hospital’s experience.  Ann Surg. 1984;199(5):515-521.PubMedGoogle ScholarCrossref
10.
Olmsted MG, Geisen E, Murphy J, Bell D, Morley M. Methodology: U.S. News & World Report best hospitals 2014-15. http://www.usnews.com/pubfiles/BH_2014_Methodology_Report_Final_Jul14.pdf. Published July 14, 2014. Accessed February 28, 2016.
11.
Centers for Medicare & Medicaid Services. Hospital compare. https://www.medicare.gov/hospitalcompare/search.html. Updated December 10, 2015. Accessed February 28, 2016.
12.
Madan  S.  Education program during residency to improve documentation.  Am J Med. 2014;127(1):e5. doi:10.1016/j.amjmed.2013.08.029.PubMedGoogle ScholarCrossref
13.
Silverman  E, Skinner  J.  Medicare upcoding and hospital ownership.  J Health Econ. 2004;23(2):369-389.PubMedGoogle ScholarCrossref