Association of the Presence of Trainees With Outpatient Appointment Times in an Ophthalmology Clinic | JAMA Ophthalmology
Figure. Mean Length of Physician Appointment Times

A, Mean times for physician appointments without trainees in sessions without trainees (controls). Dotted line indicates grand mean appointment time (91.4 min). B, Difference in mean appointment time when trainees were present vs not present. Dotted line indicates overall mean difference (15.9 additional min). Darker areas represent physicians who were faster with trainees present.

Table 1. Characteristics of Faculty Physicians and Trainees
Table 2. Characteristics of Study Appointments
Table 3. Mean Appointment Times by Trainee Group
Table 4. Model Effects of Trainees on Appointment Time
References
1. Blumenthal D, Collins SR. Health care coverage under the Affordable Care Act—a progress report. N Engl J Med. 2014;371(3):275-281.
2. Hu P, Reuben DB. Effects of managed care on the length of time that elderly patients spend with physicians during ambulatory visits: National Ambulatory Medical Care Survey. Med Care. 2002;40(7):606-613.
3. Blumenthal D, Abrams MK. Tailoring complex care management for high-need, high-cost patients. JAMA. 2016;316(16):1657-1658.
4. Chiang MF, Boland MV, Margolis JW, Lum F, Abramoff MD, Hildebrand PL; American Academy of Ophthalmology Medical Information Technology Committee. Adoption and perceptions of electronic health record systems by ophthalmologists: an American Academy of Ophthalmology survey. Ophthalmology. 2008;115(9):1591-1597.
5. Boland MV, Chiang MF, Lim MC, et al; American Academy of Ophthalmology Medical Information Technology Committee. Adoption of electronic health records and preparations for demonstrating meaningful use: an American Academy of Ophthalmology survey. Ophthalmology. 2013;120(8):1702-1710.
6. Babbott S, Manwell LB, Brown R, et al. Electronic medical records and physician stress in primary care: results from the MEMO Study. J Am Med Inform Assoc. 2014;21(e1):e100-e106.
7. Chiang MF, Read-Brown S, Tu DC, et al. Evaluation of electronic health record implementation in ophthalmology at an academic medical center (an American Ophthalmological Society Thesis). Trans Am Ophthalmol Soc. 2013;111:70-92.
8. Oberlander J, Laugesen MJ. Leap of faith—Medicare’s new physician payment system. N Engl J Med. 2015;373(13):1185-1187.
9. Clough JD, McClellan M. Implementing MACRA: implications for physicians and for physician leadership. JAMA. 2016;315(22):2397-2398.
10. Blumenthal D, Abrams M, Nuzum R. The Affordable Care Act at 5 years. N Engl J Med. 2015;372(25):2451-2458.
11. Bestvater D, Dunn EV, Nelson W, Townsend C. The effects of learners on waiting times and patient satisfaction in an ambulatory teaching practice. Fam Med. 1988;20(1):39-42.
12. Williams KA, Chambers CG, Dada M, Hough D, Aron R, Ulatowski JA. Using process analysis to assess the impact of medical education on the delivery of pain services: a natural experiment. Anesthesiology. 2012;116(4):931-939.
13. Gamble JG, Lee R. Investigating whether education of residents in a group practice increases the length of the outpatient visit. Acad Med. 1991;66(8):492-493.
14. Hribar MR, Read-Brown S, Reznick L, et al. Secondary use of EHR timestamp data: validation and application for workflow optimization. AMIA Annu Symp Proc. 2015;2015:1909-1917.
15. Department of Health and Human Services, Centers for Medicare & Medicaid Services, Medicare Learning Network. Evaluation and management services. 2016. https://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNProducts/Downloads/eval-mgmt-serv-guide-ICN006764.pdf. Accessed February 16, 2017.
16. West BT, Welch KB, Galecki AT. Linear Mixed Models: A Practical Guide Using Statistical Software. 2nd ed. Ann Arbor, MI: CRC Press; 2014.
17. R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2014. http://www.R-project.org. Accessed October 3, 2016.
18. Bates D, Mächler M, Bolker B, Walker S. Fitting linear mixed-effects models using lme4. J Stat Softw. 2015;67(1):1-48.
19. Hothorn T, Bretz F, Westfall P. Simultaneous inference in general parametric models. Biom J. 2008;50(3):346-363.
20. Accreditation Council for Graduate Medical Education. ACGME common program requirements. 2017. http://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/CPRs_2017-07-01.pdf. Accessed October 4, 2017.
21. Irby DM, Wilkerson L. Teaching when time is limited. BMJ. 2008;336(7640):384-387.
22. Williams KA, Chambers CG, Dada M, et al. Applying JIT principles to resident education to reduce patient delays: a pilot study in an academic medical center pain clinic. Pain Med. 2015;16(2):312-318.
23. Young T, Brailsford S, Connell C, Davies R, Harper P, Klein JH. Using industrial processes to improve patient care. BMJ. 2004;328(7432):162-164.
24. Xi W, Dalal V. Impact of family medicine resident physicians on emergency department wait times and patients leaving without being seen. CJEM. 2015;17(5):475-483.
25. Lee BW, Murakami Y, Duncan MT, et al. Patient-related and system-related barriers to glaucoma follow-up in a county hospital population. Invest Ophthalmol Vis Sci. 2013;54(10):6542-6548.
26. McMullen M, Netland PA. Wait time as a driver of overall patient satisfaction in an ophthalmology clinic. Clin Ophthalmol. 2013;7:1655-1660.
Original Investigation
January 2018

Association of the Presence of Trainees With Outpatient Appointment Times in an Ophthalmology Clinic

Author Affiliations
  • 1Department of Ophthalmology, Casey Eye Institute, Oregon Health & Science University, Portland
  • 2Department of Medical Informatics and Clinical Epidemiology, Oregon Health & Science University, Portland
JAMA Ophthalmol. 2018;136(1):20-26. doi:10.1001/jamaophthalmol.2017.4816
Key Points

Question  What is the association between the presence of trainees and appointment time in outpatient ophthalmology clinics?

Findings  In this single-center cohort study of 49 448 outpatient ophthalmology appointments by 33 attending physicians, appointments with residents and fellows were 32% and 30% longer, respectively, than appointments without trainees. The presence of a trainee in a clinic session was associated with longer mean appointment times, even in appointments in which the trainee was not present.

Meaning  Academic medical centers face potential challenges in maintaining clinical efficiency and medical education, particularly in emerging value-based reimbursement models.

Abstract

Importance  Physicians face pressure to improve clinical efficiency, particularly with electronic health record (EHR) adoption and gradual shifts toward value-based reimbursement models. These pressures are especially pronounced in academic medical centers, where delivery of care must be balanced with medical education. However, the association of the presence of trainees with clinical efficiency in outpatient ophthalmology clinics is not known.

Objective  To quantify the association of the presence of trainees (residents and fellows) and efficiency in an outpatient ophthalmology clinic.

Design, Setting, and Participants  This single-center cohort study was conducted from January 1 through December 31, 2014, at an academic department of ophthalmology. Participants included 49 448 patient appointments with 33 attending physicians and 40 trainees.

Exposures  Presence vs absence of trainees in an appointment or clinic session, as determined by review of the EHR audit log.

Main Outcomes and Measures  Patient appointment time, as determined by time stamps in the EHR clinical data warehouse. Linear mixed models were developed to analyze variability among clinicians and patients.

Results  Among the 33 study physicians (13 women [39%] and 20 men [61%]; median age, 44 years [interquartile range, 39-53 years]), appointments with trainees were significantly longer than appointments in clinic sessions without trainees (mean [SD], 105.0 [55.7] vs 80.3 [45.4] minutes; P < .001). The presence of a trainee in a clinic session was associated with longer mean appointment time, even in appointments for which the trainee was not present (mean [SD], 87.2 [49.2] vs 80.3 [45.4] minutes; P < .001). Among 33 study physicians, 3 (9%) had shorter mean appointment times when a trainee was present, 1 (3%) had no change, and 29 (88%) had longer mean appointment times when a trainee was present. Linear mixed models showed the presence of a resident was associated with a lengthening of appointment time of 17.0 minutes (95% CI, 15.6-18.5 minutes; P < .001), and the presence of a fellow was associated with a lengthening of appointment time of 13.5 minutes (95% CI, 12.3-14.8 minutes; P < .001).

Conclusions and Relevance  Presence of trainees was associated with longer appointment times, even for patients not seen by a trainee. Although numerous limitations to this study design might affect the interpretation of the findings, these results highlight a potential challenge of maintaining clinical efficiency in academic medical centers and raise questions about physician reimbursement models.

Introduction

Physicians are being pressured to treat more patients in less time owing to ongoing concerns about the cost and accessibility of health care.1-3 At the same time, physicians are concerned that the widespread adoption of electronic health records (EHRs) has adversely affected their clinical efficiency.4-6 For example, at Oregon Health & Science University (OHSU), Portland, which successfully adopted an institution-wide EHR in 2006, ophthalmologists saw 3% fewer patients after adoption and spent more than 40% more time per appointment.7 Furthermore, with a gradual shift toward value-based reimbursement models that measure the quality and cost of health care delivery, clinical efficiency is an increasing concern for physicians who work in outpatient settings.3,8-10

The challenge of improving clinical efficiency is particularly pronounced in academic medical centers, where delivery of care must be balanced with providing education. However, few studies have examined the association of graduate medical education with clinical efficiency and outpatient clinic workflow. Published studies have been limited in size and have had conflicting findings. For example, 2 studies11,12 found that the presence of trainees was associated with improved clinical efficiency by shortening patient wait times, whereas another study13 found that trainees were associated with lengthened patient wait times and overall appointment times. This issue has implications for clinical efficiency and physician reimbursement models, particularly in academic medical centers.

To address this gap in knowledge, this study examines the association of the presence of trainees with appointment time in outpatient ophthalmology clinics at an academic medical center. Using 1 year of data from 33 faculty physicians involved with 40 trainees, we examined the association of the presence of trainees with appointment times on a larger scale, to our knowledge, than previous studies. Because ophthalmology is a high-volume outpatient specialty with medical and surgical elements, we believe it is a good study domain from which to draw conclusions that can generalize to other fields.

Methods
Study Setting

Casey Eye Institute is the Department of Ophthalmology at OHSU, an academic medical center in Portland, and includes more than 50 faculty clinicians who perform more than 115 000 outpatient examinations annually. The department provides primary eye care and serves as a major tertiary referral center in the Pacific Northwest and nationally. The department typically has 15 residents and 10 fellows per academic year. This study was approved by the institutional review board at OHSU, which waived the need for informed consent for this review of EHRs.

An institutionwide EHR system (EpicCare; Epic Systems) has been implemented throughout OHSU. This vendor develops software for midsize and large medical practices and is a market share leader among large hospitals. All ophthalmologists at OHSU have been using this EHR since 2006. All ambulatory practice management, clinical documentation, order entry, medication prescribing, and billing tasks are performed using components of the EHR.

Appointment Data Set

For this study, we used data from appointments with attending faculty physicians in ophthalmology from January 1 through December 31, 2014. We defined trainees as residents or clinical fellows in ophthalmology. We included appointments for only a set of faculty physicians who worked at OHSU for at least 6 months before and after the study period and had appointments with and without trainees. These inclusion criteria were intended to minimize bias from physicians with increasing or decreasing clinical practices or from those who worked exclusively with or without trainees. Physicians who did not use the EHR or who had fewer than 3 appointments with trainees during the study period were excluded. Demographics for study physicians and trainees (sex, age, and ophthalmic subspecialty) were gathered using publicly available data sources.

For each of the study physicians’ clinical appointments, we determined the length of appointment time, presence or absence of a trainee, and billing level of the appointment by querying OHSU’s clinical data warehouse.7 We calculated the length of appointment time by subtracting the encounter check-in time from the checkout time. We determined the trainee presence based on a method validated in a previous study using audit log entries14: a trainee was considered to be present during an appointment if they used the EHR for that patient for more than 2 minutes during that patient’s appointment. This cutoff was chosen to account for appointments in which trainees briefly checked the EHR for that patient but were not truly involved in the appointment. To validate this cutoff method, we examined 50 appointments and found that this method had 97% specificity and 100% sensitivity for identifying appointments in which trainees were involved compared with manual medical record review. Sessions with trainees were defined as clinic sessions in which at least 1 appointment had trainees present. Appointments were excluded if they were missing a check-in or checkout time, if multiple trainees were present, or if a nonphysician student used the EHR during the appointment.
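The two derivations above (appointment length from check-in and checkout time stamps, and trainee presence from cumulative audit-log activity over a 2-minute cutoff) can be sketched as follows. This is an illustrative Python sketch, not the authors' actual pipeline (the study queried an EHR clinical data warehouse); the function names and the timestamp format are assumptions.

```python
from datetime import datetime, timedelta

# Trainee EHR use at or below this total is treated as incidental
# (briefly checking the chart) rather than true involvement.
CUTOFF = timedelta(minutes=2)

def appointment_length_minutes(check_in: str, check_out: str) -> float:
    """Appointment time = encounter checkout time minus check-in time, in minutes."""
    fmt = "%Y-%m-%d %H:%M"
    delta = datetime.strptime(check_out, fmt) - datetime.strptime(check_in, fmt)
    return delta.total_seconds() / 60.0

def trainee_present(audit_intervals) -> bool:
    """Classify a trainee as present if their total EHR use for this patient
    during the appointment exceeds the 2-minute cutoff.

    audit_intervals: list of (start, end) datetime pairs taken from the
    EHR audit log for that trainee and patient."""
    total = sum((end - start for start, end in audit_intervals), timedelta())
    return total > CUTOFF
```

Appointments missing either time stamp would simply be dropped before this step, matching the exclusion criteria described above.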

To account for the possibility that trainees may tend to examine more complex patients on average in the study analysis, we assigned a billing level category to each appointment with the rationale that this category is a proxy for patient complexity.15 With input from the OHSU ophthalmology billing manager, we grouped billing codes into the following 3 levels: low (levels 1-2 office visits, preoperative and postoperative appointments, brief or intermediate cosmetic evaluations, vision examinations, and special procedures), medium (level 3 office visits, comprehensive cosmetic evaluations, intermediate or established comprehensive eye examinations, and refractive surgery consults), or high (levels 4-5 office visits, examinations that include treatment, and new comprehensive examinations).
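The three-level grouping reduces to a lookup from billing category to complexity level. A minimal sketch follows; the keys are illustrative labels, since the paper describes the categories rather than listing specific billing codes.

```python
# Hypothetical mapping from billing category to complexity proxy,
# following the low/medium/high grouping described in the text.
BILLING_LEVEL = {
    "office_visit_1": "low", "office_visit_2": "low",
    "postop_visit": "low", "vision_exam": "low",
    "office_visit_3": "medium", "established_comprehensive": "medium",
    "refractive_surgery_consult": "medium",
    "office_visit_4": "high", "office_visit_5": "high",
    "new_comprehensive": "high",
}

def complexity(billing_category: str) -> str:
    """Return the complexity level for a billing category, or 'unknown'."""
    return BILLING_LEVEL.get(billing_category, "unknown")
```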

Statistical Analysis

The mean (SD) lengths of appointment times with vs without trainees present were compared using the unpaired 2-tailed Welch t test. To address the possibility that summary statistics alone may be misleading in a diverse and longitudinal data set, we used linear mixed models to account for each physician’s and patient’s variability.16 In the model, appointment time was the response, presence of trainees was the fixed effect, and patients and physicians were the random effects. After confirming a significant interaction between presence of trainees and patient billing level with a type II Wald χ2 test (P < .001), the model was run a second time with patient billing level as a fixed interacting term to account for appointment complexity. P values were obtained from the resulting multiple comparisons and adjusted using the Holm-Bonferroni method.
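For the first comparison, the unpaired Welch t statistic and its Welch-Satterthwaite degrees of freedom can be computed directly from the two samples. The study itself used R (lme4 and multcomp); the sketch below is an illustrative Python equivalent of only the Welch test, omitting the mixed-model and multiple-comparison steps.

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Unpaired two-sample Welch t statistic and Welch-Satterthwaite
    degrees of freedom (variances not assumed equal)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances (n - 1)
    se2 = va / na + vb / nb                          # squared standard error of the difference
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df
```

A 2-tailed P value would then come from the t distribution with `df` degrees of freedom (e.g., via scipy.stats in practice).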

For unadjusted P values from the Welch t tests and adjusted P values from multiple comparisons, significance was defined as P < .05. All data processing and statistical calculations were conducted in R software (version 3.3.1),17 models were constructed using lme4 (version 1.1-12),18 and P values were calculated using the glht function from the multcomp package (version 1.4-6).19

Results
Appointment Data Set

Data regarding the 33 study physicians are shown in Table 1. The median age was 44 years (interquartile range [IQR], 39-53 years); 13 (39%) were women, and 20 (61%) were men. Twelve ophthalmic subspecialties were represented, with the largest numbers of physician specialties in glaucoma, pediatrics, and retina (4 each). During the study period, 20 residents and 20 fellows were present. The 33 study physicians had 60 117 appointments, of which 10 475 (17.4%) were excluded. Among the remaining 49 448 appointments that met inclusion criteria, 11 934 (24.1%) involved a trainee directly. Appointments included 14 554 high, 19 977 medium, and 14 917 low billing levels. These appointments occurred during 5861 half-day clinic sessions, 2311 (39.4%) of which were sessions with trainees (Table 2).

Association of the Presence of Trainees and Appointment Length

Sessions were divided into those with vs without trainees. Table 3 summarizes the mean length of appointment time based on the following 4 groups: (1) appointments without trainees (in sessions without trainees), (2) appointments without trainees (in sessions with trainees), (3) appointments with fellows, and (4) appointments with residents. The first group served as the control group for comparison.

Appointments with trainees were significantly longer than appointments in clinic sessions without trainees (mean [SD], 105.0 [55.7] vs 80.3 [45.4] minutes; P < .001). Across all billing levels, the mean appointment time was longer when a trainee was present (Table 3). The presence of a trainee in a clinic session was associated with a longer mean appointment time even in appointments for which the trainee was not present (mean [SD], 87.2 [49.2] vs 80.3 [45.4] minutes; P < .001).

Differences Among Physicians

Part A of the Figure shows the distribution of the 33 study physicians’ mean appointment times for the control group (appointments in sessions without trainees). Among all clinicians, the grand mean appointment time was 91.4 minutes, with an IQR of 74.1 to 102.6 minutes. Part B of the Figure presents the distribution of the difference between mean appointment lengths with vs without trainees for each physician. On average, mean appointment times were 15.9 minutes shorter without trainees (IQR, 8.2-22.7 minutes). Three physicians (9%) had shorter mean appointment times when a trainee was present in the appointment, 1 (3%) had no change, and 29 (88%) had longer mean appointment times when a trainee was present.
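The per-physician differences plotted in the Figure, B, amount to a grouping pass over the appointment records. A minimal sketch, assuming each appointment arrives as a (physician_id, trainee_present, minutes) record (the record layout is an assumption, not the authors' data schema):

```python
from collections import defaultdict
from statistics import mean

def per_physician_trainee_effect(appointments):
    """appointments: iterable of (physician_id, trainee_present, minutes).

    Returns {physician_id: mean_with_trainee - mean_without_trainee};
    a positive value means that physician's appointments were longer
    when a trainee was present."""
    groups = defaultdict(lambda: {True: [], False: []})
    for physician, has_trainee, minutes in appointments:
        groups[physician][bool(has_trainee)].append(minutes)
    return {
        physician: mean(times[True]) - mean(times[False])
        for physician, times in groups.items()
        if times[True] and times[False]  # need both kinds of appointment
    }
```

Counting how many values are negative, zero, or positive reproduces the 3/1/29 split reported above.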

Linear Mixed Model Analysis

To account for the variability among physicians, we used a linear mixed model to quantitatively represent the association of the presence of trainees with appointment time (Table 4). According to this model, the presence of a fellow was associated with a lengthening of appointment time of 13.5 minutes (95% CI, 12.3-14.8 minutes; P < .001), and the presence of a resident was associated with a lengthening of appointment time of 17.0 minutes (95% CI, 15.6-18.5 minutes; P < .001). Across all billing levels, the presence of trainees in an appointment was associated with a significant lengthening of mean appointment time, ranging from 7.6 minutes for low billing level appointments with fellows (95% CI, 5.5-9.7 minutes; P < .001) to 17.2 minutes for high billing level appointments with residents (95% CI, 15.0-19.3 minutes; P < .001).

We also considered the association of the presence of trainees with every appointment in a half-day clinic session, even those without a trainee present. Although we found a difference in the mean times of appointments without trainees and the control appointments (Table 3), only medium billing level appointments without trainees were associated with lengthened appointment times in the linear mixed model (2.6 minutes; 95% CI, 1.1-4.0 minutes; P < .001). In contrast, appointments with low billing levels without trainees present were associated with a shortened appointment time (−1.9 minutes; 95% CI, −3.6 to −0.2 minutes; P = .03).

Discussion

In this study, we used EHR time stamps to conduct, to our knowledge, the first large-scale study quantifying the association of the presence of trainees with outpatient ophthalmology clinic efficiency. Three key study findings are (1) the presence of trainees during an appointment was associated with a significant lengthening of appointment time, (2) the association of trainees with appointment time varied among different physicians, and (3) the presence of trainees during a clinic session may be associated with longer appointment times even for appointments in which the trainee was not involved.

The first key finding is that presence of trainees was associated with longer appointment times. As demonstrated in Table 3, appointments with residents were 32% longer, and appointments with fellows were 30% longer compared with appointments without trainees (in clinic sessions without trainees). Perhaps unexpectedly, the association of fellows with appointment time was not considerably different from that of residents; however, this study was not designed to explain this difference.

One explanation for this increase in appointment time is that the presence of trainees adds a step to office workflow. We analyzed a reduced set of 5 study physicians in detail to examine how trainees were incorporated into clinical workflow (ie, as a replacement for clinical staff, such as technicians or nurses, vs as an additional step). Appointments involving a trainee and clinical staff (n = 2278) were significantly longer than those involving only clinical staff (n = 10 166) (98.8 vs 83.4 minutes; P < .001). In contrast, appointments involving only trainees without clinical staff were less common (n = 179) but were not significantly different in length (83.8 vs 83.4 minutes). Another possibility is that trainees’ appointments are longer because they see more complex patients to further their education. However, our analysis demonstrated that trainee involvement was associated with longer appointment times across all billing levels (Table 4). This finding suggests that lengthening of appointment times at the study institution is in part attributable to trainees adding steps to the clinic workflow, although another factor is that attending physicians take additional time to teach trainees. The Accreditation Council for Graduate Medical Education states that physician education must be “experiential” and “necessarily occurs within the context of the health care delivery system,”20 implying that the premise of graduate medical training is based on additional steps involving trainees and thereby longer patient appointment times.

Previous studies have examined the effect of trainees on outpatient clinics, but they are limited and smaller in scale. Two studies,11,13 published in 1988 and 1991, had conflicting results; both found that trainees were associated with increased consultation times but had differing results with respect to patient wait times. A more recent study12 used simulation models to study the effect of trainees on a private practice as the practice transformed into an academic medical center. The simulations predicted that using trainees would reduce the attending physician’s time with the patient, resulting in shorter overall appointment times, wait times, and clinic session lengths. However, no measured results were reported that supported that prediction. We believe that the large-scale retrospective data of the present study provide a more accurate picture of the association of trainees with length of appointment time, suggesting trainees are associated with decreased clinical efficiency.

The second key finding is that the association of trainees with appointment times varied among different physicians. Although most physicians’ appointments with trainees were longer than those without trainees, 3 physicians had shorter appointments with trainees (Figure, B). Additional regression analysis showed that these differences could not be explained by physician subspecialty or mean appointment length. This analysis suggests that how physicians use and teach trainees may change the association of trainees with appointment lengths. Many models are available for incorporating training in academic centers, and we would expect different models to differ in efficiency.21 For example, 1 study22 showed that discussion of patient cases before the start of clinics resulted in shorter patient wait times, appointment times, and clinic session lengths. Our study was not designed to assess effectiveness of teaching, and future studies examining differences among physicians and educational models may be warranted.

The third key finding is that presence of trainees during a clinic session may be associated with longer times for all appointments, even those in which the trainee was not involved. Appointments without trainees from sessions with trainees were a mean of 6.9 minutes longer than control appointments (Table 3). One explanation for this finding is the theory of constraints, by which systems are often limited by a single bottleneck that determines overall efficiency.23 As has been stated by others,22,24 the bottleneck in an academic clinic is the physician, who must examine or review every patient before the appointment is over. If trainees affect the speed of the physician’s examination because of educational activities or if trainees delay the time at which the physician can examine the patient, their presence will affect the system as a whole. Of note, linear mixed modeling of the association of trainees with all appointments in a session demonstrated that only medium billing level appointments without trainees in sessions with trainees were significantly longer (2.6 minutes) than control appointments, and low billing level appointments without trainees in sessions with trainees were significantly shorter than control appointments (−1.9 minutes). These differences might be explained by increased patient wait times owing to other long appointments with trainees, decreased patient wait times owing to prioritizing appointments without trainees, or decreased physician time in appointments without trainees to keep the clinic on time. Longer patient wait times have been shown to be associated with other aspects of patient care, such as patient satisfaction and likelihood of follow-up.25,26 Although future research is needed to thoroughly investigate these differences, our study suggests that most patients in a clinic session with trainees are affected regardless of whether they are seen by a trainee.

Limitations

Several study limitations should be noted. First, this study involved a single department at 1 academic institution. Ophthalmology is a high-volume outpatient specialty with surgical and medical components and shares fundamental characteristics with other academic outpatient clinics, such as presence of physicians, ancillary staff, and trainees and adherence to Accreditation Council for Graduate Medical Education requirements. Therefore, we believe that ophthalmology is a good study domain from which to draw broadly applicable conclusions. Second, this study did not account for potentially important factors, such as procedures scheduled in real time, differences among individual trainees, and the specific length of time trainees spent with attending faculty physicians. Similarly, this study did not examine potentially important measures, such as patient satisfaction and clinical outcomes. Accounting for these factors and measures was beyond the scope of this study, but these factors merit further study. Third, appointment time was calculated using checkout and check-in times, which may have been inaccurate owing to staff delays. However, we have no reason to believe that trainee appointments would have systematic differences in accuracy of EHR time stamps compared with nontrainee appointments. Fourth, this study used billing code as a proxy for appointment complexity, but this metric is not perfect. At our institution, most patients are scheduled directly with faculty physicians rather than trainees, but we do not know in advance which patients will be seen by trainees. Nonetheless, we believe that trainees may tend to see more complex patients on average (eg, residents saw fewer patients with low billing levels [Table 3]). Refining this proxy or investigating other measures of complexity for future studies may be desirable.

Conclusions

Overall, this study shows that the presence of trainees is associated with decreased clinical efficiency. This association has important implications for physicians at academic medical centers. Although academic medical centers receive graduate medical education payments from the Centers for Medicare & Medicaid Services, those payments are used to pay for residents rather than attending physicians, who are reimbursed at the same rate for outpatient examinations despite having additional educational responsibilities. Emerging value-based purchasing models,8,9 in which the cost of care is factored significantly into physician evaluation and reimbursement models, may create additional challenges for academic physicians based on these study findings. We hope that these study findings will stimulate future work involving the association of medical education with clinical workflow, approaches to maximize clinical efficiency while maintaining high-quality teaching, and policy making discussions about optimal ways to evaluate and reimburse physicians in academic medical centers.

Back to top
Article Information

Corresponding Author: Michael F. Chiang, MD, Department of Ophthalmology, Casey Eye Institute, Oregon Health & Science University, 3375 SW Terwilliger Blvd, Portland, OR 97239 (chiangm@ohsu.edu).

Accepted for Publication: September 23, 2017.

Published Online: November 9, 2017. doi:10.1001/jamaophthalmol.2017.4816

Author Contributions: Dr Chiang had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Hribar, Chiang.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: Goldstein, Hribar, Chiang.

Critical revision of the manuscript for important intellectual content: Hribar, Read-Brown, Chiang.

Statistical analysis: Goldstein, Hribar, Chiang.

Obtained funding: Hribar, Chiang.

Administrative, technical, or material support: Hribar, Read-Brown, Chiang.

Study supervision: Hribar, Chiang.

Conflict of Interest Disclosures: All authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Dr Chiang reports serving as an unpaid member of the scientific advisory board for Clarity Medical Systems and as a consultant for Novartis. No other disclosures were reported.

Funding/Support: This study was supported by grants T15 LM007088, K99 LM12238, and P30 EY010572 from the National Institutes of Health and by unrestricted departmental funding from Research to Prevent Blindness.

Role of the Funder/Sponsor: The funding sources had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

References
1.
Blumenthal D, Collins SR. Health care coverage under the Affordable Care Act—a progress report. N Engl J Med. 2014;371(3):275-281.
2.
Hu P, Reuben DB. Effects of managed care on the length of time that elderly patients spend with physicians during ambulatory visits: National Ambulatory Medical Care Survey. Med Care. 2002;40(7):606-613.
3.
Blumenthal D, Abrams MK. Tailoring complex care management for high-need, high-cost patients. JAMA. 2016;316(16):1657-1658.
4.
Chiang MF, Boland MV, Margolis JW, Lum F, Abramoff MD, Hildebrand PL; American Academy of Ophthalmology Medical Information Technology Committee. Adoption and perceptions of electronic health record systems by ophthalmologists: an American Academy of Ophthalmology survey. Ophthalmology. 2008;115(9):1591-1597.
5.
Boland MV, Chiang MF, Lim MC, et al; American Academy of Ophthalmology Medical Information Technology Committee. Adoption of electronic health records and preparations for demonstrating meaningful use: an American Academy of Ophthalmology survey. Ophthalmology. 2013;120(8):1702-1710.
6.
Babbott S, Manwell LB, Brown R, et al. Electronic medical records and physician stress in primary care: results from the MEMO Study. J Am Med Inform Assoc. 2014;21(e1):e100-e106.
7.
Chiang MF, Read-Brown S, Tu DC, et al. Evaluation of electronic health record implementation in ophthalmology at an academic medical center (an American Ophthalmological Society Thesis). Trans Am Ophthalmol Soc. 2013;111:70-92.
8.
Oberlander J, Laugesen MJ. Leap of faith—Medicare’s new physician payment system. N Engl J Med. 2015;373(13):1185-1187.
9.
Clough JD, McClellan M. Implementing MACRA: implications for physicians and for physician leadership. JAMA. 2016;315(22):2397-2398.
10.
Blumenthal D, Abrams M, Nuzum R. The Affordable Care Act at 5 years. N Engl J Med. 2015;372(25):2451-2458.
11.
Bestvater D, Dunn EV, Nelson W, Townsend C. The effects of learners on waiting times and patient satisfaction in an ambulatory teaching practice. Fam Med. 1988;20(1):39-42.
12.
Williams KA, Chambers CG, Dada M, Hough D, Aron R, Ulatowski JA. Using process analysis to assess the impact of medical education on the delivery of pain services: a natural experiment. Anesthesiology. 2012;116(4):931-939.
13.
Gamble JG, Lee R. Investigating whether education of residents in a group practice increases the length of the outpatient visit. Acad Med. 1991;66(8):492-493.
14.
Hribar MR, Read-Brown S, Reznick L, et al. Secondary use of EHR timestamp data: validation and application for workflow optimization. AMIA Annu Symp Proc. 2015;2015:1909-1917.
15.
Department of Health and Human Services, Centers for Medicare & Medicaid Services, Medicare Learning Network. Evaluation and management services. 2016. https://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNProducts/Downloads/eval-mgmt-serv-guide-ICN006764.pdf. Accessed February 16, 2017.
16.
West BT, Welch KB, Galecki AT. Linear Mixed Models: A Practical Guide Using Statistical Software. 2nd ed. Boca Raton, FL: CRC Press; 2014.
17.
R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2014. http://www.R-project.org. Accessed October 3, 2016.
18.
Bates D, Mächler M, Bolker B, Walker S. Fitting linear mixed-effects models using lme4. J Stat Softw. 2015;67(1):1-48.
19.
Hothorn T, Bretz F, Westfall P. Simultaneous inference in general parametric models. Biom J. 2008;50(3):346-363.
20.
Accreditation Council for Graduate Medical Education. ACGME common program requirements. 2017. http://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/CPRs_2017-07-01.pdf. Accessed October 4, 2017.
21.
Irby DM, Wilkerson L. Teaching when time is limited. BMJ. 2008;336(7640):384-387.
22.
Williams KA, Chambers CG, Dada M, et al. Applying JIT principles to resident education to reduce patient delays: a pilot study in an academic medical center pain clinic. Pain Med. 2015;16(2):312-318.
23.
Young T, Brailsford S, Connell C, Davies R, Harper P, Klein JH. Using industrial processes to improve patient care. BMJ. 2004;328(7432):162-164.
24.
Xi W, Dalal V. Impact of family medicine resident physicians on emergency department wait times and patients leaving without being seen. CJEM. 2015;17(5):475-483.
25.
Lee BW, Murakami Y, Duncan MT, et al. Patient-related and system-related barriers to glaucoma follow-up in a county hospital population. Invest Ophthalmol Vis Sci. 2013;54(10):6542-6548.
26.
McMullen M, Netland PA. Wait time as a driver of overall patient satisfaction in an ophthalmology clinic. Clin Ophthalmol. 2013;7:1655-1660.