Figure. Report Review Algorithm
NLP indicates natural language processing.
Table 1. Characteristics of the 290 141 Reports in Which the Natural Language Processing Algorithm Identified a Serious Injury or Death
Table 2. Reports in the MAUDE Database in Categories Other Than Death in Which the Patient Had Died in All Reports Reviewed
Table 3. Terms in the MAUDE Database Associated With the Highest Percentages of Missed Deaths
Original Investigation
Health Care Policy and Law
July 26, 2021

Reporting of Death in US Food and Drug Administration Medical Device Adverse Event Reports in Categories Other Than Death

Author Affiliations
  • 1Department of Medicine, University of California, San Francisco
  • 2School of Arts and Sciences, University of Pennsylvania, Philadelphia, Pennsylvania
  • 3Device Events, York, Pennsylvania
  • 4Division of Cardiology, Department of Medicine, University of California, San Francisco
  • 5San Francisco Veterans Affairs Health Care System, San Francisco, California
  • 6Institute for Health Policy Studies, University of California, San Francisco
  • 7Teachable Moments Editor, JAMA Internal Medicine
  • 8Editor, JAMA Internal Medicine
JAMA Intern Med. 2021;181(9):1217-1223. doi:10.1001/jamainternmed.2021.3942
Key Points

Question  When a medical device adverse event report in the US Food and Drug Administration Manufacturer and User Facility Device Experience database describes a patient death, how often is that report classified as injury, malfunction, other, or missing instead of death?

Findings  In this study, manual review of a random sample of 1000 adverse event reports that were selected using a natural language processing algorithm to identify possible patient deaths and that were classified as injury, malfunction, other, or missing (instead of death) found that 23% (95% CI, 20%-25%) described a patient death.

Meaning  Many adverse event reports for medical devices that involve a patient death are classified in categories other than death, and as the US Food and Drug Administration only routinely reviews all adverse events that are reported as patient deaths, improving the accuracy of adverse event reporting may enhance patient safety.

Abstract

Importance  In the US, most postmarket medical device safety data are obtained through adverse event reports that are submitted to the US Food and Drug Administration (FDA)’s Manufacturer and User Facility Device Experience (MAUDE) database. Adverse event reports are classified by the reporter as injury, malfunction, death, or other. If the device may have caused or contributed to a death, or if the cause of death is unknown, the FDA requires that the adverse event be reported as a death.

Objective  To determine the percentage of medical device adverse event reports submitted to the MAUDE database that were not classified as death even though the patient died.

Design, Setting, and Participants  In this study, a natural language processing algorithm was applied to the MAUDE database, followed by manual text review, to identify reports for any medical device, submitted from December 31, 1991, to April 30, 2020, that were classified in the injury, malfunction, other, or missing categories but included at least 1 term suggesting a patient death, such as patient died or patient expired.

Exposures  Manual review of a random sample of 1000 adverse event reports not classified as death, and of selected reports containing any of 62 death-associated terms that were likewise not classified as death.

Main Outcomes and Measures  Percentage of adverse event reports in which the patient was said to have died in the narrative section of the report but the reporter classified the report in a category other than death.

Results  The terms in the natural language processing algorithm identified 290 141 reports in which a serious injury or death was reported. Of these, 151 145 (52.1%) were classified by the reporter as death and 47.9% were classified as malfunction, injury, other, or missing. For the overall sample, the percentage of reports with deaths that were not classified as deaths was 23% (95% CI, 20%-25%), suggesting that approximately 31 552 reports in our sample had deaths that were classified in other categories. The overall percentage of missed deaths, defined as the percentage of deaths that were classified in other categories, was 17% (95% CI, 16%-19%).

Conclusions and Relevance  The findings of this study suggest that many medical device adverse event reports in the FDA’s MAUDE database that involve a patient death are classified in categories other than death. As the FDA only routinely reviews all adverse events that are reported as patient deaths, improving the accuracy of adverse event reporting may enhance patient safety.

Introduction

Before marketing, the clinical trial evidence supporting medical device approval is often limited; postmarket surveillance provides additional evidence on safety and effectiveness.1-3 For the highest-risk (class III) medical devices, the US Food and Drug Administration (FDA) requires premarket approval and the submission of clinical data for the agency to review. However, premarket approval is often based on a single, small, nonrandomized study with a short period of follow-up.1 Examples of class III medical devices are ventricular assist bypass devices, deep brain stimulators, and coronary drug-eluting stents. Class II (usually moderate-risk) devices typically receive clearance based on substantial equivalence to an existing device; the submission of clinical data is not required.4 Thus, postmarket surveillance allows for longer-term follow-up of a larger and more diverse group of patients than those enrolled in premarket trials.2,5

Passive surveillance is the primary source of safety signals for medical device safety communications.6 Manufacturers, distributors, and health care facilities are required to submit adverse event reports to the FDA’s Manufacturer and User Facility Device Experience (MAUDE) database.6-9 Others (eg, physicians, hospitals, and patients) can submit reports, but such voluntary reports are submitted infrequently.10

The individuals or organizations reporting adverse events to MAUDE choose whether to report them as malfunction, injury, death, or other. The FDA requires that reports be classified as death if the device “may have caused or contributed to a death,” or if the cause of death is unknown.11 If a report in which a patient dies is not classified as death (eg, is reported instead as a device malfunction), the identification of safety signals may be delayed. Although the FDA must review all reports classified as death, it does not routinely review all reports that are classified as injury or malfunction.10

A prior report from our research group12 showed high rates of miscategorization of deaths as malfunctions or injuries for 2 high-risk cardiac devices, the Sapien 3 aortic valve (Edwards Lifesciences) and the MitraClip (Abbott Vascular), a device used for transcatheter mitral valve repair. In this study, we expand on our earlier article by examining adverse event reports for all medical devices in MAUDE to determine how often a death was reported in a category other than death.

Methods
Data Source

We reviewed adverse event reports in MAUDE from its inception on December 31, 1991, to April 30, 2020. Given that MAUDE is a publicly accessible database that stores only deidentified information, this study was exempt from institutional review board approval. We used proprietary software (Device Events) that exports MAUDE data into an interface that can be used to navigate the FDA database.13 The software uses a natural language processing algorithm to help identify reports in which there was a death. The natural language processing algorithm searches the free text in reports for terms that are likely to be associated with death, such as patient died, patient expired, could not be resuscitated, and time of death. The algorithm allows for plural and past-tense versions of words (eg, death includes deaths and die includes died). The algorithm also excludes reports containing selected phrases that are unlikely to be associated with death (eg, patient passed out is excluded, whereas patient passed is not).
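The exact rules of the proprietary Device Events algorithm are not public; the following is a minimal Python sketch of the matching logic described above, assuming a small set of hypothetical inclusion terms (the study used 70) and exclusion phrases.

```python
import re

# Hypothetical inclusion terms; the study's algorithm used 70 such terms.
DEATH_TERMS = [
    "patient died",
    "patient expired",
    "could not be resuscitated",
    "time of death",
]
# Hypothetical exclusion phrases that resemble death terms but are unlikely to indicate a death.
EXCLUSION_PHRASES = ["patient passed out"]

def flags_possible_death(event_text: str) -> bool:
    """Return True if the free-text narrative contains a death-associated term,
    tolerating simple plural/past-tense variants, after removing exclusion phrases."""
    text = event_text.lower()
    for phrase in EXCLUSION_PHRASES:
        text = text.replace(phrase, " ")
    for term in DEATH_TERMS:
        # Allow a few trailing characters per word so, eg, "death" also matches "deaths".
        pattern = r"\b" + r"\s+".join(re.escape(word) + r"\w{0,3}" for word in term.split())
        if re.search(pattern, text):
            return True
    return False

print(flags_possible_death("The patient expired shortly after the procedure."))  # True
print(flags_possible_death("The patient passed out but recovered fully."))       # False
```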

Contents of MAUDE Adverse Event Reports

The structured fields in a MAUDE report include report ID, date of event, date of manufacturer receipt, date of report, date the FDA received the report, report type, report source, reporter occupation, company name, product code, device name, device brand name, and device generic name. Reporters must also complete an unstructured free-text box to provide a more detailed event description. The report type indicates whether the adverse event reporter classified the patient outcome as death, malfunction, injury, or other, or left the category blank (missing). The report source describes whether a report was submitted by a user facility, manufacturer, voluntary reporter, or distributor. The Device Events software added a field for the device class (class I-III) associated with the report by matching the device product code with the appropriate class of the device.
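To make the report structure concrete, here is a minimal illustrative record in Python; the attribute names are hypothetical paraphrases of the fields listed above, not the actual MAUDE column names.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MaudeReport:
    # Hypothetical attribute names paraphrasing the fields described above.
    report_id: str
    date_of_event: Optional[str]
    date_fda_received: Optional[str]
    report_type: Optional[str]        # "Death", "Injury", "Malfunction", "Other", or None (missing)
    report_source: Optional[str]      # user facility, manufacturer, voluntary reporter, or distributor
    reporter_occupation: Optional[str]
    product_code: Optional[str]
    device_brand_name: Optional[str]
    device_class: Optional[str]       # added by the Device Events software (class I-III)
    event_description: str = ""       # unstructured free-text narrative
```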

Search Query

We included MAUDE reports with at least 1 of 70 terms that were associated with death. As our interest was in the reporting of deaths in categories other than death, we then limited the data set to reports in which the outcome was classified as injury, malfunction, other, or missing and manually reviewed selected reports.
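A minimal sketch of this two-step query, assuming reports are represented as dictionaries with hypothetical report_type and event_description keys, and that a term-matching predicate such as the flags_possible_death sketch above is supplied.

```python
NON_DEATH_CATEGORIES = {"Injury", "Malfunction", "Other", None}  # None stands in for a missing category

def select_candidate_reports(reports, contains_death_term):
    """Keep reports whose narrative matches at least 1 death-associated term
    but whose reporter-chosen category is something other than death."""
    return [
        r for r in reports
        if contains_death_term(r.get("event_description", ""))
        and r.get("report_type") in NON_DEATH_CATEGORIES
    ]

demo_reports = [
    {"report_type": "Malfunction", "event_description": "Pump stopped; the patient expired."},
    {"report_type": "Death", "event_description": "The patient died after implantation."},
]
# A trivial predicate for the demo; the study used the full 70-term algorithm.
print(select_candidate_reports(demo_reports, lambda text: "expired" in text or "died" in text))
```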

Primary Outcome

The primary outcome was the percentage of reports in which the patient’s ultimate outcome was death, but the report had not been classified as death by the reporter. One of 2 authors (C.L. and E.M.K.) read the report descriptions to determine if a death occurred.

Statistical Analysis

We reviewed the event descriptions for a simple random sample of up to 50 injury, malfunction, other, and missing reports for each of the 70 terms for descriptions of a patient death (Figure). We selected all random samples by assigning a random number to each report in Microsoft Excel and selecting the reports with the 50 lowest assigned numbers. If there were fewer than 50 total reports for a given term, we reviewed all reports.
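The sampling step (random numbers assigned in Microsoft Excel, lowest 50 selected) can be mimicked in a few lines of Python; the seed argument is an addition for reproducibility and not part of the study's workflow.

```python
import random

def sample_for_review(report_ids, k=50, seed=None):
    """Assign each report a random number and return the k reports with the
    lowest values; if there are fewer than k reports, return them all."""
    rng = random.Random(seed)
    scored = sorted((rng.random(), rid) for rid in report_ids)
    return [rid for _, rid in scored[:k]]

# Hypothetical report IDs for one search term.
term_reports = [f"MW{n:07d}" for n in range(1, 301)]
print(sample_for_review(term_reports, k=50, seed=1)[:5])
```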

For each term, we calculated the percentage of reviewed reports (ie, those classified as injury, malfunction, other, or missing) in which a death was identified in the narrative. If none of the reviewed reports for a term described a patient death, the search term was removed from the analysis; 8 terms were removed for this reason, leaving 62 terms.

For each term, we estimated the percentage of reports in the full data set that had been classified as injury/malfunction/other/missing despite a patient death, along with its confidence interval, based on the percentage of deaths identified in our manual review. We used the epitools package in the R statistical software (R Foundation) to calculate exact binomial proportion confidence intervals. We then determined the number of reports (in multiples of 50) that would need to be reviewed to ensure a confidence interval width of less than 5% for the estimate of the true percentage of these death reports. We calculated 95% confidence intervals, with statistical significance defined as P < .05. This calculation assumed that the percentage of reports in which a patient death occurred but was classified as injury/malfunction/other/missing instead of death, estimated from a random sample of 50 reports for a given term, would generalize to the entire population of reports for that term. For example, if all 50 of 50 manually reviewed reports (100%) for a given term were determined to have a patient outcome of death, the estimated percentage of such death reports was considered to be 100%, and approximately 100 reports would have to be reviewed to ensure a confidence interval at least as narrow as 95% to 100% for the true percentage. If the number of reports that needed to be reviewed exceeded the total number of reports for a given term, all reports were reviewed and an exact percentage was calculated; this was the case for 44 of 62 terms (71.0%). For each of the remaining 18 terms, for which we calculated confidence intervals, we estimated the total number of death reports that were classified in other categories by multiplying the estimated percentage of misclassified deaths in the manually reviewed random sample by the total number of reports classified as injury, malfunction, other, or missing.
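The study calculated exact binomial confidence intervals with the epitools package in R; the sketch below uses the equivalent Clopper-Pearson method from Python's statsmodels to illustrate the interval and the sample-size rule described above. The numbers in the example mirror the 50-of-50 case from the text.

```python
from statsmodels.stats.proportion import proportion_confint

def exact_ci(deaths_found, n_reviewed, alpha=0.05):
    """Exact (Clopper-Pearson) binomial CI for the proportion of reviewed
    reports that describe a death."""
    return proportion_confint(deaths_found, n_reviewed, alpha=alpha, method="beta")

def reviews_needed(p_hat, max_width=0.05, step=50, cap=5000):
    """Smallest sample size, in multiples of 50, at which the exact CI around
    the observed proportion p_hat is narrower than max_width."""
    n = step
    while n <= cap:
        low, high = proportion_confint(round(p_hat * n), n, method="beta")
        if high - low < max_width:
            return n
        n += step
    return cap

print(exact_ci(50, 50))      # CI when all 50 of 50 reviewed reports described a death
print(reviews_needed(1.0))   # ~100 reports needed for a CI at least as narrow as 95%-100%
```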

To estimate the total number of reports of deaths that were classified in categories other than death, we also reviewed a random sample of 1000 reports from the full data set of 138 996 reports in categories other than death. We determined the percentage of these 1000 manually reviewed reports in which the patient had died and applied that percentage to the overall sample of 138 996 reports. We defined the percentage of missed deaths as the number of reports in which the patient had died but the outcome was classified as injury/malfunction/other/missing (ie, not as death), divided by the total number of reports in which the patient died.
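A back-of-the-envelope check of this extrapolation and of the missed-death definition, using the counts reported in the Results; the reconstructed 31 552 figure implies an unrounded sample proportion of roughly 22.7%, since 23% of 138 996 would be slightly higher.

```python
non_death_reports = 138_996       # reports classified as injury/malfunction/other/missing
classified_deaths = 151_145       # reports classified as death
estimated_misclassified = 31_552  # deaths found in other categories, per the 1000-report sample

# Missed deaths = misclassified deaths / (classified deaths + misclassified deaths)
missed_death_pct = estimated_misclassified / (classified_deaths + estimated_misclassified)
print(f"Implied sample proportion: {estimated_misclassified / non_death_reports:.1%}")  # ~22.7%
print(f"Missed deaths: {missed_death_pct:.0%}")                                         # ~17%
```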

Results
Sample Characteristics

The terms in the natural language processing algorithm identified 290 141 reports in which a serious injury or death was reported. Of these, 151 145 (52.1%) were classified by the reporter as death (Figure and Table 1). Most reports (162 147 [55.9%]) were submitted for class III devices; 116 358 (40.1%) were for class II devices. Of the reports, 50.1% were received by the FDA between January 2015 and April 2020. Nearly all reports were submitted by manufacturers (95.9%). More than 1 term associated with death (eg, expired, could not be resuscitated, and pt passed) appeared in 52.8% of reports. There was no specified reporter occupation for 139 436 reports (48.1%); when specified, the most frequent reporter occupation was physician (58 787 [20.3%]), which likely refers to either physicians reporting adverse events to manufacturers or physicians employed by manufacturers. The most common product codes among all adverse event reports were ventricular (assist) bypass, with 38 708 reports; dialysate concentrate for hemodialysis (liquid or powder), with 25 261 reports; and transcervical contraceptive tubal occlusion device, with 14 387 reports.

Reports in Which the Patient Died but Which Were Not Classified as Death

For the overall sample, the percentage of reports with deaths that were not classified as deaths was 23% (95% CI, 20%-25%), suggesting that approximately 31 552 reports in our sample had deaths that were classified in other categories. Of the 70 natural language processing terms, 62 (88.6%) were associated with deaths that had been classified in categories other than death. We reviewed a total of 7951 reports for all terms. The reports for 44 (71.0%) of the terms were reviewed in full because of small sample sizes. For the 18 terms with sample sizes that were large enough to calculate confidence intervals, the term death(s) was associated with the highest total number of reports in which the patient died but the report had been classified as injury/malfunction/other/missing (instead of death). For the term death(s), the percentage of such reports not classified as deaths was 12% (95% CI, 9%-14%) based on our review of a random sample of 600 reports containing the term death(s), suggesting that approximately 13 156 reports with this term had deaths that were classified in other categories. Of the 62 terms, 17 (27.4%) had an estimated percentage of 100%, which means that every time that term was used, the patient had died, even though the reporter had not classified the report as death (Table 2).

Missed Deaths

In the random sample of 1000 reports that were not classified as death, the overall percentage of missed deaths, defined as the percentage of all deaths that were classified in other categories, was 17% (95% CI, 16%-19%). The term with the highest percentage of missed deaths among the 44 terms reviewed in full was unreported cause of death, with a total of 5 reports and a 60% missed death rate (Table 3). The term with the highest percentage of missed deaths among the 18 terms reviewed partially was patients died, which had a missed death percentage of 70% (95% CI, 69%-70%).

Discussion

We found that a large percentage of adverse event reports in the FDA’s MAUDE database that involved a patient death were classified in categories other than death by the reporter. The classification chosen by the reporter is vital, as the FDA must review all adverse events reported as deaths, which is not the case for the other reporting categories.10 Per FDA instructions, reports of a device malfunction or injury that include a patient death should be reported as death if the death may be associated with the device malfunction or if the cause of death is unknown.11 However, it is possible that manufacturers or other reporters do not classify the patient outcome as death if they do not believe that the death was definitively caused by the device.

The MAUDE database allows for the identification of safety signals for medical devices that are repeatedly associated with serious adverse events. Accurate reporting of patient deaths allows the FDA to pursue investigations to determine if there is cause for concern, notify clinicians, and, when appropriate, take regulatory actions to protect patient safety.14 Thus, it is essential that all deaths be classified and reported appropriately as deaths. There is a disclaimer on the MAUDE database that states, “Submission of a medical device report and the FDA's release of that information is not necessarily an admission that a product, user facility, importer, distributor, manufacturer, or medical personnel caused or contributed to the event.”7

Our findings suggest that the FDA could improve the identification of fatalities among patients with medical devices by using search terms, such as decedent and pt expired, within reports to help identify patterns and clusters of reports that are associated with higher rates of patient death. Using a natural language processing algorithm, as in our study, could help streamline the full review of reports that are more likely to be associated with patient death at a time when the numbers of medical devices and adverse event reports are substantially increasing.14,15 Future studies could evaluate other terms or consider using a machine learning algorithm (with the terms in our study or others) to allow for more rapid evaluation of reports to identify patient deaths, such as has been explored for drug adverse event reporting.16

Currently, manufacturers submit, and thus choose the reporting category for, more than 95% of adverse event reports. There is an inherent conflict with manufacturers being the primary reporters, as it may not be in their interest to facilitate identification of serious problems with their own devices in a timely manner. There have been multiple instances of delays by manufacturers in reporting serious malfunctions and deaths that were associated with medical devices, as well as complete failures to report.17-19 As a result, it seems likely that more patients have been unknowingly treated with devices that turned out to be dangerous than would have been the case if reporting had not been delayed. Some examples of delayed reporting of adverse events associated with implantable cardioverter-defibrillator leads include delays by St Jude Medical17 and Medtronic.18 In another reporting failure, between 2002 and 2013, 32 000 women reported adverse events associated with Essure (Conceptus), a medical device used for permanent birth control, to the manufacturer, yet the FDA only received 1023 events (3.2%) during that period from the manufacturer.13,19 Bard received multiple warning letters following inspections in 2014 and 2015 that cited several failures, including a failure to comply with the medical device reporting regulations by misreporting patient deaths that were associated with vena caval filters as malfunctions.20 For these reasons, physicians, hospitals, and patients should submit reports directly to the FDA instead of or in addition to reporting through the manufacturer. The FDA could facilitate such reporting by making available a free, user-friendly mobile application that patients and physicians could use to report adverse events directly to the agency.

Limitations

Our study has limitations. We performed a manual review of a subset of adverse event reports. Specifically, we reviewed all of the reports for 44 of the 62 terms that were associated with death reports in categories other than death and used random samples with narrow confidence intervals (<5% confidence interval width) to create estimates of misclassified reports and missed deaths that were associated with the other 18 terms. As we only included reports that contained at least 1 term that was likely to be associated with a death and not all reports in the MAUDE database, our findings likely underestimate the actual number of deaths that were not classified as death, even when the patient died.

Conclusions

Many deaths in persons with medical devices are reported to the FDA as adverse events in categories other than death. Our findings suggest that the FDA’s adverse event reporting form should be revised to make it even clearer that if the cause of death in a patient with a medical device is unknown, as it often is,21 it still should be reported in the category of death. As the FDA only routinely reviews all adverse events that are reported as patient deaths, improving the accuracy of adverse event reporting may enhance patient safety.

Article Information

Accepted for Publication: June 4, 2021.

Published Online: July 26, 2021. doi:10.1001/jamainternmed.2021.3942

Corresponding Author: Christina Lalani, MD, Department of Medicine, University of California, San Francisco, 505 Parnassus Ave, M1480, San Francisco, CA 94143 (christina.lalani@ucsf.edu).

Author Contributions: Dr Lalani had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: Lalani, Redberg.

Acquisition, analysis, or interpretation of data: All authors.

Drafting of the manuscript: Lalani, Kunwar.

Critical revision of the manuscript for important intellectual content: Lalani, Kinard, Dhruva, Redberg.

Statistical analysis: Lalani, Kinard.

Obtained funding: Redberg.

Supervision: Redberg.

Conflict of Interest Disclosures: Dr Kinard reported grants from Arnold Ventures and being the chief executive officer of Device Events during the conduct of the study. Dr Dhruva reported grants from Arnold Ventures during the conduct of the study and research funding from the National Heart, Lung, and Blood Institute (NHLBI) of the National Institutes of Health, Medical Device Innovation Consortium as part of the National Evaluation System for Health Technology Coordinating Center, US Food and Drug Administration, and Greenwall Foundation. Dr Redberg reported grants from Arnold Ventures during the conduct of the study and grants from the Greenwall Foundation and NHLBI outside the submitted work. No other disclosures were reported.

Funding/Support: This research was supported by a grant from Arnold Ventures.

Role of the Funder/Sponsor: The funders had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.

Data Sharing Statement: The full list of search terms used in the study is available to bona fide researchers who wish to validate the findings on reasonable request to Device Events. The critical events thesaurus is protected as a trade secret.

Additional Contributions: We thank John Neuhaus, PhD, University of California, San Francisco, for his assistance with our statistical analysis, as well as William Vodra, JD, a retired partner from Arnold and Porter, and Michael Carome, MD, Public Citizen, for their feedback on an earlier version of the manuscript. These individuals agreed to be acknowledged; none were compensated.

Additional Information: Dr Neuhaus is a Statistical Editor, Dr Dhruva is a Teachable Moments Editor, and Dr Redberg is Editor of JAMA Internal Medicine. They were not involved in the editorial review of the manuscript or the decision to accept it for publication.

References
1.
Dhruva  SS, Bero  LA, Redberg  RF.  Strength of study evidence examined by the FDA in premarket approval of cardiovascular devices.   JAMA. 2009;302(24):2679-2685. doi:10.1001/jama.2009.1899PubMedGoogle ScholarCrossref
2.
Brown  SL, Bright  RA, Tavris  DR.  Medical device epidemiology and surveillance: patient safety is the bottom line.   Expert Rev Med Devices. 2004;1(1):1-2. doi:10.1586/17434440.1.1.1PubMedGoogle ScholarCrossref
3.
Zheng  SY, Dhruva  SS, Redberg  RF.  Characteristics of clinical studies used for US Food and Drug Administration approval of high-risk medical device supplements.   JAMA. 2017;318(7):619-625. doi:10.1001/jama.2017.9414PubMedGoogle ScholarCrossref
4.
US Food and Drug Administration. Classify your medical device. Accessed April 16, 2021. https://www.fda.gov/medical-devices/overview-device-regulation/classify-your-medical-device
5.
Rajan  PV, Kramer  DB, Kesselheim  AS.  Medical device postapproval safety monitoring: where does the United States stand?   Circ Cardiovasc Qual Outcomes. 2015;8(1):124-131. doi:10.1161/CIRCOUTCOMES.114.001460PubMedGoogle ScholarCrossref
6.
Tau  N, Shepshelovich  D.  Assessment of data sources that support US Food and Drug Administration medical devices safety communications.   JAMA Intern Med. 2020;180(11):1420-1426. doi:10.1001/jamainternmed.2020.3514PubMedGoogle ScholarCrossref
7.
US Food and Drug Administration. MAUDE—Manufacturer and User Facility Device Experience. Accessed April 16, 2021. https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfmaude/search.cfm
8.
US Food and Drug Administration. CFR—Code of Federal Regulations Title 21. Accessed April 16, 2021. https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfCFR/CFRSearch.cfm?CFRPart=803
9.
US Food and Drug Administration. Mandatory reporting requirements: manufacturers, importers and device user facilities. Accessed April 16, 2021. https://www.fda.gov/medical-devices/postmarket-requirements-devices/mandatory-reporting-requirements-manufacturers-importers-and-device-user-facilities
10.
US Department of Health and Human Services. Adverse event reporting for medical devices. Accessed April 16, 2021. https://oig.hhs.gov/oei/reports/oei-01-08-00110.pdf
11.
US Food and Drug Administration. General Instructions—for Form FDA 3500A MedWatch (for Mandatory reporting). Accessed April 16, 2021. https://www.fda.gov/media/82655/download
12.
Meier  L, Wang  EY, Tomes  M, Redberg  RF.  Miscategorization of deaths in the US Food and Drug Administration adverse events database.   JAMA Intern Med. 2020;180(1):147-148. doi:10.1001/jamainternmed.2019.4030PubMedGoogle ScholarCrossref
13.
Device Events. Improve patient outcomes & reduce risk through better information. Accessed April 16, 2021. https://www.deviceevents.com
14.
Tomes  M.  Identification and market removal of risky medical devices.   JAMA Intern Med. 2020;180(11):1426-1427. doi:10.1001/jamainternmed.2020.3512PubMedGoogle ScholarCrossref
15.
Zhan  C, Baine  WB, Sedrakyan  A, Steiner  C.  Cardiac device implantation in the United States from 1997 through 2004: a population-based analysis.   J Gen Intern Med. 2008;23(suppl 1):13-19. doi:10.1007/s11606-007-0392-0PubMedGoogle ScholarCrossref
16.
Han  L, Ball  R, Pamer  CA, Altman  RB, Proestel  S.  Development of an automated assessment tool for MedWatch reports in the FDA adverse event reporting system.   J Am Med Inform Assoc. 2017;24(5):913-920. doi:10.1093/jamia/ocx022PubMedGoogle ScholarCrossref
17.
Meier  B, Thomas  K. Troubling flaws in a heart device shake implant makers. Accessed May 11, 2021. https://www.nytimes.com/2012/04/07/health/flaws-in-st-jude-heart-defibrillator-shake-the-industry.html
18.
Meier  B. Tests of heart devices to get review. Accessed May 11, 2021. https://www.heraldtribune.com/article/20071018/News/605237072
19.
Dick  K. The Bleeding Edge. Accessed May 24, 2021. https://www.netflix.com/title/80170862
21.
Tseng  ZH, Hayward  RM, Clark  NM,  et al.  Sudden death in patients with cardiac implantable electronic devices.   JAMA Intern Med. 2015;175(8):1342-1350. doi:10.1001/jamainternmed.2015.2641PubMedGoogle ScholarCrossref