Figure 1. Institutional Review Board Review Process and Overall Review Times in Calendar Days

Rectangles indicate a process in which a particular person (or kind of person) accomplishes a task or series of tasks before the protocol advances to the next step in the map; diamonds, a decision in which the actor must decide which path the protocol should follow; and loops, a recursive series of processes within the overarching stream of the process. Loops are represented by multiple tiles superimposed and offset up and to the right; the topmost box identifies the actor who receives the protocol from the previous step and the actor who advances the protocol out of the loop to the subsequent step. The colors of the boxes identify the order of communication. Stacked rectangles indicate parallel processes that advance at the same time. Personnel are identified by the color key in the upper left-hand corner: principal investigator (PI), research and development officer or committee (R&D), institutional review board (IRB) staff, IRB chair, IRB members, information security officer (ISO), privacy officer (PO), IRB committee, and associate chief of staff for research (ACOSR). Times are reported as median (range) days.

Figure 2. Review Times of Research Protocols Submitted to the Institutional Review Board

Box and whisker plots depict the median and interquartile range of the cumulative number of days a protocol was under review or revision by each stakeholder. Whiskers extend up to 1.5 times the interquartile range. The total review time is the number of calendar days from the initial submission to the date of the research and development approval memo. Institutional review board review time is the number of calendar days from the initial submission to the date of the IRB approval memo. Median values are printed below each whisker plot. Outliers are identified by protocol. Numbers 1 through 6 identify unique protocols in each review type and correspond to the protocols in Table 4.

Table 1. Time Required to Review Newly Submitted Research Protocols
Table 2. Differences in Review Times Across Types
Table 3. Time of IRB Review Attributed to Each Stakeholder
Table 4. Reasons for Delay in Outlier Review Times
Original Investigation
Association of VA Surgeons
February 2015

Time Required for Institutional Review Board Review at One Veterans Affairs Medical Center

Author Affiliations
  • 1Center for Health Equity Research and Promotion, Veterans Affairs Pittsburgh Healthcare System, Pittsburgh, Pennsylvania
  • 2Department of Surgery, University of Pittsburgh, Pittsburgh, Pennsylvania
  • 3Department of Biostatistics, University of Pittsburgh, Pittsburgh, Pennsylvania
  • 4Department of General Internal Medicine, University of Pittsburgh, Pittsburgh, Pennsylvania
JAMA Surg. 2015;150(2):103-109. doi:10.1001/jamasurg.2014.956
Abstract

Importance  Despite growing concern that institutional review boards (IRBs) impose burdensome delays on research, little is known about the time required for IRB review across different types of research.

Objective  To measure the overall and incremental process times for IRB review as a process of quality improvement.

Design, Setting, and Participants  After developing a detailed process flowchart of the IRB review process, 2 analysts abstracted temporal data from the records pertaining to all 103 protocols newly submitted to the IRB at a large urban Veterans Affairs medical center from June 1, 2009, through May 31, 2011. Disagreements were reviewed with the principal investigator to reach consensus. We then compared the review times across review types using analysis of variance and post hoc Scheffé tests after achieving normally distributed data through logarithmic transformation.

Main Outcomes and Measures  Calendar days from initial submission to final approval of research protocols.

Results  Initial IRB review took 2 to 4 months, with expedited and exempt reviews requiring less time (median [range], 85 [23-631] and 82 [16-437] days, respectively) than full board reviews (median [range], 131 [64-296] days; P = .008). The median time required for credentialing of investigators was 1 day (range, 0-74 days), and review by the research and development committee took a median of 15 days (range, 0-184 days). There were no significant differences in credentialing or research and development times across review types (exempt, expedited, or full board). Of the extreme delays in IRB review, 80.0% were due to investigators’ slow responses to requested changes. There were no systematic delays attributable to the information security officer, privacy officer, or IRB chair.

Conclusions and Relevance  Measuring and analyzing review times is a critical first step in establishing a culture and process of continuous quality improvement among IRBs that govern research programs. The review times observed at this IRB are substantially longer than the 60-day target recommended by expert panels. The method described here could be applied to other IRBs to begin identifying and improving inefficiencies.

Created in 1974 by the Code of Federal Regulations, the existing network of more than 4000 institutional review boards (IRBs) is deliberately decentralized, yielding a wide variety of procedures and practices. For example, when the same multisite protocol is presented to multiple IRBs, data demonstrate wide variability in the time required for approval,1-7 number and nature of changes requested by the IRB,7 thoroughness and cost of the review,8-10 and quality of the IRB determinations.1,11,12 Despite this variety, little is known about the time required for IRB review.

A recent systematic review identified 25 empirical studies reporting some aspect of time required for IRB approval.13 The diversity of methods prevented formal meta-analysis, but the review summarizes data from 631 IRBs and 336 protocols in which IRB review times (either as median or mean) ranged from 13 to 116 days, with an average range of 220 days between the shortest and longest review reported in any given study. However, these data are limited in 3 important ways. First, most of the studies (18 of 25) report the implementation of a single multicenter research protocol across multiple IRBs, and it remains unclear if the reported times represent typical IRB performance. Second, although the 7 remaining studies report review times of a single IRB evaluating multiple protocols, these studies only report data regarding clinical trials for cancer treatment. It is therefore unclear how these data relate to the review times of other kinds of research protocols. Third, review times are limited to the total time required to secure IRB approval and, as such, it is not possible to identify the specific location of any delays in the review process. Institutional review board approval is a complex process involving multiple stakeholders, and without evidence about the specific source of delays, it is difficult to design initiatives for improving IRB efficiency.

To address these gaps in the literature, we designed this study to examine the review times for a single IRB across its entire portfolio of research. In addition, we sought to measure potential sources of delay in the review process.

Methods

We examined the independently operated IRB of a large urban Veterans Affairs (VA) medical center. This IRB oversees a diverse portfolio of 250 active studies, including clinical trials, patient and clinician education, secondary database analyses, health services research, and device development. The IRB reviews approximately 50 new protocols each year and is among the top 15 VA medical center IRBs in terms of workload.

To determine the time required for IRB review, we examined the IRB records for all protocols newly submitted to the IRB from June 1, 2009, through May 31, 2011. Although these records were not intended to track the time of review, they do document many key steps in the review process. For all protocols, we first identified the dates of initial submission and final approval. We then abstracted the dates for as many of the incremental steps in the review process as possible. We also recorded the type of review (exempt, expedited, or full board) along with characteristics of the protocol (industry sponsorship, observational methods, risk level, number of subjects, and single-site vs multisite recruitment).

The study’s principal investigator (PI) (D.E.H.) trained 2 analysts to abstract as much data as possible. Training included approximately 20 hours of orientation to the IRB review process, the IRB records, and the goals of the coding effort. Guided by a detailed flowchart of the IRB review process (Figure 1), the analysts examined the IRB records to determine the dates for each recorded step in the IRB review process. Each protocol was coded independently by each analyst and then reviewed for agreement between the analysts. Disagreements were reviewed with the PI to reach consensus. After coding the first 20% of protocols, the analysts worked with the PI to develop a codebook to guide and standardize the remaining coding effort. The analysts rechecked the coding of the first 20% of protocols against the codebook and, after confirming consensus, proceeded to independently code the next 20% of protocols. We then calculated κ statistics to assess agreement between analysts, with the plan that analysts would single-code the remaining protocols once agreement between them exceeded a κ value of 0.75. If κ values were less than 0.75, disagreements were again reviewed by the analysts and the PI to reach consensus and, if necessary, the codebook was modified to accommodate any newly discovered difficulty or ambiguity in the coding process. This process was repeated for each successive 20% of the protocols to determine whether coding could proceed independently.
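
To make the agreement check concrete, the following minimal Python sketch computes a κ statistic for one quintile of double-coded protocols. The step labels, values, and the scikit-learn dependency are illustrative assumptions, not part of the study's records.

    # Minimal sketch: agreement between 2 analysts on one quintile of
    # double-coded protocols. Labels and values are hypothetical.
    from sklearn.metrics import cohen_kappa_score

    analyst_1 = ["submit", "credential", "irb_memo", "rd_memo", "submit"]
    analyst_2 = ["submit", "credential", "irb_memo", "revision", "submit"]

    kappa = cohen_kappa_score(analyst_1, analyst_2)
    print(f"kappa = {kappa:.2f}")

    # Per the study's rule: single-code only once kappa exceeds 0.75;
    # otherwise reconcile disagreements with the PI and revise the codebook.
    if kappa > 0.75:
        print("Proceed to independent single-coding.")
    else:
        print("Reconcile disagreements and update the codebook.")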

Once coding was complete, the PI worked with the statisticians (B.H.H. and R.A.S.) to structure the data and clean them of errors. The final database was then analyzed to yield descriptive statistics about the overall time of IRB review as well as the component steps along the process. Comparisons were made using analysis of variance and post hoc Scheffé tests after achieving normally distributed data through logarithmic transformation. Finally, the PI reviewed the detailed records for each protocol with outlier review times to ascertain the cause of the delay. (Outliers were defined as review times exceeding the third quartile of the data by more than 1.5 times the interquartile range.)
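
This comparison can be illustrated with a minimal Python sketch. All review-time values below are hypothetical, SciPy and NumPy are assumed, and the Scheffé criterion is implemented directly from its standard definition rather than from any particular statistical package.

    # Minimal sketch: one-way ANOVA on log-transformed review times with
    # post hoc Scheffe pairwise tests. Review-time values are hypothetical.
    import numpy as np
    from scipy import stats

    groups = {
        "exempt":    np.array([82.0, 45.0, 120.0, 30.0, 200.0]),
        "expedited": np.array([85.0, 60.0, 150.0, 40.0, 300.0]),
        "full":      np.array([131.0, 90.0, 250.0, 70.0, 296.0]),
    }
    logged = {k: np.log(v) for k, v in groups.items()}  # normalize skewed days

    f_stat, p_val = stats.f_oneway(*logged.values())
    print(f"ANOVA: F = {f_stat:.2f}, P = {p_val:.3f}")

    # Scheffe criterion: a pair differs if its F exceeds (k - 1) * F_crit.
    k = len(logged)
    n_total = sum(len(v) for v in logged.values())
    ss_within = sum(((v - v.mean()) ** 2).sum() for v in logged.values())
    ms_within = ss_within / (n_total - k)
    f_crit = stats.f.ppf(0.95, k - 1, n_total - k)
    names = list(logged)
    for i in range(k):
        for j in range(i + 1, k):
            a, b = logged[names[i]], logged[names[j]]
            f_pair = (a.mean() - b.mean()) ** 2 / (ms_within * (1 / len(a) + 1 / len(b)))
            print(f"{names[i]} vs {names[j]}: different = {f_pair > (k - 1) * f_crit}")

    def is_outlier(x, data):
        """Outlier rule used here: beyond Q3 + 1.5 * IQR."""
        q1, q3 = np.percentile(data, [25, 75])
        return x > q3 + 1.5 * (q3 - q1)

    print(is_outlier(631.0, groups["expedited"]))  # True for an extreme delay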

All procedures were approved by the Research and Development (R&D) Committee of the VA Pittsburgh Healthcare System after the IRB determined that the procedures qualified for exemption from review.

Results

A total of 103 new research protocols were submitted to the IRB from June 1, 2009, through May 31, 2011. These included 12 protocols exempted from IRB review, 56 protocols approved under expedited procedures, and 35 protocols approved by the fully convened IRB committee. Of these, 46.1% were multisite protocols, 69.6% used methods determined to be minimal risk, 66.0% were observational studies, 26.3% were coordinated by the institution’s Clinical Trials Center, and 16.7% were industry-sponsored drug trials. Single-site studies aimed to enroll between 18 and 2000 participants (median, 70 participants). Multisite studies aimed to recruit 5 to 700 participants from the local site (median, 20 participants) toward a total sample across all sites of 80 to 5888 participants (median, 450 participants).

Agreement between coders was assessed with κ statistics after coding each quintile of data. Agreement was initially poor but improved with each quintile (for each quintile, the κ values were 0.01, 0.30, 0.73, 0.68, and 0.80, respectively). Because coders did not exceed the predetermined threshold of agreement (κ > 0.75) until the last quintile of data, all protocols were double coded by each analyst and all disagreements between the coders were reconciled through a consensus process with the PI.

The IRB review process described in Figure 1 consists of 3 basic steps. First, the protocol is scrutinized to ensure all personnel are duly credentialed to conduct research (steps 1-2). After all study personnel are credentialed, the IRB reviews the protocol for compliance with relevant ethical and regulatory criteria (steps 3-17). Finally, after IRB approval is granted, the R&D committee verifies the approvals received from all relevant committees, subcommittees, or other entities before granting final approval (steps 17-23).

The IRB review process begins with a preliminary administrative review (Figure 1, steps 3-4). For greater-than-minimal-risk studies, IRB evaluation is then conducted by a fully convened IRB committee along with both the information security officer (ISO) and privacy officer (PO) (steps 5-9). Minimal-risk studies, as well as protocols contingently approved by the IRB committee, are reviewed using expedited procedures in which the IRB chair, ISO, and PO each examine the protocol concurrently and independently, requesting clarification and modification from the investigators as needed (steps 11-15). Approval is granted only after all 3 reviewers have approved a single version of the research protocol.

Table 1 reports the times required to review protocols from initial submission to R&D approval. Total review times (Figure 1, steps 1-23) ranged from 16 to 631 calendar days, with the median time for full board review being the longest (131 days) and exempt review the shortest (82 days). As evidenced by the standard deviations reported in Table 1, there was wide variance in the data and several extreme outliers (Figure 2). Credentialing time (steps 1-2) was defined as the number of calendar days from initial submission to the date when the R&D administrator verified the research credentials of all listed personnel and ranged from 0 to 75 days (median, 1 day). The IRB review time (steps 3-17) was defined as the number of calendar days from initial submission to the date of the IRB approval memo and ranged from 5 to 570 days (median, 43 days for exempt protocols, 57 days for expedited protocols, and 103 days for full board protocols). Research and development review time (steps 17-23) was defined as the number of calendar days between the date of the IRB approval memo and the date of the R&D approval memo and ranged from 0 to 184 days (median, 15 days).
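
As an illustration of how these interval measures derive from the abstracted milestone dates, consider the following minimal Python sketch; the field names and dates are hypothetical, chosen to reproduce the median values reported above.

    # Minimal sketch: deriving the interval measures from abstracted
    # milestone dates. Field names and dates are hypothetical.
    from datetime import date

    protocol = {
        "submitted":    date(2009, 6, 1),
        "credentialed": date(2009, 6, 2),   # step 2: R&D verifies credentials
        "irb_memo":     date(2009, 7, 28),  # step 17: IRB approval memo
        "rd_memo":      date(2009, 8, 12),  # step 23: R&D approval memo
    }

    credentialing = (protocol["credentialed"] - protocol["submitted"]).days
    irb_review    = (protocol["irb_memo"] - protocol["submitted"]).days
    rd_review     = (protocol["rd_memo"] - protocol["irb_memo"]).days
    total         = (protocol["rd_memo"] - protocol["submitted"]).days
    print(credentialing, irb_review, rd_review, total)  # 1 57 15 72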

Comparison across all 3 review types demonstrated statistically significant differences for total and IRB review times (Table 2). Post hoc pairwise comparisons showed no significant difference between the total review times of exempt vs expedited reviews, but a significant difference between full board and expedited reviews. For IRB review times, post hoc pairwise comparisons showed no differences between exempt and expedited reviews but significant differences between full board review and both exempt and expedited reviews. We further compared the total review times according to each of the measured protocol characteristics (eg, industry sponsorship, risk level) and found that the only significant difference in review time was between expedited protocols using single-site and multisite designs in which single-site protocols took less time to review than multisite protocols (mean [SD], 77 [43] vs 195 [129] days, respectively; P < .001).

We then analyzed the exchanges between the IRB stakeholders to determine how many times a protocol was reviewed (or revised) by each stakeholder (chair, ISO, PO, or PI), how long the review (revision) took, and the proportion of the total IRB review time attributed to each stakeholder (eg, the sum of all the days a protocol is being reviewed by a specific stakeholder divided by the IRB review time required for that protocol). Although most of this time is attributed to steps 11 through 15 in Figure 1, it also includes the time spent by the IRB chair in step 4 and the PI in steps 5 and 9. We did not conduct statistical tests to formally analyze these data, but visual inspection of Table 3 shows that regardless of review type, protocols generally require 2 to 3 revisions before they are ready for IRB approval, and the review times of each stakeholder are roughly on par with each other, with median response times of 6 to 12 calendar days.
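
The attribution calculation can be sketched as follows; the review episodes and the calendar IRB review time are hypothetical, and the proportions need not sum to 1 because the chair, ISO, and PO review in parallel.

    # Minimal sketch: days of IRB review attributed to each stakeholder.
    # Episodes and the calendar IRB review time are hypothetical.
    from collections import defaultdict

    episodes = [("chair", 7), ("ISO", 11), ("PO", 9), ("PI", 12),
                ("PI", 8), ("chair", 6)]
    irb_review_days = 57  # calendar days, initial submission to IRB memo

    days_by_stakeholder = defaultdict(int)
    for who, days in episodes:
        days_by_stakeholder[who] += days

    for who, days in days_by_stakeholder.items():
        share = days / irb_review_days  # proportion of IRB review time
        print(f"{who}: {days} d ({share:.0%})")
    # Shares need not sum to 1: the chair, ISO, and PO review in parallel.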

Finally, Table 4 describes the review process of each of the 10 protocols with outlying review times noted in Figure 2. Of these 10 outlier protocols, 6 were delayed owing to the time it took the PI to respond to requested changes, and 1 was delayed by a slow review from the IRB chair. Two protocols had outlier review times because they required a high number of revisions; although no single stakeholder delayed any given review or revision, the total number of revisions summed to an outlier review time. The delay in the last protocol was due to a particularly long R&D review. The reason for this delay remains unclear and would require qualitative review of the IRB records beyond the scope of the procedures currently approved by the R&D committee.

Discussion

This study provides robust estimates for the time required to secure approval from an IRB at a single VA medical center. Review times range from 3 to 4 months, with exempt and expedited reviews taking less time than full board reviews. However, it is unknown how these review times compare with other centers or any “gold standard” expectation for IRB review. A working group convened by the National Heart, Lung, and Blood Institute recognized the absence of concrete, evidence-based guidelines for IRB best practices, but nonetheless suggested IRBs should be able to review 90% of protocols within 60 days. The data summarized by Silberman and Kahn13 suggest review times on the order of 63 days are normative across the United States. However, regulations unique to the VA (eg, PO and ISO review) may add complexity that delays VA IRB review beyond the recommended 60 days.

The steps leading up to and after the IRB committee meeting (steps 5-9) and the R&D committee meeting (steps 17-23) were constrained by the deadline of the monthly meeting and thus took only 15 and 18.5 days, respectively. As such, examination of the incremental time required for each step of these subprocesses is not likely to reveal opportunities for substantial efficiencies. Instead, we focused our analyses on the steps with the greatest variability (steps 12-14), in which the greatest efficiencies might be achieved through system redesign.

Although R&D approval is required by specific VA regulations, it is not clear if the involvement of the fully convened R&D committee adds significant value to the review process, especially when its role is limited to verifying the necessary approvals of other committees. Rather than waiting for the next committee meeting, immediate improvement in review times might be achieved by empowering the R&D chair with executive authority to evaluate studies as they are approved.

Contrary to our expectation, based on anecdotal reports, that credentialing might be a source of significant delay, the median time for credentialing was only 1 day. However, our methods do not capture the time spent by investigators before submission when they are securing the credentials of their staff. The requirement for credentialing may still be a source of significant delay if the delay occurs before the date of initial submission of a protocol.

Also based on anecdotal reports, we had thought that the ISO, PO, or IRB chair might be a source of significant and systematic delay. However, the data demonstrate that the median review time for each of these stakeholders is between 7 and 11 days. There are outlier review times for each of these stakeholders, but no single stakeholder appears to be responsible for a systematic delay. Data from Table 3 might nonetheless be used to provide feedback to the chair, ISO, and PO regarding their performance, and by working with local administrators, it might be possible to set goals that reduce each stakeholder's median review time and minimize the substantial variation in those times.

Finally, investigators usually responded promptly to the IRB’s requests for clarification or revision, taking between 7 and 12 days to make the requested revisions. However, 80.0% of the extreme delays in IRB review occurred between the IRB’s request for changes and the investigator’s response. The reasons for these delays were not clear. If the IRB requested major changes in the design or analysis of the study, those changes might legitimately require significant time to implement. On the other hand, if the IRB requested changes in wording or format, a long delay might reflect an overextended investigator unable to find time to perfect the protocol. Regardless of the reason, these data suggest that the worst delays may be avoidable if investigators invest the resources necessary to prepare prompt responses to the IRB.

Our findings are limited in several ways. First, our sample size of 103 limits the power of our conclusions, especially regarding exempt protocols, of which we sampled only 12. However, we did demonstrate statistically significant differences between the 3 review types for our primary outcomes of total and IRB review times. Furthermore, with the exception of 1 study examining 175 clinical trials of cancer medications,14 our sample is roughly 4 times the size of other reports of IRB review times; and, to our knowledge, unlike any other published report, our sample included all studies from a 2-year period, thus providing a more representative sample of the entire portfolio of research evaluated by the studied IRB. Second, although the IRB review followed a standardized process, details of that process changed during our sampling frame, making data abstraction challenging and limiting inferences about the “typical” IRB review because the typical review was a moving target. Third, although we did measure some characteristics of each protocol (eg, industry sponsorship, observational methods, and risk level), we did not collect data about the identity or experience of the investigators and reviewers, the quality of the initial submission, or the other responsibilities of IRB stakeholders that may account for delays in IRB review. These and similar factors likely play a significant role in the time required to review any given protocol, and future research might use qualitative analyses to identify protocol-specific factors associated with IRB review times. Finally, the coding effort was complex and demanding, which may limit the application of these methods at other sites. Although we eventually achieved excellent intercoder reliability, doing so required co-coding 60 to 80 protocols.

Conclusions

Despite these limitations, these data permit us to counsel investigators at this site that IRB review will take approximately 2 to 3 months, and that the most important thing investigators can do to expedite the review is to respond quickly (within a week). The IRB may request changes that take substantial time to implement, but investigators can minimize the delay by making those changes as expeditiously as possible. In addition, these data can be used to inform a local process of quality improvement, and if review times such as these are tracked prospectively, the effect of process changes can be demonstrated. Finally, replicating similar methods at other sites within and outside the VA might develop more generalizable data. Sites with particularly fast review times might then be examined to determine the reasons for their efficiency, possibly establishing best practices that could be adopted by other IRBs.

Article Information

Accepted for Publication: April 21, 2014.

Corresponding Author: Daniel E. Hall, MD, MDiv, MHSc, Center for Health Equity Research and Promotion, Veterans Affairs Pittsburgh Healthcare System, Bldg 30, University Drive (151C), Pittsburgh, PA 15240 (hallde@upmc.edu).

Published Online: December 10, 2014. doi:10.1001/jamasurg.2014.956.

Author Contributions: Drs Hall and Hanusa had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Hall, Ling, Arnold.

Acquisition, analysis, or interpretation of data: Hall, Hanusa, Stone.

Drafting of the manuscript: Hall, Hanusa, Ling, Arnold.

Critical revision of the manuscript for important intellectual content: Hall, Hanusa, Stone, Ling.

Statistical analysis: Hall, Hanusa, Stone.

Obtained funding: Hall.

Administrative, technical, or material support: Hanusa, Arnold.

Conflict of Interest Disclosures: None reported.

Funding/Support: This research was supported by grants CDA 08-281 and SDR 11-399-1 from the US Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development, Health Services Research and Development (Dr Hall).

Role of the Funder/Sponsor: Other than competitive peer review of the supporting grant proposals, the sponsor had no direct role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; or decision to submit the manuscript for publication.

Disclaimer: The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the US government.

Previous Presentation: This study was presented at the 38th Annual Surgical Symposium of the Association of VA Surgeons; April 6, 2014; New Haven, Connecticut.

Additional Contributions: The authors would like to recognize the contributions of Marcia Homer, RN, and Laura Horowitz, JD, Veterans Affairs Pittsburgh Healthcare System, for their diligent efforts in coding the protocols. We also thank Ulrike Feske, PhD, Veterans Affairs Pittsburgh Healthcare System, Charles Lidz, PhD, University of Massachusetts Medical School, and 2 anonymous reviewers for their helpful comments on the manuscript.

References
1. Driscoll A, Currey J, Worrall-Carter L, Stewart S. Ethical dilemmas of a large national multi-centre study in Australia: time for some consistency. J Clin Nurs. 2008;17(16):2212-2220.
2. Dyrbye LN, Thomas MR, Mechaber AJ, et al. Medical education research and IRB review: an analysis and comparison of the IRB review process at six institutions. Acad Med. 2007;82(7):654-660.
3. Dziak K, Anderson R, Sevick MA, Weisman CS, Levine DW, Scholle SH. Variations among Institutional Review Board reviews in a multisite health services research study. Health Serv Res. 2005;40(1):279-290.
4. Goodyear-Smith F, Lobb B, Davies G, Nachson I, Seelau SM. International variation in ethics committee requirements: comparisons across five Westernised nations. BMC Med Ethics. 2002;3:E2.
5. Larson E, Bratts T, Zwanziger J, Stone P. A survey of IRB process in 68 U.S. hospitals. J Nurs Scholarsh. 2004;36(3):260-264.
6. Pogorzelska M, Stone PW, Cohn EG, Larson E. Changes in the institutional review board submission process for multicenter research over 6 years. Nurs Outlook. 2010;58(4):181-187.
7. Porcu L, Poli D, Torri V, et al. Impact of recent legislative bills regarding clinical research on Italian ethics committee activity. J Med Ethics. 2008;34(10):747-750.
8. Byrne MM, Speckman J, Getz K, Sugarman J. Variability in the costs of institutional review board oversight. Acad Med. 2006;81(8):708-712.
9. Wagner TH, Bhandari A, Chadwick GL, Nelson DK. The cost of operating institutional review boards (IRBs). Acad Med. 2003;78(6):638-644.
10. Wagner TH, Murray C, Goldberg J, Adler JM, Abrams J. Costs and benefits of the National Cancer Institute central institutional review board. J Clin Oncol. 2010;28(4):662-666.
11. Belknap SM, Georgopoulos CH, West DP, Yarnold PR, Kelly WN. Quality of methods for assessing and reporting serious adverse events in clinical trials of cancer drugs. Clin Pharmacol Ther. 2010;88(2):231-236.
12. Dorr DA, Burdon R, West DP, et al. Quality of reporting of serious adverse drug events to an institutional review board: a case study with the novel cancer agent, imatinib mesylate. Clin Cancer Res. 2009;15(11):3850-3855.
13. Silberman G, Kahn KL. Burdens on research imposed by institutional review boards: the state of the evidence and its implications for regulatory reform. Milbank Q. 2011;89(4):599-627.
14. Govindarajan R, Young JW, Harless CL, Hutchins LF. Barriers to clinical trials vary according to the type of trial and the institution. J Clin Oncol. 2007;25(12):1633-1634.