The architecture of the academic medical center can be viewed as a reflection of the contemporary relationship between physicians and scientists in the United States: the bustling hospital sits next to a high-rise of laboratories, and clinicians and researchers hurry between them. Yet the arrangements that bring physicians and scientists together may disguise the divergent historical and philosophical perspectives that separate them.
Hands-on laboratory instruction in medical training appeared at Harvard University in the 1880s and reached its most exemplary form at Johns Hopkins University in 1893 through the support of physician and educator William Welch.1 Trained in Germany, Welch argued that "nothing can replace the careful study of fresh specimens" for the student doctor. Abraham Flexner, surveying medical schools in 1910, concurred that medical students gained more through personal investigation than through didactic learning: the ideal medical student "no longer merely watches, listens, memorizes; he does."2 Early 20th-century physicians studied the approaches and techniques of the laboratory to apply them to problems arising in patient care. During this period, physicians, rather than professional scientists, formulated research questions in the clinic, used the lab to study them, and brought the results back to patients. Frederick Banting's research on insulin in the 1920s provides one example. After his surgical practice failed, Banting convinced a group of scientists and a medical student to join him in investigating a possible treatment for diabetes. The insulin extracted from canine pancreas in the summer was tested on hospital patients in the fall.3
Lab work also provided early 20th-century physicians with a prestigious professional identity. The physician who posed with a microscope or at a bench connected himself to the new techniques of microbiology and pharmacology and their anticipated benefits. Through devotion to scientific experimentation, physicians also differentiated themselves from the profit-driven businessman of the era.4
After World War II, however, as the nation's economic and social priorities changed, science became an independently funded and self-sustaining enterprise. In the 1950s the Office of Scientific Research and Development, the National Science Foundation, and the National Institutes of Health (NIH) expanded dramatically, providing new sources of funds for scientific research. At the same time, scientists gained new institutional positions as academic medical centers began to replace community hospitals as the site of medical training. Federal matching funds awarded to medical schools to build research facilities also came to support their educational infrastructure.5 By the 1960s, funded science research supported one-third of all medical school expenditures.6
Although scientists and physicians began to work together in academic medical centers, the laboratory and the clinic emerged as arenas with distinct perspectives. Newly empowered scientists chose less clinically oriented research and shared decision-making power with physicians on many projects. Cellular and biochemical characteristics of disease, not Welch's "fresh specimens," became prominent objects of investigation. After the 1950s, studies on treatment and symptomatology also declined, as did the use of clinical evidence such as physical exams and patient histories.7 When the National Cancer Institute first opened in the 1930s, research into chemotherapeutics had faltered because physicians were concerned for patient safety. In the new academic medical centers, researchers moved beyond chemotherapies to molecular characterizations of viruses and oncogenes.
Ultimately, these developments served to widen the gap between the scientists who conducted experiments and the physicians who applied the results. Science had become an exacting and intricate method of investigation tied to the laboratory and its tools, a method that most physicians no longer learned firsthand. The science taught to medical students involved less laboratory experimentation and relied more on the study of other people's results. Physicians-in-training still studied techniques of DNA sequencing and configurations of protein channels but had little experience in developing these tools themselves. The Medical Scientist Training Program (MSTP), created by the NIH in 1964 to support medical students seeking both an MD and a PhD in the basic sciences, institutionalized one route to becoming a medical researcher: laboratory training.8 While the program stressed the connection between medicine and science, its very inception attested to the distance that had developed between them. The MSTP reinforced the notion that clinical acumen was insufficient preparation for a career in research. Leaving the hospital, rather than contending with its actualities, became the preeminent method by which doctors could contribute to medical knowledge. Today, while MD-PhDs ensure that the path from the clinic to the lab is well worn, other physician-researchers are moving beyond basic science methods. Physicians who have internalized Welch's inquisitive ethos pool clinical evidence in randomized controlled trials or conduct outcomes research to investigate medical interventions. In the coming decades, the physician-researcher, emulating the MSTP, will tread more numerous and diverse paths from the clinic to the site of research.
Bromley E. The Evolving Relationship Between the Physician and the Scientist in the 20th Century. JAMA. 1999;281(1):95–96. doi:10.1001/jama.281.1.95