As the criterion standard for postmortem diagnosis, clinical autopsy has historically played a vital role in medical advancement, from identifying missed principal causes of death for which treatment would likely have improved survival (a finding in approximately 10% of cases, a rate that has held steady over the past half century)1 to refining the definition of sudden cardiac death.2 Thus, autopsy provides crucial insight into both established and emerging diseases, a rigorous means of quality assurance and outcomes research, and an important check on our collective premortem diagnostic hubris. However, an array of systems- and individual-level changes has increasingly relegated autopsy to relic status. At the systems level, autopsy has been financially disincentivized over the past 30 years, exemplified most recently by the decision of the Centers for Medicare & Medicaid Services to eliminate the autopsy program requirement for hospitals to qualify for reimbursement.3 At the individual level, growing confidence in advanced diagnostic testing has eroded the perceived value of autopsy and negatively biased autopsy discussions that many clinicians already find uncomfortable.