The proper role of information technology in health care remains widely debated. The notion that health information technology (HIT) can improve care is not new: since the 1960s, HIT has been studied as a tool for aiding diagnosis and archiving medical records. Half a century later, academics and policy makers continue to debate whether HIT tools such as electronic health records (EHRs) will save money or improve care.1 Advocates point to many studies demonstrating the benefits of EHRs.2 Critics counter that the best evidence comes from a few leading institutions with locally developed systems, whereas data on commercial products in nonleading institutions are equivocal.2