Corneal endothelial cell dysfunction is a common indication for penetrating keratoplasty.1 Significant postoperative complications such as high astigmatism, suture breakage or infection, and graft dehiscence have led to the development of alternative surgical approaches for endothelial replacement. In 1999, Melles et al2 reported a successful case of posterior lamellar keratoplasty in a human patient. This procedure was subsequently modified and renamed deep lamellar endothelial keratoplasty in 2001 by Terry and Ousley.3 Compared with penetrating keratoplasty, deep lamellar endothelial keratoplasty offers reduced postoperative astigmatism, quicker visual recovery, and a lower frequency of wound dehiscence and suture complications.4,5 However, the procedure was slow to gain widespread acceptance, owing in part to its technically demanding nature. Melles et al6 subsequently developed a technique called descemetorhexis for stripping Descemet's membrane from the host cornea, thereby eliminating the difficult intralamellar dissection in the recipient and allowing more surgeons to adopt the procedure, which was named Descemet's stripping with endothelial keratoplasty.7 Another advantage of Descemet's stripping with endothelial keratoplasty over deep lamellar endothelial keratoplasty is a smoother recipient corneal interface, which may reduce light scatter and improve visual acuity outcomes. A further development in the evolution of endothelial keratoplasty was the use of an automated microkeratome to create a lamellar dissection of the donor cornea, yielding the partial-thickness posterior donor tissue. This variation was termed Descemet's stripping automated endothelial keratoplasty (DSAEK) by Gorovoy.8