A recent AACR presentation and an “under review” manuscript by the Google AI Healthcare team present a fascinating (if somewhat idiosyncratic) light microscope with built-in, real-time artificial intelligence capabilities. The authors developed deep learning algorithms to identify metastatic breast cancer in lymph nodes and prostate cancer in prostatectomy slides. As a pathologist views slides through this adapted microscope, the deep learning results are projected into the optics in near real time (see image from their manuscript). Essentially, areas of cancer are outlined in bold colors, drawing the eye to these regions. The authors report: “Pathologists testing the device reported a seamless experience that provided immediately useful information.”
I must admit to some confusion about the intended use of this technology. The adoption of digital pathology will likely outpace any realistic need for this light microscope-based approach. The authors point to possible applications in small labs or developing countries. Regardless, it is a harbinger of exciting things to come in the world of digital pathology and deep learning.
This integration of an “assist device” into the light microscope seems analogous to the Society of Automotive Engineers’ taxonomy for self-driving cars (see below), which ranges from no autonomy (0) to full automation (5). I would argue that most pathologists currently practice in the 0-1 range. Don’t we often ask one another: “would you like to drive?” Assist devices that elevate our rank from 0 to 1 may include ancillary tests like immunohistochemistry. The only pathology devices that actually replace some pathologist functions entirely may be emerging image analysis tools that quantitate Ki-67 and ER/PR IHC. The much-feared full automation (5), and the consequent replacement of the pathologist, seems far down the road, but it promises a very exciting journey!