Pars plana vitrectomy is almost a half-century old, and the earliest probes were powered by batteries, said Cynthia A. Toth, MD, the Joseph A.C. Wadsworth professor of ophthalmology at Duke University Medical Center. OCT imaging has “revolutionized our pre- and postoperative assessment,” she said, “but while our tools have advanced, our microscopes have not.”
Although OCT gave surgeons the ability to visualize anatomical abnormalities in the retina before and after an operation, during surgery they still relied on the traditional en face view, without OCT.
Dr. Toth noted there have been “three staged developments in OCT – stage A was intraoperative OCT,” in which a two-dimensional view outside the microscope required the surgeon to pause surgery, look up, and review the image. In 2012 Bioptigen developed a handheld system for use in the OR, “which allowed us to image both before and after surgery, rather than waiting until the patient is seen during postop to determine if the surgical goals were achieved,” she said.
Stage B intraoperative OCT integrated the scanner into the microscope to provide real-time, two-dimensional scans.
“Surgeons could view 3D volumes at a pause in surgery, or later,” Dr. Toth explained. “This is currently commercialized with several companies, including Bioptigen and CZM.”
But systems are not useful “unless they’re delivering information back to the surgeon,” she said. And that’s where Stage C – true intraoperative OCT – is now. The first of these systems is in the OR, and Dr. Toth believes a four-dimensional microscope-integrated OCT (MIOCT), “which is 3D volumes over time … is probably where we’re heading,” she said. “Clearly, we’re in the early stages of this technology.” The next stages will improve the speed and quality of image capture by displaying volumes faster, removing artifacts, and stabilizing systems.
“Intuitive surgeon control is going to be critical,” Dr. Toth stated. “Integrated heads-up display and true OCT-guided maneuvers are a research reality.”
Cynthia Toth, MD
Cynthia A. Toth, MD, directed the Duke Biophysics Laboratory, following Dr. Robert Machemer, and later transitioned it into the Duke Advanced Research in Spectral Domain OCT Imaging Laboratory.
Cynthia Toth: Today I’ll be talking about the history of OCT imaging in surgery. It’s been 45 years – well, 25 years since the development of OCT, but 45 years since the invention of pars plana vitrectomy, and thus our field. That’s Bob Machemer in his workshop, Jean-Marie Parel, and earlier someone talked about the 17 gauge instrument shoved into the eye, and yes, that’s an Eveready battery running the system. That was the early system. So pars plana vitreoretinal surgery required new tools. There’s a bent needle, a new operating microscope, and new techniques. Well, the tools have advanced, and at Duke we’ve worked a lot on surgical tool development up until the early 2000s. But the microscope – no disrespect to those at the microscope companies – appeared to lag behind. OCT imaging has revolutionized our pre- and postoperative assessment of patients. So the orange view, or as I call it, the Helmholtz view, from the 1850s – the idea of looking in with white light to view the fundus – and although we can identify that orange-red circle, the cross section from OCT is dramatically different and informs the surgeon both before surgery and while we wait afterward. But the surgeon goes back to the classical en face view during surgery. As you can see in the image below, that was a young child that I operated on, and although you can see the dramatic membrane I removed, I left a thin wisp behind that was significant and required another surgery. So I’m going to talk about three stages of development of intraoperative OCT. Stage A was intraoperative OCT done in 2D, outside the microscope and at a pause in surgery. Basically, Joe Izatt and I met in Jim Fujimoto’s lab at MIT. I was working for the Air Force, and Jim Fujimoto was funded by the Air Force for aspects of the development of OCT. Joe came to Duke in 2001, and I’d been pleading with him for an intraoperative system, but at that time we only had time domain.
For portable imaging in 2007, we had a research system from Bioptigen that we had taken, under a research protocol, to the OR, and then the Bioptigen Envisu, as you’ve heard, became available in 2012 as a handheld system for use in the operating room to image both before and after surgery, rather than waiting till the patient is seen in postop to realize one has not achieved the surgical goals. Stage B then was microscope-integrated, real-time 2D scans, with 3D volumes viewed at a pause in surgery. And as you’ve seen earlier, one can visualize B-scans during surgery. So again, a full volume might provide surgeon guidance, but a B-scan at least gives us some of the information – not necessarily enough to guide full three-dimensional movements. Susanne Binder in Vienna was working with Carl Zeiss Meditec at the same time that Joe Izatt’s team and mine were working on the development of an intraoperative system. So stage B is now commercial with several companies, including the Bioptigen Leica system that you see on the left, the EnFocus that was talked about earlier today, and the Carl Zeiss Meditec system. Justis Ehlers, who was a fellow with us at Duke earlier, is now at Cleveland Clinic working with Peter Kaiser and Sunil. They’ve evaluated the RESCAN system, basically showing the advancement to a monocular heads-up display. So remember, a system is not useful unless we can get feedback back to the surgeon – a surgeon-driven need. And what I’m going to talk about is what we’ve been working on, which is where OCT is going. We currently have the first stage of systems in the operating rooms, but 4D MIOCT, which is 3D volumes over time, is probably where we’re heading. You need an integrated scanner – that’s what Kenny Tao developed. Oscar has developed a swept-source OCT to link to this, with GPU-based imaging. So it’s really the throughput.
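The throughput constraint behind 4D MIOCT can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only – the A-scan rate and volume dimensions are assumed round numbers, not the specifications of any system discussed in the talk:

```python
# Illustrative 4D OCT throughput estimate. All parameters below are
# assumptions for the sake of the arithmetic, not published specs.

a_scan_rate_hz = 100_000      # assumed swept-source A-scan rate (100 kHz)
a_scans_per_b_scan = 500      # assumed lateral samples per B-scan
b_scans_per_volume = 100      # assumed B-scans per 3D volume

a_scans_per_volume = a_scans_per_b_scan * b_scans_per_volume
volumes_per_second = a_scan_rate_hz / a_scans_per_volume

print(f"A-scans per volume: {a_scans_per_volume:,}")
print(f"Volume rate: {volumes_per_second:.1f} volumes/s")
```

At these assumed settings the source delivers only about 2 volumes per second, which illustrates why faster swept sources and GPU-based processing are needed before "3D volumes over time" can be displayed smoothly enough to guide a surgeon's hands.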
It’s the path to computational rendering, which basically uses gaming software – Brenton Keller and Christian Viehland won the NVIDIA Global Impact Award for this work – and then a stereo heads-up display. By combining these together, the surgeon can interact with the system so that instead of looking down on the retina and having the B-scan view that you see here, or the retina view, one turns into the volume – it’s more like Google Street View. So if we come down now, the surgeon can actually identify forces interacting with the surface. And like with OCT in the clinic, surgeons will initially say, what do I need that for? Once surgeons start using it and realize what they can see of the interaction between instrument and retina, or instrument and beneath the retina, it becomes a different world. Clearly, we’re in the early stages of this technology. So the next stage of OCT is to improve the speed and quality of image rendering: display volumes faster, remove artifacts, stabilize systems. Intuitive surgeon control is going to be critical: control of the viewpoint, augmented reality – whether you see it as a 3D external screen or, as we’re working now with virtual reality, projected to the surgeons in an immersive environment. I just have to thank the NIH, because federal funding has really helped advance this project over the many years, and thank you again for the time to present.