Seeing further than before: This year’s BiOS Hot Topics are real eye-openers
Jennifer Barton, director of BIO5 at the University of Arizona and 2023 SPIE President-Elect, reflected on three ideas in advance of the BiOS symposium at Photonics West in January: the field of optoacoustics is becoming increasingly relevant and practical; optics is being integrated into real-time work, like performing surgery; and as optics tools provide new mechanisms of contrast, we are able to see things we could never see before.
Using that framework, Barton plunged, with delight, into anticipating exactly what audiences would hear at the Hot Topics event. Her impromptu tour of the program included:
YongKeun Park of the Korea Advanced Institute of Science and Technology, “Quantitative Phase Imaging and Artificial Intelligence.” In his talk, Park connected phase imaging with AI. “And it’s not just about measuring the intensity of light passing through things, but also examining the time it takes light to go through things,” Barton said. “It involves the index of refraction, the speed of light moving through a certain medium.”
YongKeun Park. Credit: Korea Advanced Institute of Science and Technology
That illustrates the first two big ideas, Barton said, showing how new ventures let us see in new ways when we study structures that would otherwise be translucent.
Take a blood cell. “If you looked at it, it would be just a cell, transparent. But with this quantitative phase imaging, you can see not only all the different microstructures, but now you can see, in real time, how they are changing, how the shape is changing. You can watch that cell get infected, and see how all the structures inside the cell respond to that infection. And this is all without adding any fluorescent tags, or needing to have any green fluorescent protein (GFP) or any of that. This is really being able to see the cells in their natural environment and watch in real time how they respond to perturbations or disease threats,” Barton said. “This is great for just understanding basic science in ways that we haven’t been able to do before.”
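The quantity Barton describes, the extra delay light picks up crossing a higher-index structure, can be sketched with the standard relation for phase shift: 2π times the index difference times thickness over wavelength. The cell index, medium index, thickness, and wavelength below are illustrative textbook-range assumptions, not values from the talk:

```python
import math

def phase_delay(n_sample, n_medium, thickness_m, wavelength_m):
    """Phase shift (radians) light accumulates crossing a sample,
    relative to the surrounding medium:
    delta_phi = 2*pi*(n_sample - n_medium)*d / lambda."""
    return 2 * math.pi * (n_sample - n_medium) * thickness_m / wavelength_m

# Illustrative values: a cell (n ~ 1.38) in watery medium (n ~ 1.33),
# 5 micrometers thick, imaged at 532 nm.
dphi = phase_delay(1.38, 1.33, 5e-6, 532e-9)  # ~2.95 rad
```

Even a transparent cell, invisible in an intensity image, produces a phase shift of a few radians, which is what quantitative phase imaging measures.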
Vasilis Ntziachristos of the Technical University of Munich's School of Medicine and the Institute of Biological and Medical Imaging, Helmholtz Zentrum, “Sound of Light: Optoacoustic vs. Optical Imaging.” In his talk, he compared the strengths of the two modalities.
Vasilis Ntziachristos, Technical University of Munich School of Medicine. Credit: Florian Peljak, Courtesy of Süddeutsche Zeitung.
“Both are useful,” Barton said, “but optoacoustics is a technique where you get to use optical selectivity and sensitivity. One of the great things about optical techniques is they are really, really sensitive. They can be sensitive to naturally occurring chromophores like hemoglobin or melanin. Or extraordinarily sensitive to a fluorescent dye that you put into a system. Or a nanoparticle.”
She added, “Optical techniques have great sensitivity, but limited depth of imaging.” Ultrasound, meanwhile, has very good depth and is scalable. “You can scale your resolution and depth of penetration depending on the frequency you choose. You can’t do that quite as much with optics.”
Optoacoustics — sometimes called photoacoustics — lets you supply light and watch it travel through the tissue until it is absorbed by a chromophore. The absorbing chromophore emits an ultrasound wave, which you then detect. “So you get the depth of ultrasound,” Barton said, “with the sensitivity and contrast of optics.”
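The absorption-to-pressure step Barton describes is commonly written as the product of the Grüneisen parameter, the optical absorption coefficient, and the light fluence. A minimal sketch, using illustrative soft-tissue-range values (assumptions for scale, not figures from the talk):

```python
def initial_pressure(grueneisen, mu_a_per_m, fluence_j_per_m2):
    """Initial photoacoustic pressure rise (Pa): p0 = Gamma * mu_a * F,
    where Gamma is the (dimensionless) Grueneisen parameter, mu_a the
    optical absorption coefficient, and F the laser fluence."""
    return grueneisen * mu_a_per_m * fluence_j_per_m2

# Illustrative: Gamma ~ 0.2 for soft tissue, mu_a ~ 50 1/m for blood
# in the near-infrared, fluence 100 J/m^2 (i.e., 10 mJ/cm^2).
p0 = initial_pressure(0.2, 50.0, 100.0)  # -> 1000.0 Pa
```

A kilopascal-scale pressure pulse from blood is readily detectable with ultrasound transducers, which is why hemoglobin is such a strong natural contrast agent for this technique.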
Chen Yang of Boston University, “Nongenetic Photoacoustic Neural Stimulation.” Yang reviewed her work in optoacoustic neural stimulation. “And this is super exciting stuff,” Barton said. “This is photoacoustics, but it is totally different. Instead of imaging, we are using light to generate an acoustic field that then trips a neuron.
Chen Yang of Boston University. Credit: Boston University
“The great part here is that, basically, you just use this little probe. I am an electrical engineer, so I think back on the days when you got a printed circuit board and just tapped into it. We are doing the same thing now with a brain slice, maybe 100 microns thick. You just go in there and excite different neurons and see what else they activate within the brain slice.”
Srirang Manohar of the University of Twente, “Sound Speed-Corrected Photoacoustic Full-Breast Imaging.” Manohar’s work on photoacoustic breast imaging “is a case of using the same principles but getting much better information,” Barton said.
“Rather than assuming that sound travels through the tissue at the same speed everywhere, we know that’s not true. And that’s where you get contrast in ultrasound, from the different acoustic properties of tissues.”
Manohar uses a technique where a researcher can actually estimate the speed of sound in the different pieces of tissue. “Obviously, that speed is going to be different in fat versus in muscle tissue. When you take that into consideration, you can get much better resolution and contrast in your images.”
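The payoff of estimating sound speed per tissue type shows up in a simple time-of-flight calculation. A minimal sketch, assuming illustrative speeds for fat and glandular tissue versus a uniform 1540 m/s assumption (the speeds and path lengths are stand-in values, not Manohar's data):

```python
def time_of_flight(segments):
    """Total acoustic travel time over (length_m, speed_m_per_s) segments."""
    return sum(length / speed for length, speed in segments)

# Illustrative path: 30 mm of fat (~1450 m/s) followed by 20 mm of
# glandular tissue (~1510 m/s), versus assuming 1540 m/s throughout.
true_t = time_of_flight([(0.030, 1450.0), (0.020, 1510.0)])
naive_t = 0.050 / 1540.0
error_us = (true_t - naive_t) * 1e6  # timing error in microseconds
```

A timing error of roughly 1.5 microseconds corresponds to about 2 mm of misplacement at soft-tissue sound speeds, enough to degrade resolution and contrast if left uncorrected.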
Barton said everyone agrees current work on photoacoustics for breast imaging is promising and interesting, but for now adoption rests on “a couple of physicians out there being willing to purchase a product. But today it’s not going to be widely adopted. And then it eventually turns into something that will become standard of care.”
Caroline Boudoux of Polytechnique Montreal, “Photonic Lanterns: Shedding New Light on OCT.” Barton commented, “And the wonder woman comes up with another great idea. She is amazing.” Barton said she had just been reading about Boudoux’s topic: photonic lanterns applied to OCT, or optical coherence tomography.
Caroline Boudoux of Polytechnique Montreal. Credit: Caroline Perron
“This is about getting more information out of tissue with new mechanisms of contrast. Seeing things you could not see before.” Boudoux’s innovations offer ways to capture more of the returning light that is backscattered in OCT, Barton said, adding, “Right now, most systems just let us look at the intensity and the timing of light that comes back. She’s looking now at the angular distribution of the light that comes back. And it turns out that you can be very sensitive to a certain structure by looking at its angular distribution. One example is neuritic plaques in Alzheimer’s disease.”
Kirill Larin of the Cullen College of Engineering at the University of Houston, “Optical Elastography: an Emerging Tool for Tissue Palpation.” To illustrate elastography, Barton gave her arm a pinch and a poke. “That’s how you can learn about the tissue there,” she said with a laugh.
Kirill Larin of the Cullen College of Engineering at the University of Houston. Credit: University of Houston
“We do elastography all the time if we just poke things. That’s part of it. You may find that if you eat too many Christmas goodies, your fat jiggles a bit, or the muscles move. We have new ways to learn about the muscles, the mechanical properties of tissue. The awesome thing is, can you measure the mechanical properties of tissue without touching it? Just with light. That’s pretty exciting.”
Larin is interested in how the eye’s lens hardens over time. “How do you know if therapies you are using are keeping the lens supple? So you don’t end up like me with these progressive glasses,” she said, removing her own glasses.
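One common route to the non-contact "palpation" Barton describes is to launch a tiny shear wave in tissue, track it optically, and convert its measured speed into stiffness. For nearly incompressible soft tissue the standard relation is Young's modulus equal to about three times density times shear-wave speed squared; the density and speed below are illustrative assumptions, not values from Larin's work:

```python
def youngs_modulus_pa(density_kg_m3, shear_wave_speed_m_s):
    """Young's modulus for nearly incompressible soft tissue:
    E ~ 3 * rho * c_s^2, with c_s the shear wave speed."""
    return 3.0 * density_kg_m3 * shear_wave_speed_m_s ** 2

# Illustrative: tissue density ~1000 kg/m^3, shear wave at 2 m/s.
E = youngs_modulus_pa(1000.0, 2.0)  # -> 12000.0 Pa, i.e. ~12 kPa
```

A stiffening lens would show a faster shear wave and hence a higher modulus, which is how such a measurement could track whether a therapy is keeping the lens supple.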
Eric Henderson of the Department of Orthopedics at the Geisel School of Medicine at Dartmouth College, “Early Identification of Life-Threatening Soft-Tissue Infection Using Dynamic Fluorescence Imaging.”
Henderson has developed a way to use dynamic fluorescence imaging to track a radiology dye — indocyanine green, or ICG — through tissue. “If you put the dye into tissue and see how it spreads over time, you can figure out how to learn whether or not you’ve got, say, flesh-eating bacteria, or necrotizing fasciitis, in that tissue,” Barton said. “We can make a diagnosis that lets us know right away whether or not we’ve got a serious problem or something that’s going to clear up on its own.”
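Watching how the dye spreads over time amounts to fitting a time curve of fluorescence intensity per tissue region. A minimal sketch, assuming a simple first-order wash-in model (a common choice for perfusion kinetics, not necessarily Henderson's actual model; the time constants are illustrative):

```python
import math

def wash_in(t_s, amplitude, tau_s):
    """First-order dye wash-in model: I(t) = A * (1 - exp(-t/tau)).
    A larger time constant tau means slower dye arrival, suggesting
    poorly perfused (possibly infected) tissue."""
    return amplitude * (1.0 - math.exp(-t_s / tau_s))

# Illustrative: at t = 10 s, a well-perfused region (tau = 10 s) is
# near 63% of plateau, while a poorly perfused one (tau = 60 s) lags.
healthy = wash_in(10.0, 1.0, 10.0)
suspect = wash_in(10.0, 1.0, 60.0)
```

Comparing fitted time constants across regions is one way such a system could flag tissue whose perfusion is abnormal enough to warrant immediate intervention.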
Jin Kang, professor of electrical and computer engineering at Johns Hopkins University, “Optical Image-Guided Autonomous Robotic Surgery.”
Jin Kang, professor of electrical and computer engineering at Johns Hopkins University. Credit: Johns Hopkins University
Focusing on Kang’s demonstrations of a fully image-guided surgical robot, Barton said, “This is the most amazing idea: robotic surgery.” As Kang’s team operates on a pig, the procedure itself is fully autonomous.
“The robot is going by itself, rather than the physician using their eyes and their brain and feeding that back to their own hands. We are using optics to get even more information than you’d get with a physician’s eye.”
Information feeds back into the robot, telling it what to do, for very specific tasks. It forms a closed loop with optical imaging. “Where this is going to be really good is for procedures that require really precise repetitive motions. Robots can do that far better than people. No time soon are we going to have the robot do the entire surgery, but it works for certain things.”
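The closed loop Barton describes (image the tool, compute the error, command a move, repeat) can be sketched as a toy proportional controller. The one-dimensional position, target, and gain are illustrative assumptions, not details of Kang's system:

```python
def step_toward(target_mm, measured_mm, gain=0.5):
    """One cycle of the closed loop: imaging measures the tool's
    position, and the controller commands a move proportional to
    the remaining error."""
    return gain * (target_mm - measured_mm)

# Illustrative: iterate until the imaged position converges on target.
pos = 0.0
for _ in range(20):
    pos += step_toward(10.0, pos)  # pos approaches 10.0 mm
```

The error shrinks by a fixed fraction each cycle, which is exactly the kind of precise, repetitive convergence a robot handles better than a human hand.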
Ford Burkhart is a science and technology writer based in the US. A version of this article appeared in the 2022 Photonics West Show Daily.