Plenary Event
Thursday Morning Keynotes
23 February 2023 • 8:30 AM - 10:00 AM PST | Town & Country A
Session Chairs: Jeffrey H. Siewerdsen, The Univ. of Texas MD Anderson Cancer Ctr. (United States) and Christian Boehm, ETH Zurich (Switzerland)
8:30 AM
Welcome and Introduction
8:35 AM
Robert F. Wagner Award Finalist Announcements for Conferences 12466 and 12470
Image-Guided Procedures, Robotic Interventions, and Modeling Student Paper and Young Scientist Award Announcements
Awards Sponsored by:
8:40 AM
Seeing and feeling in robot-assisted surgery
Allison Okamura, Stanford Univ. (United States)
Haptic devices allow touch-based information transfer between humans and their environment. In minimally invasive surgery, a human teleoperator benefits from both visual and haptic feedback regarding the interaction forces between instruments and tissues. In this talk, I will discuss mechanisms for stable and effective haptic feedback, as well as how surgeons and autonomous systems can use visual feedback in lieu of haptic feedback. For haptic feedback, we focus on skin deformation feedback, which provides compelling information about instrument-tissue interactions with smaller actuators and larger stability margins compared to traditional kinesthetic feedback. For visual feedback, we evaluate the effect of training on human teleoperators’ ability to visually estimate forces through a telesurgical robot. In addition, we design and characterize multimodal deep learning-based methods to estimate interaction forces during tissue manipulation for both automated performance evaluation and delivery of haptics-based training stimuli. Finally, we describe the next generation of soft, flexible surgical instruments and the opportunities and challenges they present for seeing and feeling in robot-assisted surgery.
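For readers unfamiliar with the kind of multimodal force-estimation model mentioned above, the minimal sketch below (assumed PyTorch; the architecture, layer sizes, and inputs are illustrative assumptions, not the speaker's implementation) shows how image features and instrument kinematics might be fused to regress instrument-tissue forces.

```python
# Illustrative multimodal force-estimation sketch: fuse endoscopic-image features
# with instrument kinematics to regress a 3-axis instrument-tissue force.
# All names and dimensions are assumptions for illustration only.
import torch
import torch.nn as nn

class VisionKinematicsForceNet(nn.Module):
    def __init__(self, kin_dim=7):
        super().__init__()
        # small CNN encoder for the camera view
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP encoder for instrument kinematics (e.g., joint positions/velocities)
        self.kinematics = nn.Sequential(nn.Linear(kin_dim, 32), nn.ReLU())
        # fusion head regresses force components in x, y, z
        self.head = nn.Sequential(nn.Linear(32 + 32, 64), nn.ReLU(), nn.Linear(64, 3))

    def forward(self, image, kin):
        fused = torch.cat([self.vision(image), self.kinematics(kin)], dim=1)
        return self.head(fused)

# example usage with random inputs: one 128x128 RGB frame and 7 kinematic values
force = VisionKinematicsForceNet()(torch.rand(1, 3, 128, 128), torch.rand(1, 7))
```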
Allison Okamura received her BS degree from the University of California, Berkeley, and her MS and PhD degrees from Stanford University. She is the Richard W. Weiland Professor of Engineering at Stanford University in the Department of Mechanical Engineering, with a courtesy appointment in Computer Science. She is an IEEE Fellow, and is currently Co-General Chair of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems and a deputy director of the Wu Tsai Neurosciences Institute at Stanford. Her honors include the IEEE Engineering in Medicine and Biology Society Technical Achievement Award, the IEEE Robotics and Automation Society Distinguished Service Award, and selection as a Duca Family University Fellow in Undergraduate Education. Her academic interests include haptics, teleoperation, virtual reality, medical robotics, soft robotics, rehabilitation, and education.
This keynote is part of the Image-Guided Procedures, Robotic Interventions, and Modeling conference.
9:20 AM
Super-resolution ultrasound through localisation and tracking: technical developments and applications
Mengxing Tang, Imperial College London (United Kingdom)
Ultrasound imaging involves an inherent compromise between spatial resolution and penetration depth. Optical super-resolution techniques, such as photo-activated localization microscopy (PALM), have overcome the diffraction limit on spatial resolution, but their limited penetration restricts them to in vitro applications. Super-resolution ultrasound (SRUS), particularly through localising spatially isolated individual microbubbles, has been shown to break the wave diffraction limit and achieve microscopic resolution at centimetre depths, offering great promise for a wide range of clinical applications. In this talk I will introduce the principles of super-resolution ultrasound through localisation and tracking, and describe our technical developments to address some of the current challenges in SRUS and to explore its clinical applications.
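As a rough illustration of the localisation-and-tracking principle outlined above, the sketch below (Python/NumPy; function names, thresholds, and the greedy nearest-neighbour tracker are illustrative assumptions, not the methods presented in the talk) detects isolated bright echoes frame by frame, refines each position to sub-pixel precision with an intensity-weighted centroid, pairs detections across consecutive frames, and accumulates the localisations on a grid finer than the diffraction-limited pixel size.

```python
# Illustrative SRUS sketch: localise isolated microbubble echoes, track them
# across frames, and accumulate localisations on a super-resolved grid.
import numpy as np

def localise_bubbles(frame, threshold, win=2):
    """Return sub-pixel (row, col) positions of bright, isolated spots."""
    positions = []
    rows, cols = frame.shape
    for r in range(win, rows - win):
        for c in range(win, cols - win):
            patch = frame[r - win:r + win + 1, c - win:c + win + 1]
            if frame[r, c] >= threshold and frame[r, c] == patch.max():
                # intensity-weighted centroid refines the peak below pixel size
                rr, cc = np.mgrid[r - win:r + win + 1, c - win:c + win + 1]
                w = patch / patch.sum()
                positions.append((float((rr * w).sum()), float((cc * w).sum())))
    return positions

def track_nearest(prev_pts, curr_pts, max_disp=3.0):
    """Greedy nearest-neighbour pairing between consecutive frames (for velocity)."""
    remaining = list(curr_pts)
    pairs = []
    for p in prev_pts:
        if not remaining:
            break
        d = [np.hypot(p[0] - q[0], p[1] - q[1]) for q in remaining]
        j = int(np.argmin(d))
        if d[j] <= max_disp:
            pairs.append((p, remaining.pop(j)))
    return pairs

def accumulate_map(all_positions, frame_shape, upscale=8):
    """Paint localisations onto a grid `upscale` times finer than the frames."""
    sr = np.zeros((frame_shape[0] * upscale, frame_shape[1] * upscale))
    for r, c in all_positions:
        sr[int(r * upscale), int(c * upscale)] += 1
    return sr
```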
Mengxing Tang is Professor and Chair in Biomedical Imaging at Imperial College London. His main research interest is biomedical ultrasound imaging. He was among the first to demonstrate ultrasound super-resolution imaging, very high-frame-rate contrast echocardiography, and arterial flow quantification. He set up and currently directs the Ultrasound Laboratory for Imaging and Sensing (ULIS) at Imperial College. He has graduated more than 20 PhD students, authored over 100 peer-reviewed SCI journal papers, and, since his appointment at Imperial in 2006, secured over 9 million pounds in research funding from research councils, charities, and industry. He is an Associate Editor of IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, a TPC member of the IEEE International Ultrasonics Symposium (IUS), and Technical Programme Co-Chair of the 2022 IEEE IUS in Venice.
This keynote is part of the Ultrasonic Imaging and Tomography conference.