
Proceedings Paper

Trackerless surgical image-guided system design using an interactive extension of 3D Slicer

Paper Abstract

Conventional optical tracking systems use cameras sensitive to near-infrared (NIR) light together with passively or actively NIR-illuminated markers to localize instrumentation and the patient within operating room (OR) physical space. This technology is widely used in the neurosurgical theatre and is a staple of the standard of care in craniotomy planning. To accomplish this, planning is largely conducted at the time of the procedure with the patient's head fixed in its OR presentation orientation. In the work presented herein, we propose a framework that achieves this planning in the OR free of conventional tracking technology, i.e., a trackerless approach. Briefly, we are investigating a collaborative extension of 3D Slicer that combines surgical planning and craniotomy designation in a novel manner. While taking advantage of the well-developed 3D Slicer platform, we implement advanced features to aid the neurosurgeon in planning the location of the anticipated craniotomy relative to the preoperatively imaged tumor in a physical-to-virtual setup, and then subsequently aid the true physical procedure by correlating that physical-to-virtual plan with a novel intraoperative MR-to-physical registered field-of-view display. With these steps, the craniotomy can be designated without the use of conventional optical tracking technology. To test this novel approach, an experienced neurosurgeon performed experiments on four different mock surgical cases using our module as well as the conventional procedure for comparison. The results suggest that our planning system provides a simple, cost-efficient, and reliable solution for surgical planning and delivery without the use of conventional tracking technologies.
We hypothesize that the combination of this early-stage craniotomy planning and delivery approach, and our past developments in cortical surface registration and deformation tracking using stereo-pair data from the surgical microscope may provide a fundamental new realization of an integrated trackerless surgical guidance platform.
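The MR-to-physical registration mentioned in the abstract is, in image-guided surgery, typically computed as a least-squares rigid alignment of paired landmarks (the standard SVD-based point registration). The paper does not give its implementation; the sketch below is only an illustration of that general technique, with illustrative function and variable names, not the authors' code:

```python
import numpy as np

def rigid_register(mr_points, phys_points):
    """Least-squares rigid (rotation + translation) alignment of paired
    3D landmark sets via SVD. mr_points and phys_points are (N, 3)
    arrays of corresponding fiducials in image and physical space."""
    mr_c = mr_points.mean(axis=0)
    ph_c = phys_points.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (mr_points - mr_c).T @ (phys_points - ph_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = ph_c - R @ mr_c
    return R, t

def fiducial_registration_error(R, t, mr_points, phys_points):
    """RMS distance between transformed MR landmarks and their
    physical-space counterparts."""
    mapped = mr_points @ R.T + t
    return float(np.sqrt(((mapped - phys_points) ** 2).sum(axis=1).mean()))
```

The fiducial registration error gives a quick sanity check on any such alignment before it is used to drive a registered field-of-view display.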

Paper Details

Date Published: 13 March 2018
PDF: 10 pages
Proc. SPIE 10576, Medical Imaging 2018: Image-Guided Procedures, Robotic Interventions, and Modeling, 105761F (13 March 2018); doi: 10.1117/12.2295229
Author Affiliations:
Xiaochen Yang, Vanderbilt Univ. (United States)
Rohan Vijayan, Vanderbilt Univ. (United States)
Ma Luo, Vanderbilt Univ. (United States)
Logan W. Clements, Vanderbilt Univ. (United States)
Vanderbilt Institute for Surgery and Engineering (United States)
Reid C. Thompson, Vanderbilt Univ. Medical Ctr. (United States)
Vanderbilt Institute for Surgery and Engineering (United States)
Benoit M. Dawant, Vanderbilt Univ. (United States)
Vanderbilt Univ. Medical Ctr. (United States)
Vanderbilt Institute for Surgery and Engineering (United States)
Michael I. Miga, Vanderbilt Univ. (United States)
Vanderbilt Univ. Medical Ctr. (United States)
Vanderbilt Institute for Surgery and Engineering (United States)


Published in SPIE Proceedings Vol. 10576:
Medical Imaging 2018: Image-Guided Procedures, Robotic Interventions, and Modeling
Baowei Fei; Robert J. Webster III, Editor(s)

© SPIE.