
Event Date: June 3, 2024 (In-Person)

Overview

Attendees of this tutorial will learn how to use Gazebo (formerly known as Ignition), an open-source dynamic simulation platform, in combination with 3D Slicer, an open-source medical image computing platform, to simulate image-guided robot-assisted interventions (IGRI).

Dynamic simulations have become an indispensable tool in developing mobile and industrial robotic systems. Gazebo supports a broad range of robots and sensors, such as GPS, IMU, cameras and LIDAR, allowing developers to simulate the interaction between a robot and its environment before building a hardware system. However, the benefits to image-guided robot-assisted interventions have been limited due to the lack of integration with medical image computing environments and the lack of support for simulating sensors typically used in medical robotics, particularly imaging scanners such as ultrasound, MRI and CT.

Fortunately, these limitations have diminished in recent years thanks to efforts to integrate medical imaging and robotics software environments. As demonstrated in our previous workshop at ISMR’23, a new Slicer plug-in, SlicerROS2, facilitates the integration of 3D Slicer with hardware and software supported by ROS2 and enables rapid prototyping of an IGRI system. Gazebo, which has been extensively used in the ROS community, can be incorporated into such an IGRI system seamlessly. By introducing reliable dynamic simulation to the IGRI community, we hope to remove the barrier of purchasing, integrating and maintaining heterogeneous equipment, and to accelerate the development cycle by eliminating time-consuming calibration procedures.

Alongside ROS2, Gazebo (formerly called Ignition) introduces a new environment for dynamic simulation in which robots, sensors and environments can be modeled. Although Gazebo supports a broad range of robots and sensors, such as GPS, IMU, cameras and LIDAR, these are mainly used in non-medical applications. Gazebo, however, supports sensor plugins, through which new sensors (and actuators) can be implemented and loaded into a simulated world. In this tutorial we will demonstrate how to extend the capability of Gazebo to simulate an ultrasound probe mounted on a robot. Participants will load a patient model into a simulated world along with a robot-held ultrasound probe and a second robot equipped with a biopsy needle. The images from the probe will be produced by a Gazebo ultrasound plugin and visualized in 3D Slicer using SlicerROS2 (introduced at ISMR’23). Participants will then define a target position in 3D Slicer to command the needle-carrying robot to move to that target.
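As a rough illustration, a simulated world of this kind could be described in SDF along the following lines. This is a minimal sketch only: the model names and the `igri::UltrasoundSystem` plugin are hypothetical stand-ins for the custom ultrasound plugin developed in the tutorial, not part of stock Gazebo.

```xml
<?xml version="1.0"?>
<sdf version="1.9">
  <world name="igri_demo">
    <!-- Patient anatomy loaded as a static model (hypothetical model name) -->
    <include>
      <uri>model://patient_phantom</uri>
      <static>true</static>
    </include>

    <!-- Robot holding the ultrasound probe -->
    <include>
      <uri>model://probe_robot</uri>
    </include>

    <!-- Second robot carrying the biopsy needle -->
    <include>
      <uri>model://needle_robot</uri>
    </include>

    <!-- Hypothetical custom sensor plugin producing ultrasound images;
         the plugin filename, name and parameters are illustrative only -->
    <plugin filename="gz-sim-ultrasound-system" name="igri::UltrasoundSystem">
      <topic>/ultrasound/image</topic>
      <update_rate>20</update_rate>
    </plugin>
  </world>
</sdf>
```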

Participants in this tutorial will walk through designing and building a Gazebo plugin to fill specific IGRI needs, configuring its sensor parameters, and loading and running a simulation. The communication between Gazebo and ROS2 will also be presented, and finally SlicerROS2 will be used for rendering the images in Slicer. Participants will also learn how to avoid common pitfalls of simulated environments by promoting modularity and preserving compatibility with real hardware.
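For the Gazebo-to-ROS2 communication step, one common approach is the `ros_gz_bridge` package, which can relay topics between the two systems from a YAML configuration file. The fragment below is a sketch with hypothetical topic names for the simulated ultrasound images and the needle robot's target pose; the actual topics used in the tutorial may differ.

```yaml
# Hypothetical ros_gz_bridge configuration (topic names are illustrative).
# Relays simulated ultrasound images from Gazebo into ROS 2, where
# SlicerROS2 can subscribe to them, and sends target poses back to Gazebo.
- ros_topic_name: "/ultrasound/image"
  gz_topic_name: "/ultrasound/image"
  ros_type_name: "sensor_msgs/msg/Image"
  gz_type_name: "gz.msgs.Image"
  direction: GZ_TO_ROS
- ros_topic_name: "/needle_robot/target_pose"
  gz_topic_name: "/needle_robot/target_pose"
  ros_type_name: "geometry_msgs/msg/PoseStamped"
  gz_type_name: "gz.msgs.Pose"
  direction: ROS_TO_GZ
```

Assuming a ros_gz installation, such a file can be loaded with `ros2 run ros_gz_bridge parameter_bridge --ros-args -p config_file:=bridge.yaml`.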

Finally, the steps for translating from simulation to real hardware will be demonstrated with a small-scale IGRI task that was previously debugged and tested in simulation. Although the emphasis of this tutorial is on simulation, modularity best practices will be presented to facilitate the dissemination of packages within the community.

Keywords: Image-guided interventions, navigation, open-source software, software-hardware integration, dynamic simulation

Intended audience

Researchers, engineers, and students working in the field of medical robotics and image-guided interventions are welcome to join. The tutorial will be particularly useful for those who are, or will be, engaged in the design, implementation, and clinical translation of a system for image-guided robot-assisted interventions. We strongly recommend that attendees bring their own laptop computers to follow the hands-on tutorial during the session. While the tutorial will not involve coding, some experience running commands on a UNIX-like system and compiling open-source software using Make or CMake would be helpful.

Time Table

TO BE ANNOUNCED.

Links

References

Past Workshops

Organizers

Acknowledgements

This work is supported in part by:

  • U.S. National Institutes of Health (R01EB020667, R01EB020667-05S1, R01EB020610, P41EB028741)
  • Ontario Consortium for Adaptive Interventions in Radiation Oncology (OCAIRO)
  • SparKit project
  • CANARIE’s Research Software Program


Contact

Junichi Tokuda, Ph.D.

Associate Professor of Radiology, Brigham and Women’s Hospital / Harvard Medical School

tokuda at bwh.harvard.edu