Robotics from the bench: Research for ultrasound automation with augmented reality visualization

Authors

  • Felix von Haxthausen Universität zu Lübeck, Institut für Robotik und Kognitive Systeme
  • Sven Böttger Universität zu Lübeck, Institut für Robotik und Kognitive Systeme
  • Markus Kleemann Klinik für Chirurgie Universitätsklinikum Schleswig-Holstein-Campus Lübeck, Bereich Gefäß- und endovaskuläre Chirurgie
  • Floris Ernst Universität zu Lübeck, Institut für Robotik und Kognitive Systeme
  • Achim Schweikard Universität zu Lübeck, Institut für Robotik und Kognitive Systeme

Keywords:

robotic ultrasound, automated ultrasound, ultrasound navigation, augmented reality

Abstract

Summary: Ultrasound imaging, commonly used for diagnostics, can also be used for radiation-free catheter and needle navigation. However, ultrasound image acquisition and interpretation require considerable skill and expertise, making the results strongly operator-dependent. Automated ultrasound image acquisition could overcome this operator dependency. We are currently developing a medical robotic device platform for the automation and standardization of diagnostic ultrasound imaging as well as for operator-free automated catheter and needle navigation.

A prototype of this robot-supported ultrasound platform was built, on which various medical examination procedures can be developed. Three core technologies were applied:

  • A force-sensitive 7-DoF robot arm (KUKA LBR iiwa) that can be positioned both automatically and manually over a target area on the body surface. The arm provides collision avoidance and dynamically maintains a force-controlled contact of the transducer with the patient.
  • 3D ultrasound, realized by a matrix probe and appropriate reconstruction algorithms on a modified ultrasound station (GE Vivid 7). It enables recording of large areas at high frame rates, so the target, its surroundings and nearby navigation landmarks can be recorded simultaneously.
  • Streaming of 3D ultrasound data from the ultrasound system to Microsoft HoloLens glasses, with visualization at a defined distance from the ultrasound probe by tracking augmented reality markers.
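The force-controlled transducer contact described above can be sketched as a simple proportional regulator along the probe axis. This is a minimal illustration, not the actual KUKA iiwa control interface; the target force, gain and step limit are hypothetical values chosen for the example:

```python
def force_control_step(measured_force_n, target_force_n=5.0,
                       gain_m_per_n=0.0005, max_step_m=0.002):
    """One cycle of a proportional force regulator along the probe axis.

    Returns the displacement (in metres) to move the probe toward or
    away from the skin so that the measured contact force approaches
    the target force. A positive value pushes the probe further onto
    the surface.
    """
    error_n = target_force_n - measured_force_n
    step_m = gain_m_per_n * error_n
    # Clamp the step so a force spike cannot trigger a large, unsafe motion.
    return max(-max_step_m, min(max_step_m, step_m))
```

In a real system such a loop would run at the robot's control rate, with the force reading taken from the arm's joint-torque-based force estimate; impedance control as offered by the LBR iiwa achieves the same goal more robustly.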

The functionality was verified using an ultrasound phantom (BluePhantom FAST Trauma). The result is a 4D volume data set for physician-assisted diagnostic evaluation. The process delivered reproducible real-time visualization on a workstation, where the volumes were simultaneously visualized and stored. Ultrasound volume data of the training model (matrix size 103 × 74 × 134) were streamed from the ultrasound system to the HoloLens for display, with a latency of 259 ± 86 ms.
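As a rough plausibility check on these numbers (assuming one byte per voxel; the actual GE data format is not specified here), the raw payload per volume and a mean ± standard deviation over latency samples can be computed as:

```python
import statistics

def volume_bytes(nx, ny, nz, bytes_per_voxel=1):
    """Raw payload size of one ultrasound volume in bytes."""
    return nx * ny * nz * bytes_per_voxel

def latency_summary(samples_ms):
    """Mean and (population) standard deviation of latency samples in ms."""
    return statistics.mean(samples_ms), statistics.pstdev(samples_ms)

# One 103 x 74 x 134 volume at one byte per voxel:
print(volume_bytes(103, 74, 134))  # 1021348 bytes, i.e. about 1 MB per volume
```

At the reported latency, roughly 1 MB per volume must traverse the network and be rendered on the HoloLens within a few hundred milliseconds, which is consistent with streaming over a standard wireless link.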

Automated ultrasound diagnostics and navigation should help to free up clinical resources in the future, enable better reproducibility of imaging, and reduce side effects from radiation exposure. The device platform will serve as the basis for further automated ultrasound diagnostic and therapy procedures.


Published

2020-02-15

Section

What is to Come?