Machine learning and coresets for automated real-time video segmentation of laparoscopic and robot-assisted surgery


Bibliographic Details
Main Authors: Volkov, Mikhail (Author), Hashimoto, Daniel A. (Author), Rosman, Guy (Author), Meireles, Ozanan R. (Author), Rus, Daniela L. (Author)
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory (Contributor)
Format: Article
Language: English
Published: IEEE, 2017.
Description
Summary: © 2017 IEEE. Context-aware segmentation of laparoscopic and robot-assisted surgical video has been shown to improve performance and perioperative workflow efficiency, and can be used for education and time-critical consultation. Modern pressures on productivity preclude manual video analysis, and hospital policies and legacy infrastructure often prohibit recording and storing large amounts of data. In this paper we present a system that automatically segments video of laparoscopic and robot-assisted procedures into their underlying surgical phases using minimal computational resources and a small amount of training data. Our system combines an SVM and an HMM with an augmented feature space that captures the variability of these video streams without requiring analysis of the nonrigid and variable surgical environment. By using the data reduction capabilities of online k-segment coreset algorithms, we can efficiently produce results of approximately equal quality in real time. We evaluate our system in cross-validation experiments and propose a blueprint for piloting such a system in a real operating room environment with minimal risk factors.
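The two-stage pipeline the abstract describes (a per-frame SVM classifier whose outputs are temporally smoothed by an HMM) can be illustrated with a minimal Python sketch. Everything below is hypothetical: the 16-dimensional frame features, the four surgical phases, the "sticky" transition prior, and the hand-rolled Viterbi pass stand in for the paper's actual features, models, and coreset-based streaming, which are not reproduced here.

import numpy as np
from sklearn.svm import SVC

def viterbi(log_emit, log_trans, log_prior):
    """Most likely phase sequence given per-frame log-probabilities."""
    T, K = log_emit.shape
    dp = np.zeros((T, K))
    back = np.zeros((T, K), dtype=int)
    dp[0] = log_prior + log_emit[0]
    for t in range(1, T):
        # scores[i, j]: best score ending in phase j, arriving from phase i
        scores = dp[t - 1][:, None] + log_trans
        back[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + log_emit[t]
    path = np.zeros(T, dtype=int)
    path[-1] = dp[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

# Hypothetical training data: per-frame feature vectors with phase labels.
rng = np.random.default_rng(0)
K, T = 4, 300                                  # phases, video frames
X_train = rng.normal(size=(1000, 16))
y_train = rng.integers(0, K, size=1000)

# Stage 1: SVM gives per-frame phase probabilities.
svm = SVC(probability=True).fit(X_train, y_train)

# Stage 2: "sticky" HMM transitions encode that phases persist over
# many consecutive frames; Viterbi decoding smooths the SVM output.
trans = np.full((K, K), 0.01 / (K - 1))
np.fill_diagonal(trans, 0.99)

X_video = rng.normal(size=(T, 16))             # features for one test video
log_emit = np.log(svm.predict_proba(X_video) + 1e-12)
phases = viterbi(log_emit, np.log(trans), np.log(np.full(K, 1.0 / K)))
print(phases[:20])

In a streaming setting, the paper's online k-segment coresets would compress the incoming feature stream before this kind of decoding, which is how the system keeps the computation real-time; that compression step is omitted from the sketch above.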