Surgical phase recognition (SPR) is a crucial element in the digital transformation of the modern operating theater. While SPR based on video sources is well-established, incorporation of interventional X-ray sequences has not yet been explored. This paper presents Pelphix, a first approach to SPR for X-ray-guided percutaneous pelvic fracture fixation, which models the procedure at four levels of granularity – corridor, activity, view, and frame value – simulating the pelvic fracture fixation workflow as a Markov process to provide fully annotated training data. Using added supervision from detection of bony corridors, tools, and anatomy, we learn image representations that are fed into a transformer model to regress surgical phases at the four granularity levels. Our approach demonstrates the feasibility of X-ray-based SPR, achieving an average accuracy of 93.8% on simulated sequences and 67.57% on cadaver data across all granularity levels, with up to 88% accuracy for the target corridor in real data. This work constitutes the first step toward SPR for the X-ray domain, establishing an approach to categorizing phases in X-ray-guided surgery, simulating realistic image sequences to enable machine learning model development, and demonstrating that this approach is feasible for the analysis of real procedures. As X-ray-based SPR continues to mature, it will benefit procedures in orthopedic surgery, angiography, and interventional radiology by equipping intelligent surgical systems with situational awareness in the operating room.
The simulated training and validation data can be downloaded from the links below.
| Download | Training Images | Val Images | Download Size |
|---|---|---|---|
| pelphix_000338 | 139,922 | 4,285 | 3.2 GB |
| pelphix_000339 | 139,787 | 4,230 | 3.2 GB |
| **Total** | **279,709** | **8,515** | **6.4 GB** |
Sequences from our cadaveric experiments are available from the following links:
| Download | Images | Download Size |
|---|---|---|
| liverpool | 256 | 1.2 GB |
Clone the repository:

```shell
git clone --recursive git@github.com:benjamindkilleen/pelphix.git
```

Install and activate the conda environment with:

```shell
conda env create -f environment.yaml
conda activate pelphix
```
Individual experiments can be run by specifying the `experiment` argument to `main.py`. For example:

```shell
python main.py experiment={ssm,generate,pretrain,train,test} [options]
```

- `ssm` runs the statistical shape model to propagate annotations.
- `generate` generates simulated datasets for sequences and view-invariant (totally random) sampling.
- `pretrain` pre-trains the model on view-invariant data.
- `train` trains the model on simulated sequences.
- `test` tests the model on simulated sequences and cadaver data.
See `conf/config.yaml` for a full list of options. Common variations are:

- `ckpt=/path/to/last.ckpt` to load a model checkpoint for resuming training or running inference.
- `gpus=n` to use `n` GPUs.
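Putting these together, a typical invocation might look like the following. This is an illustrative sketch, not a command from the paper: the checkpoint path and GPU count are placeholders to adjust for your own setup.

```shell
# Resume training on simulated sequences with 2 GPUs,
# loading weights from a previous run's checkpoint.
# The checkpoint path below is a placeholder.
python main.py experiment=train gpus=2 ckpt=/path/to/last.ckpt
```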
If you found this work useful, please cite our paper:
@incollection{Killeen2023Pelphix,
author = {Killeen, Benjamin D. and Zhang, Han and Mangulabnan, Jan and Armand, Mehran and Taylor, Russell H. and Osgood, Greg and Unberath, Mathias},
title = {{Pelphix: Surgical Phase Recognition from X-Ray Images in Percutaneous Pelvic Fixation}},
booktitle = {{Medical Image Computing and Computer Assisted Intervention {\textendash} MICCAI 2023}},
pages = {133--143},
year = {2023},
month = oct,
isbn = {978-3-031-43996-4},
publisher = {Springer},
address = {Cham, Switzerland},
doi = {10.1007/978-3-031-43996-4_13}
}
If you use the simulated data, please also cite the NMDID database:
@misc{NMDID2020,
author = {Edgar, HJH and Daneshvari Berry, S and Moes, E and Adolphi, NL and Bridges, P and Nolte, KB},
title = {New Mexico Decedent Image Database},
year = {2020},
howpublished = {Office of the Medical Investigator, University of New Mexico},
doi = {10.25827/5s8c-n515},
}