#5554. Fully automated lumen and vessel contour segmentation in intravascular ultrasound datasets
Publication date: August 2026
Proposal available till: 21-05-2025
Total number of authors per manuscript: 4 | 0 $
The journal title is disclosed only to authors who have already paid.
Journal’s subject area:
Radiology, Nuclear Medicine and Imaging;
Health Informatics;
Computer Graphics and Computer-Aided Design;
Radiological and Ultrasound Technology;
Computer Vision and Pattern Recognition.
Places in the authors’ list:
1st place - free (for sale)
2nd place - free (for sale)
3rd place - free (for sale)
4th place - free (for sale)
Abstract:
Segmentation of lumen and vessel contours in intravascular ultrasound (IVUS) pullbacks is an arduous and time-consuming task that demands adequately trained personnel. In the present study, we propose a machine learning approach to automatically extract lumen and vessel boundaries from IVUS datasets. The proposed approach cascades a deep neural network, which delivers a preliminary segmentation, with a Gaussian process (GP) regressor that constructs the final lumen and vessel contours. A multi-frame convolutional neural network (MFCNN) exploits the adjacency information present in longitudinally neighboring IVUS frames, while the GP regressor filters high-dimensional noise, delivering a consistent representation of the contours. Overall, 160 IVUS pullbacks (63 patients) from the IBIS-4 study (Integrated Biomarkers and Imaging Study-4, Trial NCT00962416) were used in the present work. The MFCNN was trained with 100 IVUS pullbacks (8427 manually segmented frames), validated with 30 IVUS pullbacks (2583 manually segmented frames), and blindly tested with 30 IVUS pullbacks (2425 manually segmented frames). Image and contour metrics were used to characterize model performance by comparing ground-truth (GT) and machine learning (ML) contours.
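The second stage of the described pipeline, GP regression over a noisy preliminary contour, can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the contour is parameterized in polar coordinates (angle to radius), and the MFCNN output is replaced here by a synthetic noisy contour.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic stand-in for the CNN's preliminary segmentation:
# noisy radii of a roughly elliptical lumen, sampled at 72 angles.
theta = np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False)
true_radius = 2.0 + 0.3 * np.sin(2.0 * theta)            # "ground-truth" contour
noisy_radius = true_radius + rng.normal(0.0, 0.15, theta.size)

# GP regressor: the RBF kernel models the smooth contour,
# the WhiteKernel absorbs the per-point segmentation noise.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.05)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(theta[:, None], noisy_radius)

# Posterior mean = smoothed, consistent contour representation.
smooth_radius = gp.predict(theta[:, None])

rmse_noisy = np.sqrt(np.mean((noisy_radius - true_radius) ** 2))
rmse_gp = np.sqrt(np.mean((smooth_radius - true_radius) ** 2))
print(f"RMSE noisy: {rmse_noisy:.3f}, RMSE GP: {rmse_gp:.3f}")
```

For a closed contour, a periodic kernel (e.g. `ExpSineSquared` with period 2π) would be the more principled choice than the plain RBF used here; the RBF keeps the sketch short.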
Keywords:
Deep learning; Gaussian process; IVUS; Lumen; Segmentation; Vessel
Contacts: