Tracking temporal progression of benign bone tumors through X-ray based detection and segmentation
Se-Yeol Rhyou, Chohee Bang, Yongjin Cho, Hyunjae Bae, Y. Ha, Sohye Baek, Yeonhu Lee, Choonok Kim, Jeong Eun Moon
Published 2025 in Scientific Reports
ABSTRACT
X-ray is the most widely used imaging modality for the initial diagnosis of bone tumors due to its accessibility and cost-effectiveness. However, the longitudinal comparison of benign bone tumors (BBTs), particularly the assessment of size and shape progression over time, remains largely manual and subjective. In this study, we propose FusionX-BBTNet, a deep learning-based framework that enables automated detection, segmentation, and time-sequential analysis of BBTs from X-ray images. The framework combines YOLO-based object detection with U-Net segmentation and uses a novel wavelet-enhanced dataset to improve contour accuracy. To enable real-world quantification, an OCR-based module extracts the X-ray scale bar and computes the pixel-to-length conversion ratio, from which tumor size and area are calculated in millimeters and square millimeters; changes over time are visualized through centroid-based alignment. The proposed method was validated on a dataset of 466 expert-annotated X-ray images, achieving a mean IoU of 0.9376 and a boundary F1 score of 0.9827. In addition to providing reliable tumor localization and measurement, the system supports clinical decision-making by offering intuitive shape and area comparisons. This approach has the potential to complement expert interpretation and improve diagnostic efficiency, especially in settings with limited radiological expertise.
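The abstract describes two quantification steps that are simple to make concrete: converting segmented pixels to real-world area via a pixel-to-millimeter ratio, and centroid-based alignment of tumor masks from different time points. The sketch below is a minimal illustration, not the authors' implementation; mm_per_pixel stands in for the ratio the paper derives from an OCR-read scale bar, and the binary masks are synthetic placeholders.

import numpy as np

def mask_area_mm2(mask: np.ndarray, mm_per_pixel: float) -> float:
    """Area of a binary tumor mask in square millimeters."""
    n_pixels = int(np.count_nonzero(mask))
    return n_pixels * mm_per_pixel ** 2

def centroid(mask: np.ndarray) -> np.ndarray:
    """Centroid (row, col) of the foreground pixels of a binary mask."""
    coords = np.argwhere(mask)
    return coords.mean(axis=0)

def align_by_centroid(mask: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Shift `mask` so its centroid coincides with that of `reference`.

    np.roll wraps around the image border, which is acceptable here
    because the shifts between registered follow-up scans are small.
    """
    shift = np.round(centroid(reference) - centroid(mask)).astype(int)
    return np.roll(np.roll(mask, shift[0], axis=0), shift[1], axis=1)

# Synthetic example: align a follow-up mask onto the baseline mask so
# shape and area changes can be overlaid and compared directly.
baseline = np.zeros((64, 64), dtype=bool); baseline[20:30, 20:32] = True
followup = np.zeros((64, 64), dtype=bool); followup[35:48, 30:44] = True
aligned = align_by_centroid(followup, baseline)
print(mask_area_mm2(baseline, 0.2), mask_area_mm2(followup, 0.2))

Overlaying the centroid-aligned masks is the kind of intuitive shape and area comparison the abstract attributes to the system; the actual pipeline additionally handles detection, segmentation, and scale-bar OCR upstream of these steps.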
PUBLICATION RECORD
- Publication year
2025
- Venue
Scientific Reports
- Publication date
2025-11-11
- Fields of study
Medicine, Computer Science, Engineering
- Source metadata
Semantic Scholar, PubMed