Prostate cancer is one of the leading causes of death today, so how to diagnose and predict it has received considerable attention from researchers. Collaboration between computer scientists and medical experts has produced a new approach to analyzing these data and building useful, practical models: deep learning. Deep learning, as a leading tool for analyzing data, discovering relationships within them, and predicting events, has become a mainstay of research in this area. This study segments and classifies prostate cancer using a deep learning approach, with architectures benchmarked on the ImageNet dataset, based on a method for identifying factors affecting the disease. In the proposed method, the data are first augmented by removing the dominant noise from the MRI images; the images are then segmented with a deep-learning-based network, Faster R-CNN, after which features are extracted and classified with various deep learning architectures, achieving suitable accuracy and speed in detection and classification. The aim of this study is to reduce unnecessary biopsies and to support the physician and patient in choosing and planning treatment. This goal was achieved with minimal error in diagnosing malignant lesions: a sensitivity of 93.54% and an AUC of 95% with the ResNet50 architecture.
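The two figures reported above follow from standard definitions. The sketch below (plain Python; the counts and scores are illustrative assumptions, not the paper's data) computes sensitivity from confusion-matrix counts and AUC via the rank-based Mann-Whitney formula:

```python
def sensitivity(tp, fn):
    # True-positive rate: fraction of malignant lesions correctly flagged.
    return tp / (tp + fn)

def auc(scores, labels):
    # Rank-based AUC (Mann-Whitney): probability that a randomly chosen
    # positive case scores higher than a randomly chosen negative case;
    # ties count as half a win.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical counts: 29 of 31 malignant cases detected.
print(round(sensitivity(29, 2), 4))  # 0.9355, i.e. ~93.54%
print(auc([0.9, 0.8, 0.7, 0.4, 0.3, 0.2], [1, 1, 1, 0, 0, 1]))  # 0.75
```

In practice such metrics would be computed on the held-out development set after classification, e.g. with `sklearn.metrics.roc_auc_score`.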
Transfer learning; powerful and fast segmentation and classification prostate cancer from MRI scans, in the development set
Neda Pirzad Mashak, G. Akbarizadeh, E. Farshidi
Published 2023 in Journal of Intelligent & Fuzzy Systems
PUBLICATION RECORD
- Publication year
2023
- Venue
Journal of Intelligent & Fuzzy Systems
- Publication date
2023-05-19
- Fields of study
Medicine, Computer Science
- Source metadata
Semantic Scholar