Volume 34, Issue 1 (2-2020)                   Med J Islam Repub Iran 2020
Eye Research Center, The Five Senses Institute, Iran University of Medical Sciences, Tehran, Iran , mirshahi.r@iums.ac.ir
Abstract:
Background: Lung CT scan plays a pivotal role in the diagnosis and monitoring of COVID-19 patients, and with the growing number of affected individuals, the need for artificial intelligence (AI)-based systems for the interpretation of CT images is emerging. In the current investigation, we introduce a new deep learning-based automatic segmentation model for the localization of COVID-19 pulmonary lesions.
   Methods: A total of 2469 CT scan slices, comprising 1402 manually segmented abnormal slices and 1067 normal slices from 55 COVID-19 patients and 41 healthy individuals, were used to train a deep convolutional neural network (CNN) model based on Detectron2, an open-source modular object detection library. A separate dataset, including 1224 CT slices from 18 COVID-19 patients and 9 healthy individuals, was used to test the model.
   Results: The accuracy, sensitivity, and specificity of the trained model in marking a single image slice as containing a COVID-19 lesion were 0.954, 0.928, and 0.961, respectively. Using a threshold of 0.4% for the percentage of lung involvement, the model was capable of diagnosing patients with COVID-19 pneumonia with a sensitivity of 0.982 and a specificity of 0.885. Furthermore, the mean Intersection over Union (IoU) index for the test dataset was 0.865.
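The evaluation metrics reported above (slice-level accuracy, sensitivity, and specificity; mask IoU; and the patient-level 0.4% lung-involvement threshold) can be sketched as follows. This is a minimal illustration with NumPy; the function names and the pixel-count inputs are assumptions for demonstration, not the authors' actual evaluation code.

```python
import numpy as np

def confusion_metrics(y_true, y_pred):
    """Slice-level accuracy, sensitivity, and specificity from binary labels
    (1 = slice contains a COVID-19 lesion, 0 = normal slice)."""
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    tp = np.sum(y_true & y_pred)
    tn = np.sum(~y_true & ~y_pred)
    fp = np.sum(~y_true & y_pred)
    fn = np.sum(y_true & ~y_pred)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return accuracy, sensitivity, specificity

def mask_iou(pred_mask, gt_mask):
    """Intersection over Union between predicted and ground-truth
    binary segmentation masks."""
    pred_mask = np.asarray(pred_mask, dtype=bool)
    gt_mask = np.asarray(gt_mask, dtype=bool)
    inter = np.sum(pred_mask & gt_mask)
    union = np.sum(pred_mask | gt_mask)
    return inter / union if union else 1.0

def patient_positive(lesion_area_px, lung_area_px, threshold_pct=0.4):
    """Patient-level call: positive if segmented lesion area exceeds
    threshold_pct percent of total lung area (0.4% in this study)."""
    return 100.0 * lesion_area_px / lung_area_px >= threshold_pct
```

The patient-level rule simply aggregates the per-slice segmentations into a single involvement percentage, which is why one scalar threshold suffices for the diagnosis step.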
   Conclusion: The deep learning-based automatic segmentation method provides acceptable accuracy in the delineation and localization of COVID-19 lesions, assisting clinicians and researchers in quantifying abnormal findings in chest CT scans. Moreover, instance segmentation is capable of monitoring longitudinal changes in the lesions, which could be beneficial for patients' follow-up.
Full-Text [PDF 1337 kb]
Type of Study: Original Research | Subject: Radiology

Rights and permissions
Creative Commons License This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.