Deep convolutional neural networks for the detection of macular diseases from optical coherence tomography images


Waluya, S. B. (2020) Deep convolutional neural networks for the detection of macular diseases from optical coherence tomography images. Journal of Physics: Conf. Series, 1613. pp. 1-6. ISSN 1742-6588


Abstract

The purpose of this research is to design a system that recognizes CNV, DME, Drusen, and Normal images produced by Optical Coherence Tomography (OCT). The system provides a training model, an evaluation, and accuracy values. We used the Convolutional Neural Network method with default parameters of 50 epochs, a stride of one, 83,484 training images, and a learning rate of 0.001, implemented in Python 3.7. Varying the number of epochs, the stride, the amount of training data, and the learning rate produced different accuracy values. Among the epoch variations, the best result was 50 epochs, with an accuracy of 0.99 and a validation loss of 0.2034. The best stride value was one, with an accuracy of 0.99 and a validation loss of 0.2267. The best training-data size was 83,484 images, with an accuracy of 0.99 and a validation loss of 0.2524. The best learning rate was 0.0001, with an accuracy of 0.992 and a validation loss of 0.2524. Overall, the convolutional neural network achieved its best model, with an accuracy of 0.992, using 50 epochs, a stride of one, 83,484 training images, and a learning rate of 0.0001.
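The stride parameter varied in the study controls how far the convolution window moves between positions, which determines the spatial size of the feature map. As an illustration only (this is a minimal NumPy sketch, not the authors' code; the `conv2d` helper and the 5×5 example are hypothetical), a valid 2-D convolution with a configurable stride can be written as:

```python
import numpy as np

def conv2d(image, kernel, stride=1):
    """Valid 2-D convolution (cross-correlation) with a configurable stride."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    oh = (ih - kh) // stride + 1  # output height shrinks as stride grows
    ow = (iw - kw) // stride + 1  # output width likewise
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i * stride:i * stride + kh,
                          j * stride:j * stride + kw]
            out[i, j] = np.sum(patch * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)
kernel = np.ones((3, 3))
print(conv2d(image, kernel, stride=1).shape)  # (3, 3)
print(conv2d(image, kernel, stride=2).shape)  # (2, 2)
```

With a 5×5 input and a 3×3 kernel, a stride of one yields a 3×3 feature map, while a stride of two yields 2×2; a stride of one, as used in the paper's best model, preserves the most spatial detail.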

Item Type: Article
Subjects: Q Science > QA Mathematics
Faculty: UNSPECIFIED
Depositing User: Repositori Dosen Unnes
Date Deposited: 08 May 2023 06:49
Last Modified: 08 May 2023 06:49
URI: http://lib.unnes.ac.id/id/eprint/58077
