Classification Analysis of Back propagation-Optimized CNN Performance in Image Processing

  • Putrama Alkhairi STIKOM Tunas Bangsa
  • Agus Perdana Windarto STIKOM Tunas Bangsa
Keywords: CNN, VGG16 Architecture, Backpropagation, Optimization, Classification, Animals

Abstract

This study aims to optimize the performance of a Convolutional Neural Network (CNN) on an image classification task by applying data augmentation and fine-tuning techniques to a case study of mammal classification. We used a fairly complex image classification dataset and trained a CNN as the base model, comparing its performance under different backpropagation-based optimizers: a VGG16 architecture trained with the Adam optimizer was compared against the same architecture trained with stochastic gradient descent (SGD). We also reviewed several related studies and the basic concepts of CNNs, such as convolution, pooling, and fully connected layers. The research methodology involves building the dataset with data augmentation, training the model with fine-tuning, and evaluating model performance with several metrics, including accuracy, precision, and recall. The results show that these techniques improved the performance of the CNN model on a complex image classification task, identifying and monitoring animal species more accurately, with an accuracy of 91.18% for the best model. Model accuracy increased by 2% after data augmentation and fine-tuning were applied. These results indicate that the techniques applied in this study can be a good alternative for improving CNN performance in image classification tasks.
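The abstract reports performance in terms of accuracy, precision, and recall. As a minimal sketch (not the authors' code), the snippet below computes these three metrics from binary ground-truth labels and predictions; the paper's multi-class setting would apply the same per-class definitions and average them.

```python
def evaluate(y_true, y_pred):
    """Compute accuracy, precision, and recall for binary labels (0/1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0  # of predicted positives, how many were right
    recall = tp / (tp + fn) if (tp + fn) else 0.0     # of actual positives, how many were found
    return accuracy, precision, recall

# Toy example with hypothetical labels, not data from the study.
acc, prec, rec = evaluate([1, 0, 1, 1, 0, 0], [1, 0, 0, 1, 1, 0])
```

In practice a library implementation (e.g. scikit-learn's `precision_score` and `recall_score`) would be used, but the arithmetic is the same.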



Published
2023-03-31
How to Cite
Putrama Alkhairi, & Windarto, A. P. (2023). Classification Analysis of Back propagation-Optimized CNN Performance in Image Processing. Journal of Systems Engineering and Information Technology (JOSEIT), 2(1), 8-15. https://doi.org/10.29207/joseit.v2i1.5015