Deep learning, and especially Convolutional Neural Networks (CNNs), has reshaped the field of image recognition and classification. The performance of any CNN model depends on several factors, such as the size of the dataset, the number of classes, the model weights, the hyperparameters, and, most importantly, the optimizer. Optimizers are used to tune the model parameters of a learning algorithm: an optimizer adjusts the model weights so as to minimize a loss function, which measures how well the model is performing. Optimizers are therefore used to reduce the loss and increase the accuracy. In this project, we perform a comparative analysis of optimizers such as mini-batch gradient descent, momentum gradient descent, RMSprop, Adam, Adagrad, and Adadelta on datasets such as MNIST, CIFAR-10, and Kaggle Flowers. The comparison is made between the loss and the accuracy at every epoch.
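As a rough illustration of the comparison described in the abstract, the sketch below trains the same small CNN on MNIST once per optimizer and records loss and accuracy at every epoch via tf.keras. The network architecture, learning rates, epoch count, and batch size are illustrative assumptions, not the authors' actual experimental setup.

```python
# Minimal sketch (not the authors' code): comparing optimizers on MNIST with tf.keras.
# Hyperparameter choices below are assumptions made for illustration only.
import tensorflow as tf

# Load MNIST and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

def build_cnn():
    # A deliberately small CNN so each optimizer can be trained quickly.
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

# The optimizers compared in the paper; learning rates here are illustrative.
optimizers = {
    "mini-batch SGD": tf.keras.optimizers.SGD(learning_rate=0.01),
    "momentum SGD": tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
    "RMSprop": tf.keras.optimizers.RMSprop(),
    "Adam": tf.keras.optimizers.Adam(),
    "Adagrad": tf.keras.optimizers.Adagrad(),
    "Adadelta": tf.keras.optimizers.Adadelta(),
}

for name, opt in optimizers.items():
    model = build_cnn()
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # history.history stores loss and accuracy at every epoch,
    # which is the quantity compared across optimizers.
    history = model.fit(x_train, y_train,
                        validation_data=(x_test, y_test),
                        epochs=5, batch_size=128, verbose=0)
    print(name, history.history["val_accuracy"][-1])
```

The same loop would apply to CIFAR-10 or the Kaggle Flowers dataset by swapping the data-loading step and adjusting the input shape and number of output classes.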
Keywords:
Deep Learning, Optimizers, SGD, Adam, RMSprop, Adagrad, Adadelta
Cite Article:
"Comparative Analysis on Deep Learning Optimization Techniques", International Journal of Science & Engineering Development Research (www.ijrti.org), ISSN:2455-2631, Vol.8, Issue 6, page no.1091 - 1096, June-2023, Available :http://www.ijrti.org/papers/IJRTI2306161.pdf
Downloads:
000205164
ISSN:
2456-3315 | Impact Factor: 8.14 (calculated by Google Scholar) | ESTD Year: 2016