Show simple item record

dc.contributor.advisor: Joshi, Manjunath V.
dc.contributor.author: Bhaisaheb, Shabbirhussain Hamid
dc.date.accessioned: 2020-09-22T14:41:19Z
dc.date.available: 2023-02-16T14:41:19Z
dc.date.issued: 2020
dc.identifier.citation: Bhaisaheb, Shabbirhussain Hamid (2020). Optimization techniques for deep neural networks. Dhirubhai Ambani Institute of Information and Communication Technology. vii, 44 p. (Acc.No: T00848)
dc.identifier.uri: http://drsr.daiict.ac.in//handle/123456789/926
dc.description.abstract: Optimization techniques are used to make neural networks converge to an optimal solution. Gradient-based approaches are employed to find the parameter values that minimize the objective function. The most commonly used method for this purpose is Gradient Descent (GD), but it has certain drawbacks: there is no guarantee that it converges to the global minimum of the objective function; the convergence rate depends on the learning rate α (alpha), and a poor choice of α may lead to divergence; the solution depends on the parameter initialization and may differ for different initial values; and the underlying assumption that the objective function is differentiable may not always hold. There are other, non-gradient-based optimization techniques such as genetic algorithms, the graph-cut method, particle swarm optimization, and simulated annealing, each with its respective advantages and drawbacks. The current research work focuses on simulated annealing (SA) because, unlike the other methods, it theoretically guarantees global optimization. Gradient-based optimization methods and the SA algorithm are applied to different neural network architectures to compare the running time of each algorithm and the optimized parameter values indicating convergence of the objective function.
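The GD-versus-SA comparison in the abstract can be sketched on a toy one-dimensional objective. This is a hedged illustration only: the quadratic function, step size, perturbation width, and cooling schedule below are illustrative choices, not the experimental setup used in the thesis.

```python
import math
import random

def f(w):
    # Toy convex objective; stands in for a network's loss function.
    return (w - 3.0) ** 2 + 2.0

def grad_f(w):
    # Analytic gradient of f.
    return 2.0 * (w - 3.0)

def gradient_descent(w0, alpha=0.1, steps=200):
    # Plain GD update: w <- w - alpha * grad.
    # A too-large alpha makes this iteration diverge.
    w = w0
    for _ in range(steps):
        w -= alpha * grad_f(w)
    return w

def simulated_annealing(w0, temp=1.0, cooling=0.99, steps=2000, seed=0):
    # SA: propose random perturbations; accept worse moves with
    # probability exp(-delta / temp), where temp decays each step.
    # No gradient of f is required.
    rng = random.Random(seed)
    w, best = w0, w0
    for _ in range(steps):
        cand = w + rng.uniform(-0.5, 0.5)
        delta = f(cand) - f(w)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            w = cand
        if f(w) < f(best):
            best = w
        temp *= cooling
    return best

print(gradient_descent(-5.0))     # converges close to the minimizer w = 3
print(simulated_annealing(-5.0))  # also lands near w = 3, without gradients
```

On this convex toy problem both methods find the same minimum; SA's advantage, as the abstract notes, is its theoretical guarantee of escaping local minima on non-convex objectives, at the cost of a longer running time.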
dc.subject: Optimization
dc.subject: Gradient-based method
dc.subject: Reinforcement learning
dc.subject: Simulated annealing
dc.classification.ddc: 006.32 BHA
dc.title: Optimization techniques for deep neural networks
dc.type: Dissertation
dc.degree: M. Tech
dc.student.id: 201811017
dc.accession.number: T00848