Please use this identifier to cite or link to this item: http://drsr.daiict.ac.in//handle/123456789/965
Title: Bayesian networks : Bayesian learning in neural networks and autoencoders
Authors: Mitra, Suman K.
Sadrani, Riya D.
Keywords: Bayesian learning
Bayesian inference
Variational posterior
Bayesian networks
Autoencoders
Issue Date: 2020
Publisher: Dhirubhai Ambani Institute of Information and Communication Technology
Citation: Sadrani, Riya D. (2020). Bayesian networks : Bayesian learning in neural networks and autoencoders. Dhirubhai Ambani Institute of Information and Communication Technology. vii, 38 p. (Acc.No: T00884)
Abstract: In traditional neural networks, fixed weights and biases determine how an input is transformed into an output. In a Bayesian Neural Network (BNN), every weight and bias instead has a probability distribution attached to it. To classify an image, the network is run multiple times (forward passes), each time with a freshly sampled set of weights and biases. Instead of a single set of output values, this yields multiple sets, one per run, which together form a distribution over the outputs. Recent literature introduced Bayes by Backprop, an algorithm for learning a probability distribution over the weights and biases of a neural network. In this work, the accuracies of a conventional neural network (NN) and a BNN on MNIST classification are compared, and the evaluation shows that the BNN gives better results than the conventional NN. Additionally, Bayes by Backprop is adapted to autoencoders: a Bayesian AutoEncoder (BAE) is implemented by changing the configuration and loss function of the existing algorithm.
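The abstract's core idea (a Gaussian variational posterior per weight, sampled afresh on every forward pass, with predictions averaged over several stochastic runs) can be illustrated with a minimal PyTorch sketch. This is not the dissertation's code; the layer structure, prior scale, and sample count below are illustrative assumptions in the spirit of Bayes by Backprop.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BayesianLinear(nn.Module):
        """Linear layer whose weights/biases have learned Gaussian posteriors."""
        def __init__(self, in_features, out_features, prior_sigma=1.0):
            super().__init__()
            # Variational parameters: mean and softplus-transformed std per weight/bias.
            self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
            self.w_rho = nn.Parameter(torch.full((out_features, in_features), -3.0))
            self.b_mu = nn.Parameter(torch.zeros(out_features))
            self.b_rho = nn.Parameter(torch.full((out_features,), -3.0))
            self.prior = torch.distributions.Normal(0.0, prior_sigma)
            self.kl = 0.0

        def forward(self, x):
            w_sigma = F.softplus(self.w_rho)
            b_sigma = F.softplus(self.b_rho)
            # Reparameterised sampling: a new set of weights on every forward pass.
            w = self.w_mu + w_sigma * torch.randn_like(w_sigma)
            b = self.b_mu + b_sigma * torch.randn_like(b_sigma)
            # KL(q || prior), to be added to the data loss by the training loop.
            q_w = torch.distributions.Normal(self.w_mu, w_sigma)
            q_b = torch.distributions.Normal(self.b_mu, b_sigma)
            self.kl = (torch.distributions.kl_divergence(q_w, self.prior).sum()
                       + torch.distributions.kl_divergence(q_b, self.prior).sum())
            return F.linear(x, w, b)

    def predict(model, x, n_samples=10):
        """Average class probabilities over several stochastic forward passes."""
        probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
        return probs.mean(dim=0)

For an autoencoder variant of the same idea, the cross-entropy term in the training objective would be swapped for a reconstruction loss (e.g. mean squared error between input and output) while keeping the KL term, matching the abstract's description of changing the loss function.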
URI: http://drsr.daiict.ac.in//handle/123456789/965
Appears in Collections:M Tech Dissertations

Files in This Item:
File: 201811058.pdf (Restricted Access)
Size: 2.26 MB
Format: Adobe PDF
