Please use this identifier to cite or link to this item:
http://drsr.daiict.ac.in//handle/123456789/1096
Title: Generating Targeted Adversarial Attacks and Assessing their Effectiveness in Fooling Deep Neural Networks
Authors: Bhilare, Shruti; Hati, Avik; Gajjar, Shivangi Bharatbhai
Keywords: Deep Neural Network; Algorithm; Targeted DeepFool
Issue Date: 2022
Publisher: Dhirubhai Ambani Institute of Information and Communication Technology
Citation: Gajjar, Shivangi Bharatbhai (2022). Generating Targeted Adversarial Attacks and Assessing their Effectiveness in Fooling Deep Neural Networks. Dhirubhai Ambani Institute of Information and Communication Technology. viii, 37 p. (Acc. # T01016).
Abstract: Deep neural network (DNN) models have gained popularity for most image classification problems. However, DNNs also have numerous vulnerabilities, which an adversary can exploit to execute a successful adversarial attack: an algorithm that generates perturbed inputs capable of fooling a well-trained DNN. Among existing adversarial attacks, DeepFool, a white-box untargeted attack, is considered one of the most reliable algorithms for computing adversarial perturbations. However, in some scenarios, such as person recognition, an adversary might want to carry out a targeted attack, so that the input is misclassified into a specific target class. Moreover, studies show that defending against a targeted attack is harder than defending against an untargeted one. Hence, generating targeted adversarial examples is desirable from an attacker's perspective. In this thesis, we propose "Targeted DeepFool", which is based on computing the minimal amount of perturbation required to reach the hyperplane of the target class. The proposed algorithm produces a minimal amount of distortion on the conventional image datasets MNIST and CIFAR-10. Further, Targeted DeepFool shows excellent performance in terms of adversarial success rate.
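The abstract's core geometric idea can be sketched for a purely linear classifier: the smallest perturbation that sends an input across the decision boundary into a chosen target class is the projection onto the hyperplane separating the current class from the target. This is only an illustrative sketch under that linearity assumption; the thesis applies such a step to a DNN, which the original DeepFool family handles by iterating on a local linearization. The function name `targeted_deepfool_step` and the toy weights `W`, `b` below are hypothetical, not code from the thesis.

```python
import numpy as np

def targeted_deepfool_step(x, W, b, target, overshoot=0.02):
    """One linearized step toward the target-class hyperplane.

    For a linear classifier f(x) = W x + b with current prediction k,
    the minimal perturbation reaching the k-vs-target boundary is
        r = |f_t(x) - f_k(x)| / ||w_t - w_k||^2 * (w_t - w_k).
    A small overshoot pushes the point slightly past the boundary.
    """
    scores = W @ x + b
    k = int(np.argmax(scores))            # current predicted class
    if k == target:
        return np.zeros_like(x)           # already classified as target
    w_diff = W[target] - W[k]             # normal of the separating hyperplane
    f_diff = scores[target] - scores[k]   # negative margin toward the target
    r = (abs(f_diff) / (np.linalg.norm(w_diff) ** 2)) * w_diff
    return (1 + overshoot) * r

# Toy example: a random 3-class linear classifier on 2-D inputs.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))
b = rng.standard_normal(3)
x = rng.standard_normal(2)
t = (int(np.argmax(W @ x + b)) + 1) % 3   # pick some other class as target
x_adv = x + targeted_deepfool_step(x, W, b, t)
```

After the step, the target class is guaranteed to score higher than the originally predicted class, since the perturbation moves the point across exactly that pairwise boundary (plus the overshoot); for a deep network, the analogous step would be repeated until the misclassification actually occurs.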
URI: http://drsr.daiict.ac.in//handle/123456789/1096
Appears in Collections: M Tech Dissertations
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| 202011023.pdf | 2.75 MB | Adobe PDF | View/Open |