Defending Deep Neural Networks against Structural Perturbations

Sinha, Uttaran and Joshi, Saurabh (2019) Defending Deep Neural Networks against Structural Perturbations. Masters thesis, Indian Institute of Technology Hyderabad.

Restricted to Repository staff only until 9 July 2021.


Deep learning has had a tremendous impact in the field of computer vision. However, the deployment of such algorithms in real-world environments hinges upon their robustness to noise. This thesis focuses on testing the robustness of a model against naturally occurring structural perturbations to ensure the safety of deep learning in domains such as facial recognition, automated driving and object detection. To this end, given a dataset, we propose a systematic way to defend against such attacks. We evaluate various hyperparameters of a model and analyse the results to ascertain their causal effect on robustness. Subsequently, we propose a method to boost the stability of a model via training set augmentation, also known as adversarial training. The augmentation relies on a coreset set-cover approach to cover six independent structural perturbations and their combinations. We introduce and compare two different strategies with a time-versus-data tradeoff, without compromising model robustness or performance. We also analyse the effect of adversarial training on the decision boundary of the model. This work primarily focuses on image classification, and we believe that our algorithm works independently of the model architecture. The approach is evaluated on different datasets and model architectures and compared against state-of-the-art defences against structural perturbations.
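The core idea of adversarial training by augmentation, as described above, is to extend the training set with structurally perturbed copies of each image that keep the original label. A minimal sketch of that idea is given below; the perturbation functions (`rotate90`, `translate_right`) are illustrative stand-ins, since the abstract does not specify the six structural perturbations or the coreset set-cover selection used in the thesis.

```python
import numpy as np

# Illustrative structural perturbations (hypothetical examples; the thesis
# covers six perturbations whose definitions are not given in the abstract).
def rotate90(img):
    """Rotate the image 90 degrees counter-clockwise."""
    return np.rot90(img)

def translate_right(img, dx):
    """Shift the image dx pixels to the right, zero-filling the left edge."""
    out = np.zeros_like(img)
    out[:, dx:] = img[:, :img.shape[1] - dx]
    return out

def augment(images, labels, perturbations):
    """Return the training set extended with perturbed copies of each image,
    each copy keeping its original label (augmentation-based adversarial
    training)."""
    aug_images = list(images)
    aug_labels = list(labels)
    for img, lab in zip(images, labels):
        for perturb in perturbations:
            aug_images.append(perturb(img))
            aug_labels.append(lab)
    return aug_images, aug_labels
```

The augmented set can then be fed to any standard training loop, which is consistent with the claim that the defence is independent of the model architecture.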

IITH Creators:
Item Type: Thesis (Masters)
Uncontrolled Keywords: DNN, Adversarial training, Structural noise
Subjects: Computer science
Divisions: Department of Computer Science & Engineering
Depositing User: Team Library
Date Deposited: 09 Jul 2019 09:14
Last Modified: 09 Jul 2019 09:14
Publisher URL:
Related URLs:

RAIITH ePrint 5683