Please use this identifier to cite or link to this item:
http://drsr.daiict.ac.in//handle/123456789/1095
Title: Image Super-Resolution by Combining Non-Local Sparse Attention and Residual Channel Attention
Authors: Mandal, Srimanta; Bhavsar, Manali Hiteshkumar
Keywords: Deep Learning Techniques; Single Image Super Resolution; Channel Attention; Non-local Sparse Attention
Issue Date: 2022
Publisher: Dhirubhai Ambani Institute of Information and Communication Technology
Citation: Bhavsar, Manali Hiteshkumar (2022). Image Super-Resolution by Combining Non-Local Sparse Attention and Residual Channel Attention. Dhirubhai Ambani Institute of Information and Communication Technology. vii, 37 p. (Acc. # T01015).
Abstract: Single Image Super-Resolution (SISR) is an ill-posed problem that aims to generate a high-resolution (HR) image from a single low-resolution (LR) image. The low-resolution image and its associated features are rich in low-frequency information; the main objective of super-resolution is to add relevant high-frequency detail to complement it. Classical techniques such as non-local similarity and sparse representation have shown promising results on the SISR task over the past decades. Nowadays, deep learning techniques such as convolutional neural networks (CNNs) can extract deep features that improve SISR results. However, a CNN does not explicitly consider similar information within the image. Hence, we employ a non-local sparse attention (NLSA) module in the CNN framework so that it can explore the non-local similarity within an image. We introduce sparsity into the non-local operation by attending only to a particular group of features, called an attention bin, among many groups. Non-local sparse attention is intended to retain the long-range modeling capacity of the non-local operation while benefiting from the efficiency and robustness of sparse representation. Additionally, we rescale channel-specific features adaptively, taking channel interdependence into account, by using residual channel attention. In this thesis, we incorporate and combine the advantages of non-local sparse attention (NLSA) and residual channel attention to produce results comparable to state-of-the-art methods. (A minimal code sketch of these two attention mechanisms follows this record.)
URI: http://drsr.daiict.ac.in//handle/123456789/1095
Appears in Collections: M Tech Dissertations
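
To make the two attention mechanisms described in the abstract more concrete, here is a minimal PyTorch sketch (not the thesis code): a squeeze-and-excitation style channel attention block of the kind used in residual channel attention networks, and a heavily simplified stand-in for non-local sparse attention that restricts attention to fixed-size buckets of spatial positions ("attention bins"). The class names, the `reduction` and `bucket_size` hyper-parameters, and the raster-order bucketing are illustrative assumptions; the actual NLSA module forms its bins with locality-sensitive hashing, and the full residual-group architecture of the thesis is not reproduced here.

```python
# Illustrative sketch only; module names and hyper-parameters are assumptions.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention: global-average-pool each
    channel, pass the result through a small bottleneck MLP, and rescale the
    feature maps channel-wise (the idea behind residual channel attention)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)                # squeeze: B x C x 1 x 1
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                  # per-channel weights in (0, 1)
        )

    def forward(self, x):
        return x * self.mlp(self.pool(x))                  # channel-wise rescaling


class SparseNonLocalAttention(nn.Module):
    """Simplified stand-in for non-local sparse attention: spatial positions are
    split into fixed-size buckets ("attention bins") and attention is computed
    only within each bucket, not over all N x N position pairs."""

    def __init__(self, channels: int, bucket_size: int = 64):
        super().__init__()
        self.bucket_size = bucket_size
        self.q = nn.Conv2d(channels, channels, 1)
        self.k = nn.Conv2d(channels, channels, 1)
        self.v = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        n = h * w
        # Flatten spatial positions to B x N x C.
        q = self.q(x).flatten(2).transpose(1, 2)
        k = self.k(x).flatten(2).transpose(1, 2)
        v = self.v(x).flatten(2).transpose(1, 2)
        # Zero-pad so N divides evenly into buckets.
        pad = (-n) % self.bucket_size
        if pad:
            q = torch.cat([q, q.new_zeros(b, pad, c)], dim=1)
            k = torch.cat([k, k.new_zeros(b, pad, c)], dim=1)
            v = torch.cat([v, v.new_zeros(b, pad, c)], dim=1)
        nb = q.shape[1] // self.bucket_size                # number of attention bins
        q = q.reshape(b, nb, self.bucket_size, c)
        k = k.reshape(b, nb, self.bucket_size, c)
        v = v.reshape(b, nb, self.bucket_size, c)
        # Attention restricted to each bin: cost O(N * bucket_size), not O(N^2).
        attn = torch.softmax(q @ k.transpose(-2, -1) / c ** 0.5, dim=-1)
        out = (attn @ v).reshape(b, -1, c)[:, :n]          # drop padded positions
        out = out.transpose(1, 2).reshape(b, c, h, w)
        return x + out                                     # residual connection


if __name__ == "__main__":
    feats = torch.randn(1, 64, 48, 48)                     # B x C x H x W feature maps
    block = nn.Sequential(SparseNonLocalAttention(64), ChannelAttention(64))
    print(block(feats).shape)                              # torch.Size([1, 64, 48, 48])
```

In an RCAN-style design, such attention blocks would sit inside residual groups together with convolutional layers, before the final upsampling stage; the sketch only shows that the two operations compose and preserve the B x C x H x W feature shape.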
Files in This Item:
File | Size | Format
---|---|---
202011022.pdf | 7.25 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.