Enhanced image super-resolution using hierarchical generative adversarial network
by Jianwei Zhao; Chenyun Fang; Zhenghua Zhou
International Journal of Computing Science and Mathematics (IJCSM), Vol. 15, No. 3, 2022

Abstract: Recently, generative adversarial networks (GANs) have been introduced into single-image super-resolution (SISR) to reconstruct more realistic high-resolution (HR) images. In this paper, we propose an effective SISR method, named super-resolution using hierarchical generative adversarial network (SRHGAN), based on the idea of GAN combined with prior knowledge. Unlike existing GANs that focus mainly on network depth, our proposed method additionally exploits prior knowledge: we introduce an edge extraction branch and an edge enhancement branch into the GAN to account for edge information. By means of an edge loss added to the loss function, the edge extraction branch and the edge enhancement branch are trained to reconstruct sharp edges well. Experimental results on several datasets illustrate that our reconstructed images are visually clearer and sharper than those of related SISR methods.
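The full text is behind a paywall, so the exact form of the edge loss is not available here. As a rough illustration of the idea the abstract describes (comparing edge maps of the reconstructed and ground-truth images and penalising their difference), the following minimal sketch uses Sobel filtering as a hypothetical stand-in for the paper's edge extraction branch and an L1 penalty as an assumed form of the edge loss:

```python
import numpy as np

def sobel_edges(img):
    """Extract an edge-magnitude map with 3x3 Sobel filters.

    This is an illustrative stand-in for the edge extraction branch;
    the paper's actual branch is a learned network component.
    """
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    return np.hypot(gx, gy)  # gradient magnitude per pixel

def edge_loss(sr, hr):
    """L1 distance between edge maps of the super-resolved image `sr`
    and the ground-truth HR image `hr` (hypothetical edge-loss form)."""
    return float(np.mean(np.abs(sobel_edges(sr) - sobel_edges(hr))))
```

In a GAN-based SISR setup, a term like `edge_loss` would typically be weighted and added to the adversarial and pixel-wise reconstruction losses, encouraging the generator to preserve sharp edges.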

Online publication date: Mon, 08-Aug-2022
