Title: Research on image to illustration translation method based on CycleGAN

Authors: Yuhan Wei; Mingyu Ji; Jian Lv; Xinhai Zhang

Addresses: University of Northeast Forestry, Harbin, China (all authors)

Abstract: Aiming at the problem that traditional image style transfer does not adequately balance abstract style against the content of the original painting, this paper proposes an improved method based on the Cycle-Consistent Generative Adversarial Network (CycleGAN). The generator downsamples the feature map at each residual layer and uses skip connections with upsampling to merge low-level and high-level features. An image averaging operation is then applied to enhance the contrast of the generated images, and a weighted mean filter removes redundant details. Finally, the grey level of the image is reduced to increase its abstraction. Experimental results show that, compared with three style transfer methods (CycleGAN, DualGAN and CartoonGAN), the proposed model not only increases the degree of abstraction of the image style but also better preserves the content of the original image, significantly improving the balance between the abstract style and the original painting content.
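The post-processing steps described above (contrast enhancement by image averaging, weighted mean filtering, and grey-level reduction) can be illustrated with a minimal sketch. The function name, the Gaussian kernel used as the weighted mean filter, the enhancement factor and the number of grey levels are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np
import cv2

def postprocess(img, levels=8, kernel_size=5):
    """Hypothetical sketch of the post-processing pipeline: contrast
    enhancement via image averaging, weighted mean filtering, and
    grey-level reduction. Parameter values are illustrative only."""
    # Contrast enhancement: push each pixel away from the per-channel mean
    # (one common interpretation of an 'image averaging' contrast step).
    mean = img.astype(np.float32).mean(axis=(0, 1), keepdims=True)
    enhanced = np.clip(img.astype(np.float32) + (img - mean) * 0.5, 0, 255)

    # Weighted mean filtering (here a Gaussian-weighted mean) to suppress
    # redundant detail in the generated illustration.
    smoothed = cv2.GaussianBlur(enhanced.astype(np.uint8),
                                (kernel_size, kernel_size), 0)

    # Grey-level reduction: quantise each channel to a small number of
    # levels to increase the abstraction of the result.
    step = 256 // levels
    quantised = (smoothed // step) * step + step // 2
    return quantised.astype(np.uint8)

# Usage example: result = postprocess(cv2.imread("generated.png"))
```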

Keywords: GAN; image-to-image translation; illustration generation; deep learning.

DOI: 10.1504/IJCAT.2022.127822

International Journal of Computer Applications in Technology, 2022 Vol.69 No.3, pp.244 - 252

Received: 14 Jul 2021
Accepted: 17 Oct 2021

Published online: 19 Dec 2022
