Exploiting spatial relation for reducing distortion in style transfer

Jia Ren Chang, Yong Sheng Chen

Research output: Conference contribution › peer-review

3 Citations (Scopus)

Abstract

The power of convolutional neural networks in arbitrary style transfer has been amply demonstrated; however, existing stylization methods tend to generate spatially inconsistent results with noticeable artifacts. One solution to this problem is to apply a segmentation mask or affinity-based image matting to preserve spatial information related to the image content. The main idea of this work is to model the spatial relations between content-image pixels and to maintain these relations during stylization in order to reduce artifacts. The proposed network architecture, called spatial relation-augmented VGG (SRVGG), models long-range spatial dependency with a spatial relation module. Based on the spatial information extracted by SRVGG, we design a novel relation loss that minimizes the difference in spatial dependency between content images and their stylizations. We evaluate the proposed framework on both optimization-based and feedforward-based style transfer methods. The effectiveness of SRVGG in stylization is demonstrated by the generation of stylized images with high quality and spatial consistency, without the need for segmentation masks or affinity-based image matting. The quantitative evaluation also suggests that the proposed framework achieves better performance than other methods.
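To make the abstract's two ingredients concrete, the sketch below shows one plausible reading of them in PyTorch: a self-attention-style module that produces a pairwise (HW × HW) spatial dependency map over feature positions, and a relation loss that penalizes the difference between the dependency maps of the content features and the stylized features. The module name, the 1×1 query/key projections, and the MSE form of the loss are illustrative assumptions, not the authors' released code.

```python
# Hypothetical sketch of a spatial relation module and relation loss, assuming
# a non-local / self-attention style formulation; not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialRelationModule(nn.Module):
    """Computes a row-normalized (HW x HW) affinity matrix over spatial positions."""

    def __init__(self, in_channels: int, reduced_channels: int = 64):
        super().__init__()
        # 1x1 projections (query/key) before computing pairwise affinities.
        self.query = nn.Conv2d(in_channels, reduced_channels, kernel_size=1)
        self.key = nn.Conv2d(in_channels, reduced_channels, kernel_size=1)

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        b, _, h, w = feat.shape
        q = self.query(feat).flatten(2).transpose(1, 2)  # (B, HW, C')
        k = self.key(feat).flatten(2)                    # (B, C', HW)
        affinity = torch.bmm(q, k)                       # (B, HW, HW)
        # Row-wise softmax turns raw affinities into a spatial dependency map.
        return F.softmax(affinity, dim=-1)


def relation_loss(content_feat: torch.Tensor,
                  stylized_feat: torch.Tensor,
                  relation_module: SpatialRelationModule) -> torch.Tensor:
    """Penalize the difference between the spatial dependency maps of the
    content features and the stylized features."""
    r_content = relation_module(content_feat).detach()  # relations to preserve
    r_stylized = relation_module(stylized_feat)
    return F.mse_loss(r_stylized, r_content)
```

In an optimization-based or feedforward pipeline, this term would simply be added to the usual objective, e.g. total = content_loss + style_weight * style_loss + relation_weight * relation_loss; since the dependency map is quadratic in the number of spatial positions, it is typically computed on a downsampled feature map.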

Original language: English
Title of host publication: Proceedings - 2021 IEEE Winter Conference on Applications of Computer Vision, WACV 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1209-1217
Number of pages: 9
ISBN (Electronic): 9780738142661
DOIs
Publication status: Published - Jan 2021
Event: 2021 IEEE Winter Conference on Applications of Computer Vision, WACV 2021 - Virtual, Online, United States
Duration: 5 Jan 2021 - 9 Jan 2021

Publication series

Name: Proceedings - 2021 IEEE Winter Conference on Applications of Computer Vision, WACV 2021

Conference

Conference: 2021 IEEE Winter Conference on Applications of Computer Vision, WACV 2021
Country/Territory: United States
City: Virtual, Online
Period: 5/01/21 - 9/01/21
