Non-stationary Texture Synthesis by Adversarial Expansion

ACM Transactions on Graphics (Proceedings of SIGGRAPH 2018)


Yang Zhou1,2           Zhen Zhu2,+          Xiang Bai2         Dani Lischinski3            Daniel Cohen-Or4           Hui Huang1*

1Shenzhen University            2Huazhong University of Science & Technology          3The Hebrew University of Jerusalem         4Tel-Aviv University    

+Joint first author       *Corresponding author


Fig. 1. Examples of two extremely challenging non-stationary textures (middle column), synthesized by our method (left and right). Note that our method succeeds in reproducing and extending the global structure and trends present in the input exemplars.



Abstract 

The real world exhibits an abundance of non-stationary textures. Examples include textures with large scale structures, as well as spatially variant and inhomogeneous textures. While existing example-based texture synthesis methods can cope well with stationary textures, non-stationary textures still pose a considerable challenge, which remains unresolved. In this paper, we propose a new approach for example-based non-stationary texture synthesis. Our approach uses a generative adversarial network (GAN), trained to double the spatial extent of texture blocks extracted from a specific texture exemplar. Once trained, the fully convolutional generator is able to expand the size of the entire exemplar, as well as of any of its sub-blocks. We demonstrate that this conceptually simple approach is highly effective for capturing large scale structures, as well as other non-stationary attributes of the input exemplar. As a result, it can cope with challenging textures, which, to our knowledge, no other existing method can handle.
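As a concrete illustration, here is a minimal PyTorch sketch of such a fully convolutional expansion generator: a k×k input block is encoded, transformed by a few residual blocks, and upsampled once to a 2k×2k output. The layer counts, normalization, and omitted training losses are assumptions for illustration only; the released code linked below defines the actual architecture.

# Minimal sketch of a fully convolutional "expansion" generator:
# a k x k input block is mapped to a 2k x 2k output. Layer counts,
# normalization, and losses are illustrative, not the released model.
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.InstanceNorm2d(ch), nn.ReLU(True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.InstanceNorm2d(ch),
        )
    def forward(self, x):
        return x + self.body(x)

class ExpandGenerator(nn.Module):
    def __init__(self, ch=64, n_res=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, ch, 7, padding=3), nn.ReLU(True),       # encode
            *[ResBlock(ch) for _ in range(n_res)],               # transform
            nn.ConvTranspose2d(ch, ch, 4, stride=2, padding=1),  # x2 upsample
            nn.ReLU(True),
            nn.Conv2d(ch, 3, 7, padding=3), nn.Tanh(),           # decode to RGB
        )
    def forward(self, x):  # (B, 3, k, k) -> (B, 3, 2k, 2k)
        return self.net(x)

G = ExpandGenerator()
block = torch.randn(1, 3, 128, 128)  # stand-in for a block cropped from the exemplar
print(G(block).shape)                # torch.Size([1, 3, 256, 256])

Because the generator is fully convolutional, the same trained network can be applied to the full exemplar or to any of its sub-blocks; cropping different sub-blocks (as in Fig. 8) therefore yields diversified expansion results.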




Fig. 8. Diversification by cropping. For each source texture (left in each triplet), we randomly crop two 256×256 sub-regions after training, yielding two different 512×512 expansion results.


Fig. 9. Diversification by tile shuffling. The exemplar used to train the generator (leftmost column) is divided into tiles, which are randomly reshuffled before being fed into the generator, yielding different results.
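A tile-shuffling step along the lines of Fig. 9 could look like the sketch below; shuffle_tiles and its grid parameter are hypothetical names, not part of the released code.

# Hypothetical tile-shuffling helper: split the exemplar into a grid of
# tiles, permute them randomly, and reassemble before feeding the generator.
import torch

def shuffle_tiles(img, grid=4):
    """img: (3, H, W) tensor; returns the same image with its
    grid x grid tiles randomly permuted."""
    _, H, W = img.shape
    th, tw = H // grid, W // grid
    tiles = [img[:, r*th:(r+1)*th, c*tw:(c+1)*tw]
             for r in range(grid) for c in range(grid)]
    perm = torch.randperm(len(tiles)).tolist()
    rows = [torch.cat([tiles[perm[r*grid + c]] for c in range(grid)], dim=2)
            for r in range(grid)]
    return torch.cat(rows, dim=1)

exemplar = torch.rand(3, 256, 256)
shuffled = shuffle_tiles(exemplar)     # new layout, same local statistics
# expanded = G(shuffled.unsqueeze(0))  # feed the trained generator from above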


Fig. 12. Stress test #1. Given a source texture (leftmost column), we double its size using our method. We then randomly crop a region of the same size as the source texture from the expansion result and expand it again, without any further training. This crop-expansion cycle is repeated 4 times. The final result (rightmost column) is still sharp and natural-looking, attesting to the stability of our method.


Fig. 13. Extreme expansion. Having trained a generator on the source exemplar (left), we feed it a small cropped texture block (64×64 pixels) and feed the expanded result back into the generator. Five such cycles produce a 2048×2048 result. Six different crops from this result are shown in the bottom row.
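The stress test of Fig. 12 and the extreme expansion of Fig. 13 are two instances of the same feedback loop: feed the generator's output, optionally after a random crop, back into the generator. A sketch of that loop, reusing the ExpandGenerator instance G from the snippet above (expand_repeatedly and its arguments are illustrative):

# Sketch of the crop-and-expand cycle used in the stress test and in
# extreme expansion: repeatedly feed the generator's output (optionally
# a random crop of it) back into the generator.
import torch

@torch.no_grad()
def expand_repeatedly(G, block, cycles=5, crop=None):
    """block: (1, 3, k, k). Each cycle doubles the spatial size, so a
    64 x 64 seed grows to 2048 x 2048 after five cycles. If `crop` is
    set, a random crop of that size is taken between cycles."""
    x = block
    for _ in range(cycles):
        x = G(x)
        if crop is not None:
            _, _, H, W = x.shape
            top = torch.randint(0, H - crop + 1, (1,)).item()
            left = torch.randint(0, W - crop + 1, (1,)).item()
            x = x[:, :, top:top+crop, left:left+crop]
    return x

seed = torch.randn(1, 3, 64, 64)  # stand-in for a small cropped block
big = expand_repeatedly(G, seed)  # (1, 3, 2048, 2048): extreme expansion
# Stress-test variant: crop back to the source size between cycles, e.g.
# out = expand_repeatedly(G, src, cycles=4, crop=src.shape[-1])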


Fig. 15. Texture transfer. By feeding generators trained on the texture exemplars in the top row with the guiding images in the leftmost column, we synthesize textures that adapt to the large-scale structures present in the guiding images. Note that we can even input a simple user sketch or pure random noise (Perlin noise) and still obtain satisfactory results, as shown in the last two rows.



Data & Code

To reference our algorithm, code, data, or results in any publication, please include the BibTeX entry below.

Link: https://github.com/jessemelpolio/non-stationary_texture_syn


SUPPLEMENTARY

Link: http://vcc.szu.edu.cn/resources/SuppleMaterials-TexSyn18/Supplementary.html


ACKNOWLEDGMENTS
We thank the anonymous reviewers for their valuable comments. This work was supported in part by NSFC (61522213, 61761146002, 6171101466), 973 Program (2015CB352501), Guangdong Science and Technology Program (2015A030312015), Israel Science Foundation (2366/16), ISF-NSFC Joint Research Program (2217/15, 2472/17), and Shenzhen Innovation Program (KQJSCX20170727101233642, JCYJ20151015151249564).


BibTeX

@article{TexSyn18,
  title   = {Non-stationary Texture Synthesis by Adversarial Expansion},
  author  = {Yang Zhou and Zhen Zhu and Xiang Bai and Dani Lischinski and Daniel Cohen-Or and Hui Huang},
  journal = {ACM Transactions on Graphics (Proc. SIGGRAPH)},
  volume  = {37},
  number  = {4},
  pages   = {49:1--49:13},
  year    = {2018},
}
