
Non-Stationary Texture Synthesis by Adversarial Expansion
ACM Transactions on Graphics (Proceedings of SIGGRAPH 2018)
Yang Zhou 1,2    Zhen Zhu 2,+    Xiang Bai 2    Dani Lischinski 3    Daniel Cohen-Or 1,4    Hui Huang 1,*
1 Shenzhen University    2 Huazhong University of Science & Technology    3 The Hebrew University of Jerusalem    4 Tel Aviv University
+ Joint first author    * Corresponding author
Fig. 1. Two extremely challenging non-stationary texture exemplars (middle column) and the corresponding results synthesized by our method (left and right columns). Note that our method succeeds in reproducing and extending the global structure and trends present in the input exemplars.
Abstract
Fig. 8. Diversification by cropping. For each source texture (left in each triplet), after training we randomly crop two 256×256 sub-regions from it and expand each one, producing two different 512×512 results.
Fig. 12. Stress test #1. Given a source texture (leftmost column), we double its size using our method. We then randomly crop a region of the same size as the source texture from the expanded result and expand it again, without any further training. This crop-and-expand cycle is repeated 4 times. The final result (rightmost column) is still sharp and natural looking, attesting to the stability of our method.
Fig. 13. Extreme expansion. Having trained a generator on the source exemplar (left), we feed it a small cropped texture block (64×64 pixels) and feed the expanded result back into the generator. Five such cycles produce a 2048×2048 result. Six different crops from this result are shown in the bottom row.
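To make the crop-and-expand cycles of Figs. 8, 12 and 13 concrete, below is a minimal, self-contained Python/PyTorch sketch of the loop. The real generator is the trained network from the repository linked below; here a simple bilinear-upsampling module is assumed as a placeholder so that the snippet runs on its own.

# Minimal sketch of the crop-and-expand cycle (placeholder generator, not the trained model).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PlaceholderGenerator(nn.Module):
    # Stand-in for the trained generator: doubles the spatial extent of the input block.
    def forward(self, x):
        return F.interpolate(x, scale_factor=2, mode="bilinear", align_corners=False)

def random_crop(img, size):
    # Randomly crop a (size x size) block from a (1, C, H, W) tensor (cf. Fig. 8).
    _, _, h, w = img.shape
    top = torch.randint(0, h - size + 1, (1,)).item()
    left = torch.randint(0, w - size + 1, (1,)).item()
    return img[:, :, top:top + size, left:left + size]

generator = PlaceholderGenerator()
source = torch.rand(1, 3, 256, 256)   # hypothetical 256x256 source exemplar

block = random_crop(source, 64)       # small 64x64 crop, as in Fig. 13
for _ in range(5):                    # five expansion cycles: 64 -> 2048
    block = generator(block)
print(block.shape)                    # torch.Size([1, 3, 2048, 2048])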
Fig. 15. Texture transfer. By feeding generators trained on the texture exemplars in the top row with the guiding images in the leftmost column, we synthesize textures that adapt to the large-scale structures present in the guiding images. Note that even a simple user sketch or pure random noise (Perlin noise) can be used as input and still produce satisfactory results, as shown in the last two rows.
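As a rough illustration of the texture-transfer use case, the sketch below feeds a smoothed random field (a stand-in assumption for Perlin noise or a user sketch) through the same kind of placeholder generator; with the actual trained generator, the output would adapt to the large-scale structure of the guidance image.

# Sketch only: low-frequency random noise as guidance, bilinear placeholder as "generator".
import torch
import torch.nn.functional as F

def smooth_noise(size, blur=9):
    # Low-frequency random field, used here as a crude stand-in for Perlin noise.
    noise = torch.rand(1, 3, size, size)
    kernel = torch.ones(3, 1, blur, blur) / (blur * blur)
    return F.conv2d(noise, kernel, padding=blur // 2, groups=3)

def placeholder_generator(x):
    # Stand-in for the trained generator, which doubles the spatial extent.
    return F.interpolate(x, scale_factor=2, mode="bilinear", align_corners=False)

guidance = smooth_noise(256)          # hypothetical 256x256 guidance image
result = placeholder_generator(guidance)
print(result.shape)                   # torch.Size([1, 3, 512, 512])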
Data & Code
Note that the DATA and CODE are free for Research and Education Use ONLY.
Please cite our paper (using the BibTeX entry below) if you use any part of our ALGORITHM, CODE, DATA or RESULTS in any publication.
Link: https://github.com/jessemelpolio/non-stationary_texture_syn
Supplementary
Link: http://vcc.szu.edu.cn/resources/SuppleMaterials-TexSyn18/Supplementary.html
BibTeX
@article{TexSyn18,