The rapid advent of neural networks in the field of computer graphics has opened some interesting possibilities. Most examples of deep neural style transfer I have seen focus on applying painterly styles to pictures or videos. I’ve seen plenty of interesting results, but I cannot stomach seeing yet another ‘Starry Night’-styled picture. Therefore I wanted to focus on a few other uses beyond the glorified Instagram filters.
Using a real photograph as a style picture could be a very handy tool for the digital matte painter. The detail-hungry matte painter frequently clone-stamps or cuts and pastes from photographic reference to create the realistic detail their clients expect.
Neural style transfer seems to offer an amazing opportunity here. You can mock up the basic geometry with any tool, either 2D or 3D, and apply a photorealistic style to the image. While the result is far from perfect, it offers a great foundation to improve with other Photoshop techniques. The images below show what you could do with Mandelbulb3D as a base, with photo styles applied.
Click on the images for full screen, and use the arrows to switch between source and result.
You might wonder what I used as style images in the examples above. For some, I used multiple images and blended them, but I don't own the copyright, so I can't publish them. Just do a Google image search for 'Nature' or 'City', that kind of thing. I also used the technique I talked about in my last blog post: a Google similar-image search on the fractal source image.
How does Neural Style Transfer work?
If you don’t know how neural style transfer works, I’ll try to offer some insight. The usual disclaimer applies: I don’t really understand how it works under the hood, so instead I will offer a more intuitive understanding of the process.
Deep learning is the active ingredient here: ‘deep’ because the algorithm has many layers, and ‘learning’ because it improves over time.
Think of the many layers as the hierarchy in a large company. Every single person can only do so much, and at the bottom of the ladder a person has little insight and little power. One layer up, people manage the people at the bottom; they have more insight, and more interesting strategies become available. This continues all the way to the top, where the individual is not necessarily a smarter person, but their decisions set off a much more complex cascade of actions. They have many strategies to choose from, as their operations are much more delicate.
In the style transfer program, the bottom layers only operate at the pixel level, which won’t create anything interesting at all. But keep adding layers (my understanding is that the algorithm I use contains 19), and at the top we get a larger-scale artificial ‘understanding’ of the image.
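To make the layer idea concrete, here is a toy numpy sketch (not the actual style-transfer code): even a stack of simple 3×3 averaging filters means that each deeper ‘layer’ responds to a progressively larger region of the image, which is roughly why higher layers can capture larger-scale structure.

```python
import numpy as np

def conv3x3_mean(img):
    """One toy 'layer': replace each pixel by the mean of its 3x3 neighborhood."""
    padded = np.pad(img, 1)
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + img.shape[0],
                          1 + dx : 1 + dx + img.shape[1]]
    return out / 9.0

# A single bright pixel in the middle of a black image.
img = np.zeros((21, 21))
img[10, 10] = 1.0

# Track how far that pixel's influence spreads as layers stack up.
x = img
widths = []
for layer in range(5):
    x = conv3x3_mean(x)
    rows = np.flatnonzero(x.sum(axis=1) > 0)
    widths.append(rows[-1] - rows[0] + 1)
print(widths)  # [3, 5, 7, 9, 11] - each layer 'sees' a wider area
```

A real network uses learned filters and nonlinearities rather than plain averaging, but the widening-view effect is the same.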
So what does the algorithm ‘understand’ about the image? What happens next is still magical to me, but both the source image and the style image are interpreted in this way, and a first, essentially random attempt is made to match one to the other. That first try won’t result in anything useful, so the computer measures how big the ‘error’ is and makes another attempt that decreases it. This continuous reduction of the error is the ‘learning’ part. After a few hundred or thousand iterations, the error is sufficiently low, and behold: style transfer has taken place.
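That measure-the-error-then-reduce-it loop can be sketched in a few lines of numpy. This is a toy, not the real algorithm: actual style transfer compares deep network features of full images, while here I just match the Gram matrix (the channel-correlation statistic commonly used as the ‘style’ term) of a small random feature map, with a hand-derived gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

def gram(f):
    """Style statistic: correlations between feature channels (C x C)."""
    return f @ f.T / f.shape[1]

# Toy 'feature maps': C channels x N spatial positions.
style = rng.normal(size=(4, 50))   # stands in for the style image's features
x = rng.normal(size=(4, 50))       # the 'image' being optimized
target = gram(style)

lr = 0.05
errors = []
for step in range(200):
    diff = gram(x) - target              # how wrong is the current try?
    errors.append(np.sum(diff ** 2))     # the 'error' being minimized
    grad = 4.0 / x.shape[1] * diff @ x   # gradient of that error w.r.t. x
    x -= lr * grad                       # nudge x to reduce the error

print(errors[0], errors[-1])  # the error shrinks over the iterations
```

The shrinking error is exactly the ‘learning’ described above; the real thing does the same descent, just over millions of pixels and deep features.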
Fractal Art with Neural Style
Apart from the (somewhat trippy) matte painting examples above, there are of course many more options. One of them is to apply a style to an infinite fractal zoom, as I did in my latest fractal short, Recurrence. A clever optical flow system ensures stability over time, while the scale of the style keeps looking similar, just as with a fractal. Unfortunately, processing this video took an extremely long time, and I hope that faster techniques will become available in the future.
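I don’t know the internals of the optical flow system used, but the core idea, as I understand it, is to warp the previous stylized frame along the motion between frames and use that as the starting point for the next one, so the style sticks to the moving geometry instead of flickering. A minimal nearest-neighbour warp in numpy (the flow convention here is my own illustration, not any particular library’s):

```python
import numpy as np

def warp(frame, flow):
    """Warp `frame` by an optical-flow field (nearest-neighbour, toy version).

    flow[y, x] = (dy, dx): the offset of where the pixel at (y, x)
    came from in `frame`.
    """
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys + flow[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs + flow[..., 1]).astype(int), 0, w - 1)
    return frame[src_y, src_x]

# A uniform shift one pixel to the right: every pixel comes from its left neighbour.
frame = np.arange(16.0).reshape(4, 4)
flow = np.zeros((4, 4, 2))
flow[..., 1] = -1.0
shifted = warp(frame, flow)
print(shifted[0])  # [0. 0. 1. 2.] - the first row, shifted right by one
```

The next frame’s optimization then starts from (or is penalized for deviating from) this warped frame, which is what keeps the style temporally coherent.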
Where's the software?
Finally, you may wonder how you can do this style transfer yourself. There are a couple of websites, such as deepart.io, that offer it as a service. If you want to do it yourself, be prepared for a lot of work: you need to install a lot of code before it will work.
I used this implementation by Cameron Smith, which gave me the best result and flexibility. It also supports video.
I believe these techniques are the future, so artist-friendly implementations will hopefully arrive soon. I for one would definitely be willing to pay for such visual wonder.