# Neural Materials Are Amazing! 🔮

## Metadata

- **Channel:** Two Minute Papers
- **YouTube:** https://www.youtube.com/watch?v=1F-WnarzkX8
- **Date:** 21.07.2021
- **Duration:** 6:40
- **Views:** 134,899
- **Source:** https://ekstraktznaniy.ru/video/13867

## Description

❤️ Check out Weights & Biases and sign up for a free demo here: https://wandb.com/papers 
❤️ Their mentioned post is available here: https://wandb.ai/stacey/xray/reports/X-Ray-Illumination--Vmlldzo4MzA5MQ

📝 The paper "NeuMIP: Multi-Resolution Neural Materials" is available here:
https://cseweb.ucsd.edu/~viscomp/projects/NeuMIP/

📝 Our latent space technique:
https://users.cg.tuwien.ac.at/zsolnai/gfx/gaussian-material-synthesis/

📝 Our “Photoshop” technique:
https://users.cg.tuwien.ac.at/zsolnai/gfx/photorealistic-material-editing/

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Haro, Andrew Melnychuk, Angelos Evripiotis, Benji Rabhan, Bryan Learn, Christian Ahlin, Eric Haddad, Eric Martel, Gordon Child, Ivo Galic, Jace O'Brien, Javier Bustamante, John Le, Jonas, Kenneth Davis, Klaus Busse, Lorin Atzberger, Lukas Biewald, Matthew Allen Fisher, Mark Oates, Michael Albrecht, Nikhil Velpanur, Owen Campbell-Moore, O

## Transcript

### Introduction [0:00]

Dear Fellow Scholars, this is Two Minute Papers with Dr. Károly Zsolnai-Fehér. This image is the result of a light simulation program, created by research scientists. It looks absolutely beautiful, but the light simulation algorithm is only part of the recipe here. To create something like this, we also need a good artist who can produce high-quality geometry, lighting, and, of course, good, life-like material models. For instance, without the materials part, we would see something like this. Not very exciting, right? Previously, we introduced a technique that learns our preferences and helps fill these scenes with materials. This work can also generate variants of the same material.

### Neural Material Representation [0:53]

In a later technique, we could even take a sample image, completely destroy it in Photoshop, and our neural networks would find a photorealistic material that matches these crazy ideas. Links to both of these works are available in the video description. And, to improve these digital materials, this new paper introduces something that the authors call a multi-resolution neural material representation. What is that? Well, it is something that can put amazingly complex material models into our light transport programs, and not only that, but…oh my.

### Neural Material Comparison [1:35]

Look at that! We can even zoom in so far that we see the snagged threads. That is the magic of the multi-resolution part of the technique. The neural part means that the technique looks at lots of measured material reflectance data, which is what describes a real-world material, and compresses this description down into a representation that is manageable. Okay…why? Well, look. Here is a reference material. You see, these are absolutely beautiful, no doubt, but they are often prohibitively expensive to store directly. This new method introduces these neural materials to approximate the real-world materials, but in a way that is super cheap to compute and store. So, our first question is: how do these neural materials compare to the real reference materials? What do you think? How much of a quality drop must we accept to be able to use these in our rendering systems? Well, you tell me, because you are already looking at the new technique right now. I already switched from the reference to the result of the new method. How cool is that? Look. This was the expensive reference material, and this is the fast neural material counterpart

### Results [3:02]

for it. So, how hard is this to pull off? Well, let’s look at some more results side by side. Here is the reference. And here are two techniques from one and two years ago that try to approximate it. You see that if we zoom in real close, these fine details are gone. Do we have to live with that? Or can the new method do better? Hold on to your papers, and let’s see. Wow! While it is not 100% perfect, there is absolutely no contest compared to the previous methods. It outperforms them handily on every one of these complex materials I came across.

And when I say complex materials, I really mean it. Look at how beautifully it captures not only the texture of this piece of embroidery, but, when we move the light source around, oh wow! Look at the area around the vertical black stripe and how its specular reflections change with the lighting. And note that none of these are real images; all of them come from a computer program. This is truly something else. Loving it.

So, if it really works so well, where is the catch? Does it work only on cloth-like materials? No-no, not in the slightest! It also works really well on rocks, insulation foam, even turtle shells, and a variety of other materials. The paper contains a ton more examples than we can showcase here, so make sure to have a look in the video description.

I guess this means that it requires a huge and expensive neural network to pull off, right? Well, let’s have a look. Whoa, now that’s something. It does not require a deep and heavy-duty neural network; just 4 layers are enough. And this, by today’s standards, is a lightweight network that can take these expensive reference materials and compress them down in a matter of milliseconds. And they almost look the same. Materials in our computer simulations straight from reality? Yes please! So, from now on, we will get cheaper and better material models for animation movies, computer games, and visualization applications!
Sign me up right now! What a time to be alive! Thanks for watching and for your generous support, and I'll see you next time!
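
The idea the video describes — a compact learned texture, queried at any zoom level and decoded by a small 4-layer network — can be sketched as a toy in plain NumPy. To be clear, this is an illustrative sketch only, not the paper's actual architecture: the 7-dimensional latent features, pyramid resolutions, hidden width, and the 2D direction encoding are all assumptions; the video only says that four layers suffice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent-texture pyramid: each mip level stores a learned feature
# vector per texel instead of RGB colors (sizes/dims are assumptions).
LATENT_DIM = 7
pyramid = [rng.normal(size=(res, res, LATENT_DIM)) for res in (64, 32, 16, 8)]

def query_latent(pyramid, u, v, lod):
    """Fetch a feature vector at texture coords (u, v) in [0, 1),
    blending the two nearest mip levels for a fractional level of detail."""
    lod = float(np.clip(lod, 0, len(pyramid) - 1))
    lo, hi = int(np.floor(lod)), int(np.ceil(lod))
    t = lod - lo

    def fetch(level):
        res = pyramid[level].shape[0]
        x = min(int(u * res), res - 1)
        y = min(int(v * res), res - 1)
        return pyramid[level][y, x]

    return (1.0 - t) * fetch(lo) + t * fetch(hi)

def relu(x):
    return np.maximum(x, 0.0)

class NeuralMaterialDecoder:
    """A small 4-layer MLP that decodes a latent feature plus projected
    light and view directions into RGB reflectance."""

    def __init__(self, latent_dim=LATENT_DIM, hidden=32):
        in_dim = latent_dim + 4          # feature + 2D light dir + 2D view dir
        dims = [in_dim, hidden, hidden, hidden, 3]   # 4 weight layers
        self.weights = [rng.normal(0, 0.1, (a, b)) for a, b in zip(dims, dims[1:])]
        self.biases = [np.zeros(b) for b in dims[1:]]

    def shade(self, latent, light_dir, view_dir):
        x = np.concatenate([latent, light_dir, view_dir])
        for W, b in zip(self.weights[:-1], self.biases[:-1]):
            x = relu(x @ W + b)
        rgb = x @ self.weights[-1] + self.biases[-1]
        return np.clip(rgb, 0.0, None)   # reflectance cannot be negative

# Usage: shade one surface point at a fractional zoom level.
decoder = NeuralMaterialDecoder()
feature = query_latent(pyramid, u=0.25, v=0.75, lod=1.5)
rgb = decoder.shade(feature,
                    light_dir=np.array([0.3, 0.1]),
                    view_dir=np.array([-0.2, 0.4]))
print(rgb.shape)  # (3,)
```

In a real system the pyramid features and decoder weights would be trained jointly against measured reflectance data; here they are random, so only the plumbing — latent lookup at a chosen level of detail, then a cheap 4-layer decode — is shown.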
