# Differentiable Material Synthesis Is Amazing! ☀️

## Metadata

- **Channel:** Two Minute Papers
- **YouTube:** https://www.youtube.com/watch?v=9kllWAX9tHw
- **Date:** 02.03.2021
- **Duration:** 9:33
- **Views:** 89,207

## Description

❤️ Check out Perceptilabs and sign up for a free demo here: https://www.perceptilabs.com/papers

📝 The paper "MATch: Differentiable Material Graphs for Procedural Material Capture" is available here:
http://match.csail.mit.edu/

📝 Our Photorealistic Material Editing paper is available here:
https://users.cg.tuwien.ac.at/zsolnai/gfx/photorealistic-material-editing/

☀️ The free course on writing light simulations is available here:
https://users.cg.tuwien.ac.at/zsolnai/gfx/rendering-course/

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Haro, Alex Serban, Alex Paden, Andrew Melnychuk, Angelos Evripiotis, Benji Rabhan, Bruno Mikuš, Bryan Learn, Christian Ahlin, Eric Haddad, Eric Martel, Gordon Child, Haris Husic, Jace O'Brien, Javier Bustamante, Joshua Goller, Kenneth Davis, Lorin Atzberger, Lukas Biewald, Matthew Allen Fisher, Mark Oates, Michael Albrecht, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Ramsey Elbasheer, Robin Graham, Steef, Taras Bobrovytsky, Thomas Krcmar, Torsten Reil, Tybie Fitzhugh.
If you wish to appear here or pick up other perks, click here: https://www.patreon.com/TwoMinutePapers

Thumbnail background image: https://pixabay.com/images/id-4238615/

Károly Zsolnai-Fehér's links:
Instagram: https://www.instagram.com/twominutepapers/
Twitter: https://twitter.com/twominutepapers
Web: https://cg.tuwien.ac.at/~zsolnai/

## Contents

### [0:00](https://www.youtube.com/watch?v=9kllWAX9tHw) Introduction

Dear Fellow Scholars, this is Two Minute Papers with Dr. Károly Zsolnai-Fehér. I am a light transport simulation researcher by trade, and I am very happy today, because we have an absolutely amazing light transport paper we're going to enjoy together. As many of you know, we write these programs that you can run on your computer to simulate millions and millions of light rays, and calculate how they get absorbed or scattered off of the objects in a virtual scene. Initially, we start out with a really noisy image, and as we add more rays, the image gets clearer and clearer over time. We can also simulate sophisticated material models in these programs. A modern way of doing that is through material nodes.

### [0:42](https://www.youtube.com/watch?v=9kllWAX9tHw&t=42s) Material Nodes

With these, we can conjure up a ton of different material models and change their physical properties to our liking. As you see, they are very expressive indeed; however, the more nodes we use, the less clear it becomes how they interact with each other. And every time we change something, we have to wait until a new image is rendered. That is very time consuming, and more importantly, we need some material editing expertise to use this.

This concept is very powerful. For instance, if you watch the Perceptilabs sponsorship spot at the end of this video, you will be surprised to see that they also use node groups, but with theirs, you don't build material models, you build machine learning models.

What would be really cool is if we could just give the machine a photo and it would figure out how to set these nodes up so the result looks exactly like the material in the photo. So, is that possible, or is that science fiction? Well, have a look at our paper, called Photorealistic Material Editing.
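To make the node idea concrete, here is a minimal sketch of how a procedural material node graph evaluates. The node names and wiring are invented for illustration; real node systems are far richer:

```python
import math

# Each "node" is a small function; wiring nodes together forms the graph.

def noise_node(u, v, scale=8.0):
    # Cheap sine-based stand-in for a real procedural noise node.
    return 0.5 + 0.5 * math.sin(scale * u) * math.cos(scale * v)

def blend_node(a, b, t):
    # Blend node: linearly interpolate between inputs a and b.
    return a + t * (b - a)

def roughness_graph(u, v, base=0.3, variation=0.4):
    # Wire the nodes: noise modulates roughness around a base value.
    t = noise_node(u, v)
    return blend_node(base, base + variation, t)

# Evaluating the graph at a texture coordinate yields one parameter
# of the physical material model at that point.
print(roughness_graph(0.25, 0.75))
```

Changing `base` or `variation` here is the analogue of tweaking a node's sliders; the difficulty mentioned above is that real graphs chain dozens of such nodes, so the effect of any one knob becomes hard to predict.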

### [1:50](https://www.youtube.com/watch?v=9kllWAX9tHw&t=110s) Photorealistic Material Editing

With this technique, we can easily create these beautiful material models in a matter of seconds, even if we don't know a thing about light transport simulations. It does something similar to what many call differentiable rendering.

Here is the workflow: we give it a bunch of images like these, which were created on this particular test scene, and it guesses what parameters to use to get these material models. Now, of course, this doesn't make any sense whatsoever, because we produced these images ourselves, so we know exactly what parameters were used to produce them. In other words, this thing seems useless. And now comes the magic part, because we don't use these images. No-no! We load them into Photoshop, edit them to our liking, and just pretend that these images were created with the light simulation program. This means that we can create a lot of quick, really poorly executed edits. For instance, the stitched specular highlight in the first example isn't very well done, and neither is the background of the gold target image in the middle. However, the key observation is that we built a mathematical framework which makes this pretending really work! Look, in the next step, our method proceeds to find a photorealistic material description that, when rendered, resembles this target image, and it works well even in the presence of these poorly executed edits. So these materials are completely made up in Photoshop, and it turns out we can create photorealistic materials through these node graphs that look almost exactly the same. Quite remarkable. The whole process executes in 20 seconds.

If you are one of the more curious Fellow Scholars out there, this paper and its source code are available in the video description.

Now, this differentiable thing has gained a lot of steam. For instance, there are more works on differentiable rendering.
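The "find parameters whose render matches the target" loop described above can be sketched in a few lines. This is a deliberately toy version with a made-up two-pixel "renderer", not the paper's actual system: a differentiable render function maps material parameters to pixel values, and gradient descent turns the knobs until the render matches the target:

```python
def render(albedo, roughness):
    # Stand-in differentiable "renderer": two pixel values that depend
    # on the material parameters (a diffuse and a glossy response).
    diffuse = albedo * 0.8
    glossy = albedo * (1.0 - roughness)
    return diffuse, glossy

# Pretend this target came from a (possibly photoshopped) image.
target = render(0.6, 0.2)

albedo, roughness = 0.9, 0.9   # rough initial guess
lr = 0.1
for _ in range(1000):
    d, g = render(albedo, roughness)
    ed, eg = d - target[0], g - target[1]
    # Hand-written gradients of the squared pixel error; a real system
    # differentiates the full light simulation automatically.
    grad_a = 2 * ed * 0.8 + 2 * eg * (1.0 - roughness)
    grad_r = 2 * eg * (-albedo)
    albedo -= lr * grad_a
    roughness -= lr * grad_r
```

After the loop, the recovered parameters land back on the values that produced the target, which is the whole point: the optimizer, not a human, does the knob-turning.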
In this other work, we can take a photo of a scene, and a learning-based method turns the knobs until it finds a digital object that matches its geometry and material properties. This was a stunning piece of work from Wenzel Jakob and his group, of course, who else. They are some of the best in the business.

And we don't even need to be in the area of light transport simulations to enjoy the benefits of differentiable formulations. For instance, this is differentiable physics. So what is that?

### [4:27](https://www.youtube.com/watch?v=9kllWAX9tHw&t=267s) Differentiable Physics

Imagine that we have this billiard game, where we would like to hit the white ball with just the right amount of force and from just the right direction, such that the blue ball ends up close to the black spot. Well, this example shows that this is unlikely to happen by chance, and we would have to engage in a fair amount of trial and error to make it happen. What this differentiable programming system does for us is that we can specify an end state, which is the blue ball on the black dot, and it is able to compute the required forces and angles to make this happen. Very close.

So after seeing this, maybe you can guess what's next for this differentiable technique: it starts out with a piece of simulated ink with a checkerboard pattern, and it exerts just the appropriate forces so that it forms exactly the Yin-Yang symbol shortly after.

And now that we understand what differentiable techniques are capable of, we are ready to proceed to today's paper. This is a proper, fully differentiable material capture
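The billiard idea can be illustrated with a toy one-dimensional version, entirely made up for this sketch rather than taken from the system shown in the video: a ball slides with friction, and we carry the derivative of its position with respect to the initial speed through every simulation step. That derivative is what lets the optimizer solve for the strike that lands on the target:

```python
def simulate(v0, steps=100, dt=0.05, friction=0.5):
    # Forward-mode differentiable simulation of a ball sliding in 1D:
    # alongside position x and velocity v, we carry their derivatives
    # with respect to the initial speed v0.
    x, v = 0.0, v0
    dx, dv = 0.0, 1.0            # d(x)/d(v0) and d(v)/d(v0)
    for _ in range(steps):
        x += v * dt
        dx += dv * dt            # chain rule through the position update
        v *= 1.0 - friction * dt
        dv *= 1.0 - friction * dt
    return x, dx

target = 3.0                      # where we want the ball to end up
v0 = 1.0                          # initial guess for the strike speed
for _ in range(100):
    x, dx = simulate(v0)
    # Gradient descent on the squared distance to the target state.
    v0 -= 0.1 * 2.0 * (x - target) * dx
```

This is the same "specify the end state, let the system compute the inputs" pattern, just in one dimension instead of a full billiard table or a fluid simulation.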

### [5:36](https://www.youtube.com/watch?v=9kllWAX9tHw&t=336s) Differentiable Material Capture Technique for Real Photographs

technique for real photographs. All this needs is one flash photograph of a real-world material. We have those around us in abundance, and similarly to our previous method, it sets up the material nodes for it. That is a good thing, because I don't know about you, but I do not want to touch this mess at all. Luckily, we don't have to. Look! On the left is the target photo, and on the right is the initial guess of the algorithm, which is not bad, but also not very close. And now, hold on to your papers and just look at how it proceeds to refine this material until it closely matches the target. And with that, we have a digital representation of these materials. We can now easily build a library of these materials and assign them to the objects in our scene. And then, we run the light simulation program, and here we go. Beautiful.

At this point, if we feel adventurous, we can adjust small things in the material graphs to create a digital material that is more in line with our artistic vision. That is great, because it is much easier to adjust an already existing material model than to create one from scratch.

### [6:55](https://www.youtube.com/watch?v=9kllWAX9tHw&t=415s) Key Differences

So what are the key differences between our work from last year and this new paper? Our work made a rough initial guess and optimized the parameters afterwards; it was also chock full of neural networks, and it also created materials from a sample, but that sample was not a photograph, it was a photoshopped image. That is really cool; however, this new method takes an almost arbitrary photo, many of which we can take ourselves or even get from the internet, therefore this new method is more general.

It also supports 131 different material node types, which is insanity. Huge congratulations to the authors. If I were an artist, I would want to work with this right about now. What a time to be alive!

So there you go, this was quite a ride, and I hope you enjoyed it just half as much as I did. And if you enjoyed it at least as much as I did, and you feel a little stranded at home and are thinking that this light transport thing is pretty cool and you would like to learn more about it, I held a Master-level course on this topic at the Technical University of Vienna. Since I was always teaching it to a handful of motivated students, I thought that the teachings shouldn't only be available for the privileged few who can afford a college education; they should be available for everyone. Free education for everyone, that's what I want. So, the course is available free of charge for everyone, no strings attached, so make sure to click the link in the video description to get started. We write a full light simulation program from scratch there, and learn about physics, the world around us, and more.

Thanks for watching and for your generous support, and I'll see you next time!

---
*Source: https://ekstraktznaniy.ru/video/13966*