❤️ Check out Lambda here and sign up for their GPU Cloud: https://lambda.ai/papers
📝 Unreal Engine 5.7 is available here:
https://www.unrealengine.com/en-US/news/unreal-engine-5-7-is-now-available
Sources:
https://www.youtube.com/watch?v=Mj_-2SdsYLw
https://www.youtube.com/watch?v=ngzPTqtZWo4
https://advances.realtimerendering.com/s2023/2023%20Siggraph%20-%20Substrate.pdf
📝 My paper on simulations that look almost like reality is available for free here:
https://rdcu.be/cWPfD
Or here is the original Nature Physics link with clickable citations:
https://www.nature.com/articles/s41567-022-01788-5
🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Benji Rabhan, B Shang, Christian Ahlin, Gordon Child, Juan Benet, Michael Tedder, Owen Skarpness, Richard Sundvall, Steef, Taras Bobrovytsky, Tybie Fitzhugh, Ueli Gallizzi
If you wish to appear here or pick up other perks, click here: https://www.patreon.com/TwoMinutePapers
My research: https://cg.tuwien.ac.at/~zsolnai/
X/Twitter: https://twitter.com/twominutepapers
Thumbnail design: Felícia Zsolnai-Fehér - http://felicia.hu
Contents (2 segments)
Segment 1 (00:00 - 05:00)
Amazing day! Unreal Engine 5.7 is officially, finally here. It is an all-in-one solution for creating animations, virtual characters, video games, movies, you name it. This is a blockbuster release, and we will look at three incredible things it can do. And before we start, don’t forget: all this is available for free for most people and applications. One, let’s start with my favorite! This is Substrate, a material creation system, and the realism it can bring to our screens is next level. Why? Well, first, Unreal can run a light simulation, with millions and millions of light rays hitting this object, and Substrate tells us how the material should respond to that. That already sounds like science fiction, but here is where it gets better: you can set up this imaginary object with whatever material properties you can imagine. And even better: you can do multiple layers of this. That is where realism gets to the next level. With Substrate you can define a piece of metal as the core of the object, put a smooth colored coat over it, and then simulate how light bounces between these layers. Incredible. For the longest time, this sounded like a fun little science experiment, lots of research papers that I absolutely love, but let’s be honest: it still was not practical enough for the industry. And get this: Substrate is now finally production ready. I am a light transport researcher by trade, and I spent so many years learning about these multi-layer simulators; to see the papers finally come alive for millions of people, for free, I am floored. Loving this. Note that we are not affiliated with them in any way. But two, all this light simulation isn’t worth anything if the underlying geometry is not detailed enough. I don’t just want a bunch of spheres, I want a fully detailed forest with all kinds of plants everywhere. That’s what the new Nanite Foliage helps with.
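For the curious Scholars, here is a tiny toy version of the layering idea: light either reflects off the coat, or passes through it, bounces off the metal base, and passes back out. This is my own sketch using Schlick’s Fresnel approximation, not Substrate’s actual shading model, and the function names are made up for this illustration.

```python
# Toy two-layer material: a smooth coat over a metal base.
# Illustrative sketch only (Schlick's Fresnel approximation);
# this is NOT Substrate's actual shading model.

def schlick_fresnel(f0, cos_theta):
    """Schlick's approximation: reflectance at a given viewing angle.
    f0 is the reflectance when looking straight at the surface."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def layered_reflectance(cos_theta, coat_f0=0.04, metal_f0=0.9):
    """Energy reflected by the coat, plus energy that enters the coat,
    reflects off the metal base, and exits through the coat again."""
    coat = schlick_fresnel(coat_f0, cos_theta)
    into_coat = 1.0 - coat                    # fraction transmitted inward
    base = schlick_fresnel(metal_f0, cos_theta)
    # Single inter-layer bounce: through the coat, off the base, back out.
    return coat + into_coat * base * into_coat

# The coat alone reflects far more light at grazing angles:
assert schlick_fresnel(0.04, 0.1) > 10 * schlick_fresnel(0.04, 1.0)
```

A real layered simulator tracks many inter-layer bounces, roughness, and absorption inside the coat; this sketch keeps only the single-bounce skeleton of the idea.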
This can render millions of tiny little elements on your screen, but it also does something we call level of detail really well. Level of detail means the engine quietly swaps between simpler and more complex versions of the same object depending on how far you are from it. If you are far away, it just renders a few triangles; you don’t need so much detail anyway. And whenever you go closer, it suddenly swaps in the more detailed geometry. Okay, but this concept has a flaw. It is clear as day to all of you Fellow Scholars that we get these popping artifacts when the swaps happen. Now, hold on to your papers, Fellow Scholars, and have a look at the new system. So, did you see it? Did you see a pop? Not a chance, right? And that’s the point. Under the hood, there is lots of magic to help your machine calculate as little as possible, and it swaps the geometry so seamlessly that you don’t even see it working. And yet, it saves you tons and tons of resources. Brilliant work! But we are not done yet. Not even close! So, we have good light transport and good geometry. But that is still not nearly enough to get a good-looking scene. What about light sources? Well, there is a nice surprise there too. Dear Fellow Scholars, this is Two Minute Papers with Dr. Károly Zsolnai-Fehér. Light sources are where everything starts, after all, are they not? Now, three: this is MegaLights, a system that gives you hundreds and hundreds of these beautiful lights in a black market scene, each of which casts its own shadows. And all this in real time. And we are talking proper soft shadows. And it just got better: it can now give you directional lights, shadow-casting particles, and shadowing on hair. Also, higher visual quality, better performance, and less noise. This is incredibly difficult, because during a light simulation, each of your rays has to find a light source.
But most rays don’t do that; it’s like throwing thousands and thousands of darts into the darkness, hoping to hit a small glowing target. There are so many ray tracing papers on how to do this more efficiently, and you see a masterclass of that in Unreal Engine. And MegaLights is finally moving from experimental to beta in this release.
Segment 2 (05:00 - 07:00)
It is much more stable than before, though still not final. But I am so happy to see that the papers are coming alive and are put into the hands of millions of us Fellow Scholars. So cool! But we are not done yet. This was a Scholarly look at three core technologies, but three other amazing features also really caught my eye. Let’s have a quick look. One, an amazing update to MetaHuman. MetaHuman is a realistic person creator, if you will. This is the culmination of a ton of papers that we talk about here. For instance, the hair has to be simulated correctly, often strand by strand. The appearance of the skin has to be correct, and deformations of the skin also have to be simulated correctly. And earlier, they showcased MetaHuman Animator, where we can scan ourselves and enter a virtual world as a video game character. And this character is able to mimic our gestures too. I was thinking that all sounds fantastic, but now, put your mouth where your papers are! Let us do it in real time. That is the real test. And, here it is. I can’t believe it. Live Link Face lets you capture your facial expressions in real time through a camera and put them on the video game character. Two, you now have better ways to get a great virtual haircut: you just pull these sliders around and assemble a completely new one. And you can also animate it as if it were the body of a character, with these little imagined joints. Really cool idea. Three, more realistic physics interactions for characters, so you can feel like a real computer graphics researcher and do these…perturbation tests that we love to do. Once again, just thinking that all this is free for most people and projects. Goodness. It is all available through the link in the video description. Check it out, and let me know in the comments what crazy projects you use Unreal Engine for. I’d love to hear that. And make sure to check out Lambda too, they are amazing and they make Two Minute Papers possible.
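P.S. for the coding Scholars: the dart-throwing problem from the MegaLights part of the video is, at its core, a many-light sampling problem. Here is a tiny toy sketch of the classic importance-sampling idea behind it, picking lights in proportion to an estimated contribution instead of uniformly at random. This is my own illustration (the `light_weight` and `sample_light` helpers are made up for this sketch), not Epic’s actual MegaLights algorithm.

```python
import random

# Toy many-light sampling: instead of picking one of hundreds of lights
# uniformly (throwing darts into the darkness), pick lights in proportion
# to their estimated contribution at the shading point. A classic
# importance-sampling idea, NOT Epic's actual MegaLights algorithm.

def light_weight(light, point):
    """Rough contribution estimate: intensity over squared distance."""
    dx = light["pos"][0] - point[0]
    dy = light["pos"][1] - point[1]
    dist_sq = max(dx * dx + dy * dy, 1e-6)
    return light["intensity"] / dist_sq

def sample_light(lights, point, rng=random):
    """Pick one light with probability proportional to its weight.
    Returns (light, pdf) so an estimator can divide by the pdf
    and stay unbiased."""
    weights = [light_weight(l, point) for l in lights]
    total = sum(weights)
    r = rng.random() * total
    for light, w in zip(lights, weights):
        r -= w
        if r <= 0.0:
            return light, w / total
    return lights[-1], weights[-1] / total  # numerical safety net

lights = [
    {"pos": (0.0, 1.0), "intensity": 100.0},   # bright light near the point
    {"pos": (50.0, 50.0), "intensity": 1.0},   # dim light, far away
]
chosen, pdf = sample_light(lights, (0.0, 0.0))
```

With these two lights, the near, bright one is chosen almost every time, so far fewer "darts" are wasted on lights that barely contribute. Real engines add visibility, shadows, and smarter data structures on top of this skeleton.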