# Unreal Engine 5: Next Level Games Are Coming!

## Metadata

- **Channel:** Two Minute Papers
- **YouTube:** https://www.youtube.com/watch?v=dVgx3uJuHOE
- **Date:** May 31, 2023
- **Duration:** 5:22
- **Views:** 208,916
- **Source:** https://ekstraktznaniy.ru/video/13151

## Description

❤️ Check out Weights & Biases and sign up for a free demo here: https://wandb.com/papers 
❤️ Get about $50 off from an upcoming W&B event in San Francisco! - https://shorturl.at/brtIQ

My latest paper on simulations that look almost like reality is available for free here:
https://rdcu.be/cWPfD 

Or this is the orig. Nature Physics link with clickable citations:
https://www.nature.com/articles/s41567-022-01788-5

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Benji Rabhan, Bryan Learn, B Shang, Christian Ahlin, Eric Martel, Geronimo Moralez, Gordon Child, Jace O'Brien, Jack Lukic, John Le, Jonathan, Kenneth Davis, Klaus Busse, Kyle Davis, Lorin Atzberger, Lukas Biewald, Martin, Matthew Valle, Michael Albrecht, Michael Tedder, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Rajarshi Nigam, Ramsey Elbasheer, Richard Sundvall, Steef, Taras Bobrovytsky, Ted Johnson, Thomas K

## Transcript

### Intro [0:00]

Dear Fellow Scholars, this is Two Minute Papers with Dr. Károly Zsolnai-Fehér. Today we are going to talk about MetaHumans. What are those? Well, see here for yourself. This will be a culmination of a ton of papers that we talk about here. For instance, the hair has to be simulated, often strand by strand. The appearance of the skin has to be correct, and deformations of the skin also have to be simulated correctly. And here comes the most challenging part: all this on a shoestring computational budget, or in other words, yes, in real time. Is this even possible? Well, let's see together!

Yes, this is MetaHuman, part of Unreal Engine 5, a popular game development platform where we can create virtual humans like we would in a video game, with the difference that these are incredibly high-quality models. Today, millions of these MetaHumans already exist. But that is

### Metahuman [1:00]

nothing compared to what you are about to see now. They have now introduced MetaHuman Animator, with which we can scan ourselves and enter a virtual world. So they say, so let's have a look together!

First, we capture ourselves, and by that I mean our appearance and gestures too. Then, after less than a minute of processing, it is able to locate and track facial landmarks, like the eyes, eyebrows, and mouth, and it can show us the DNA of our virtual character. This can mimic our gestures really well already. But I hear you saying, okay Doc, but it doesn't look like the test subject at all. So now, hold on to your papers, Fellow Scholars, and watch this. Wow, now we're talking! That is simply incredible. This did not require the expertise of a group of artists, a huge studio with a huge budget, and it did not even require months of work. It is done right as we're talking, in minutes. What a time to be alive!

And if that is not impressive enough, it gets even better! We can also transfer these gestures to a variety of different characters. That is excellent in and of itself, but just imagine how we could make completely virtual characters and life forms come to life with our own gestures. How cool is that?

And here are a few more features that I absolutely loved. We can design the full body of our characters, we can even choose clothes for them, and if we are not happy with our character, we can even make small adjustments to it.

Now, after creating or scanning these characters, they are allowed to enter a virtual world.

### Movement [3:00]

However, when they move around in this world, their movement and, as a result, the deformations of their muscles have to be simulated. This can be done with a coarse simulator that runs in real time; however, those are not nearly as good as these. Now, these are absolutely amazing. However, these simulations take forever. And I really mean forever: these simulations are measured not in frames per second, but in seconds per frame, and it gets even worse, sometimes even minutes per frame. This is the price to be paid for these super accurate simulations. Until now. You see, these simulations can also be done super quickly with learning-based methods, and now, these also run in real time. Just look at how realistic this

### Garment Simulation [4:00]

movement is. And this will run on our computers, and later, even on our phones. And get this: this also applies to garment simulations. I have highlighted the relevant regions where the new, neural network-based technique generates these really cool folds on the cloth. Wow. As you see, we are living in incredible times, and this will take gaming, animated movies, virtual worlds, and perhaps even conference calls to the next level.

### Outro [5:00]

Thanks for watching and for your generous  support, and I'll see you next time!
