Near-Perfect Virtual Hands For Virtual Reality! 👐


Two Minute Papers · 21.11.2020 · 399,945 views · 23,119 likes


Video description
❤️ Check out Lambda here and sign up for their GPU Cloud: https://lambdalabs.com/papers

📝 The paper "MEgATrack: Monochrome Egocentric Articulated Hand-Tracking for Virtual Reality" is available here: https://research.fb.com/publications/megatrack-monochrome-egocentric-articulated-hand-tracking-for-virtual-reality/

❤️ Watch these videos in early access on our Patreon page or join us here on YouTube:
- https://www.patreon.com/TwoMinutePapers
- https://www.youtube.com/channel/UCbfYPyITQ-7l4upoX8nvctg/join

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible: Aleksandr Mashrabov, Alex Haro, Alex Paden, Andrew Melnychuk, Angelos Evripiotis, Benji Rabhan, Bruno Mikuš, Bryan Learn, Christian Ahlin, Eric Haddad, Eric Lau, Eric Martel, Gordon Child, Haris Husic, Javier Bustamante, Joshua Goller, Lorin Atzberger, Lukas Biewald, Matthew Allen Fisher, Michael Albrecht, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Ramsey Elbasheer, Robin Graham, Steef, Taras Bobrovytsky, Thomas Krcmar, Torsten Reil, Tybie Fitzhugh.

If you wish to support the series, click here: https://www.patreon.com/TwoMinutePapers

Thumbnail background image credit: https://pixabay.com/images/id-4949333/

Károly Zsolnai-Fehér's links:
Instagram: https://www.instagram.com/twominutepapers/
Twitter: https://twitter.com/twominutepapers
Web: https://cg.tuwien.ac.at/~zsolnai/

#vr #metaverse

Table of contents (1 segment)

Segment 1 (00:00 - 04:00)

Dear Fellow Scholars, this is Two Minute Papers with Dr. Károly Zsolnai-Fehér. The promise of virtual reality, or VR, is truly incredible. If one day it comes to fruition, doctors could be trained to perform surgery in a virtual environment, we could train better pilots with better flight simulators, expose astronauts to virtual zero-gravity simulations, you name it.

An important part of many of these applications is simulating walking in a virtual environment. You see, we can be located in a small room, put on a VR headset, and enter a wonderful, expansive virtual world. However, as soon as we start walking, we experience a big problem. What is that problem? Well, we bump into things. As a remedy, we could make our virtual world smaller, but that would defeat the purpose. This earlier technique addresses the walking problem spectacularly by redirection.

So, what is this redirection thing exactly? Redirection is a simple concept that changes our movement in the virtual world so it deviates from our real path in the room, in a way that both lets us explore the virtual world and keeps us from bumping into walls and objects in reality. Here you can see how the blue and orange lines deviate, which means that the algorithm is at work. With this, we can wander about in a huge and majestic virtual landscape, or a cramped bar, even while confined to a small physical room. Loving the idea.

But there is more to interacting with virtual worlds than walking. For instance, look at this tech demo that requires more precise hand movements. How do we perform these? Well, the key is here. Controllers! Clearly, they work, but can we get rid of them? Can we opt for a more natural solution and use our hands instead?

Well, hold on to your papers, because this new work uses a learning-based algorithm to teach a head-mounted camera to tell the orientation of our hands at all times.
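The redirection idea described above can be sketched in a few lines. This is a toy illustration, not the technique from the video: it assumes the simplest form of redirected walking, a small rotation gain that amplifies the user's real turns in the virtual world, so the virtual path (blue line) gradually deviates from the real path (orange line) and the user can be steered away from physical walls. All names and the gain value are illustrative assumptions.

```python
import math

def redirected_walk(real_steps, rotation_gain=1.1):
    """Simulate redirected walking with a rotation gain.

    real_steps: list of (turn_radians, forward_metres) movement commands
                performed in the physical room.
    rotation_gain: how much each real turn is amplified in the virtual
                   world; values near 1.0 stay below the user's
                   perception threshold (illustrative assumption).

    Returns the final (x, y) positions of the real and virtual paths,
    whose divergence is what the redirection algorithm exploits.
    """
    real_x = real_y = virt_x = virt_y = 0.0
    real_h = virt_h = 0.0  # headings in radians
    for turn, dist in real_steps:
        real_h += turn
        virt_h += turn * rotation_gain  # redirection: amplified turn
        # the forward distance is the same in both worlds
        real_x += dist * math.cos(real_h)
        real_y += dist * math.sin(real_h)
        virt_x += dist * math.cos(virt_h)
        virt_y += dist * math.sin(virt_h)
    return (real_x, real_y), (virt_x, virt_y)
```

With a gain of exactly 1.0 the two paths coincide; with a gain slightly above 1.0, a user walking a square in the room traces a slowly rotating path in the virtual world, which is how the blue and orange lines in the demo come to deviate.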
Of course, the quality of the execution matters a great deal, so we have to ensure at least three things.

One is that the hand tracking happens with minimal latency, which means that we see our actions immediately, with minimal delay.

Two, we need low jitter, which means that the keypoints of the reconstructed hand should not change too much from frame to frame. This happens a great deal with previous methods, and what about the new one? Oh yes, much smoother. Checkmark!

Note that the new method also remembers the history of the hand movement, and can therefore deal with difficult occlusion situations. For instance, look at the pinky here! A previous technique would not know what's going on with it, but this new one knows exactly what is going on, because it has information on what the hand was doing a moment ago.

And three, this needs to work in all kinds of lighting conditions. Let's see if it can reconstruct a range of mythical creatures in poor lighting. Yes, these ducks are reconstructed just as well as the mighty pokemonster, and these scissors too. Bravo!

So, what can we do with this? A great deal. For instance, we can type on a virtual keyboard, or implement all kinds of virtual user interfaces that we can interact with. We can also organize imaginary boxes, and of course, we can't leave out the Two Minute Papers favorite: going into a physics simulation and playing with it.

But of course, not everything is perfect here. Look. Hand-hand interactions don't work so well, so folks who prefer virtual reality applications that involve washing their hands should look elsewhere. But, one step at a time.

Thanks for watching and for your generous support, and I'll see you next time!
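The jitter criterion, and the idea of using movement history, can be made concrete with a toy sketch. This is not the paper's tracker; it assumes hand keypoints are 2-D points, measures jitter as the mean frame-to-frame keypoint displacement, and uses a simple exponential smoother as a stand-in for "remembering the history of the hand movement". The `alpha` blending weight is an illustrative assumption.

```python
import math

def jitter(frames):
    """Mean frame-to-frame keypoint displacement; lower is smoother.

    frames: list of frames, each a list of (x, y) keypoints.
    """
    total, count = 0.0, 0
    for prev, cur in zip(frames, frames[1:]):
        for (px, py), (cx, cy) in zip(prev, cur):
            total += math.hypot(cx - px, cy - py)
            count += 1
    return total / count

def smooth(frames, alpha=0.5):
    """Blend each raw detection with the previous smoothed estimate.

    Carrying the previous estimate forward both damps jitter and keeps
    a plausible position for a joint (like the occluded pinky) during
    brief detection dropouts.
    """
    out = [list(frames[0])]
    for frame in frames[1:]:
        prev = out[-1]
        out.append([((1 - alpha) * cx + alpha * px,
                     (1 - alpha) * cy + alpha * py)
                    for (px, py), (cx, cy) in zip(prev, frame)])
    return out
```

Feeding a noisy keypoint track through `smooth` measurably lowers its `jitter` score, which is the "checkmark" criterion above in miniature; a real system would use a learned temporal model rather than a fixed blend.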
