# Augmented Reality Presentations Are Coming!

## Metadata

- **Channel:** Two Minute Papers
- **YouTube:** https://www.youtube.com/watch?v=wVtOuvFlczg
- **Date:** 17.08.2019
- **Duration:** 1:35
- **Views:** 72,340

## Description

📝 The paper "Interactive Body-Driven Graphics for Augmented Video Performance" is available here:
https://1iyiwei.github.io/ibg-chi19/
https://hal.archives-ouvertes.fr/hal-02005318/document

❤️ Pick up cool perks on our Patreon page: https://www.patreon.com/TwoMinutePapers

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
313V, Alex Haro, Andrew Melnychuk, Angelos Evripiotis, Anthony Vdovitchenko, Brian Gilman, Bruno Brito, Bryan Learn, Christian Ahlin, Christoph Jadanowski, Claudio Fernandes, Daniel Hasegan, Dennis Abts, Eric Haddad, Eric Martel, Evan Breznyik, Geronimo Moralez, James Watt, Javier Bustamante, John De Witt, Kaiesh Vohra, Kasia Hayden, Kjartan Olason, Levente Szabo, Lorin Atzberger, Lukas Biewald, Marcin Dukaczewski, Marten Rauschenberg, Maurits van Mastrigt, Michael Albrecht, Michael Jensen, Nader Shakerin, Owen Campbell-Moore, Owen Skarpness, Raul Araújo da Silva, Rob Rowe, Robin Graham, Ryan Monsurate, Shawn Azman, Steef, Steve Messina, Sunil Kim, Taras Bobrovytsky, Thomas Krcmar, Torsten Reil, Zach Boldyga.
https://www.patreon.com/TwoMinutePapers

Splash screen/thumbnail design: Felícia Fehér - http://felicia.hu

Károly Zsolnai-Fehér's links:
Instagram: https://www.instagram.com/twominutepapers/
Twitter: https://twitter.com/karoly_zsolnai
Web: https://cg.tuwien.ac.at/~zsolnai/

#ar #metaverse

## Contents

### [0:00](https://www.youtube.com/watch?v=wVtOuvFlczg) Segment 1 (00:00 - 01:00)

Dear Fellow Scholars, this is Two Minute Papers with Károly Zsolnai-Fehér. In this series, we talk about amazing research papers. However, when a paper is published, a talk often has to be given at a conference as well. And this paper is about the talk itself, or more precisely, how to enhance your presentation with dynamic graphics. Now, such effects can be added to music videos and documentary movies, but they take a long time and cost a fortune. Not these ones, though, because this paper proposes a simple framework in which the presenter stands before a Kinect camera and an AR mirror monitor, and can trigger these cool little graphical elements with simple gestures.

A key part of the paper is the description of a user interface where we can design these mappings. This skeleton represents the presenter, who is tracked by the Kinect camera, and as you see here, we can define interactions between these elements and the presenter, such as grabbing this umbrella, pulling up a chart, and more. As you see with the examples here, using such a system leads to more immersive storytelling, and note that again, this is an early implementation of this really cool idea. A few more papers down the line, I can imagine rotatable and deformable 3D models and photorealistic rendering entering the scene…well, sign me up for that.

If you have any creative ideas as to how this could be used or improved, make sure to leave a comment. In the meantime, we are also now available on Instagram, so if you wish to see cool little snippets of our latest episodes, including this one, make sure to check us out there. Thanks for watching and for your generous support, and I'll see you next time!
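The gesture-to-graphic mappings described above could be sketched roughly like this. This is a minimal illustrative sketch, not the paper's actual implementation or API: the joint names, the `hand_above_head` predicate, and the `Mapping` structure are all hypothetical stand-ins for whatever the real system binds in its authoring UI.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

# A skeleton frame as a Kinect-style tracker might report it:
# joint name -> (x, y) screen coordinates, y growing downward.
Skeleton = Dict[str, Tuple[float, float]]

@dataclass
class Mapping:
    """Binds a gesture predicate to the graphical element it triggers."""
    gesture: Callable[[Skeleton], bool]
    graphic: str

def hand_above_head(skel: Skeleton) -> bool:
    # "Above" on screen means a smaller y value.
    return skel["right_hand"][1] < skel["head"][1]

def active_graphics(skel: Skeleton, mappings: List[Mapping]) -> List[str]:
    """Return the graphics whose trigger gestures the current frame satisfies."""
    return [m.graphic for m in mappings if m.gesture(skel)]

mappings = [Mapping(hand_above_head, "umbrella")]

raised = {"head": (0.5, 0.2), "right_hand": (0.6, 0.1)}   # hand above head
lowered = {"head": (0.5, 0.2), "right_hand": (0.6, 0.8)}  # hand at the side

print(active_graphics(raised, mappings))   # ['umbrella']
print(active_graphics(lowered, mappings))  # []
```

In a live setup this check would run once per tracked frame, so that the umbrella appears the moment the presenter raises a hand and disappears when the gesture ends.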

---
*Source: https://ekstraktznaniy.ru/video/14267*