# NVIDIA’s New AI Draws Images With The Speed of Thought! ⚡

## Metadata

- **Channel:** Two Minute Papers
- **YouTube:** https://www.youtube.com/watch?v=Wbid5rvCGos
- **Date:** January 20, 2022
- **Duration:** 5:54
- **Views:** 400,124

## Description

❤️ Check out Cohere and sign up for free today: https://cohere.ai/papers

Online demo - http://gaugan.org/gaugan2/
NVIDIA Canvas - https://www.nvidia.com/en-us/studio/canvas/

📝 The previous paper "Semantic Image Synthesis with Spatially-Adaptive Normalization" is available here:
https://nvlabs.github.io/SPADE/

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Angelos Evripiotis, Benji Rabhan, Bryan Learn, Christian Ahlin, Eric Martel, Gordon Child, Ivo Galic, Jace O'Brien, Javier Bustamante, John Le, Jonas, Kenneth Davis, Klaus Busse, Lorin Atzberger, Lukas Biewald, Matthew Allen Fisher, Mark Oates, Michael Albrecht, Michael Tedder, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Peter Edwards, Rajarshi Nigam, Ramsey Elbasheer, Steef, Taras Bobrovytsky, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.
If you wish to appear here or pick up other perks, click here: https://www.patreon.com/TwoMinutePapers

Thumbnail background design: Felícia Zsolnai-Fehér - http://felicia.hu

Wish to watch these videos in early access? Join us here: https://www.youtube.com/channel/UCbfYPyITQ-7l4upoX8nvctg/join

Meet and discuss your ideas with other Fellow Scholars on the Two Minute Papers Discord: https://discordapp.com/invite/hbcTJu2

Károly Zsolnai-Fehér's links:
Instagram: https://www.instagram.com/twominutepapers/
Twitter: https://twitter.com/twominutepapers
Web: https://cg.tuwien.ac.at/~zsolnai/

#nvidia #gaugan #gaugan2

## Contents

### [0:00](https://www.youtube.com/watch?v=Wbid5rvCGos) Segment 1 (00:00 - 05:00)

Dear Fellow Scholars, this is Two Minute Papers with Dr. Károly Zsolnai-Fehér. Welcome to episode 600! Today you will see your own rough drawings come to life as beautiful photorealistic images. And it turns out, you can try it too. I’ll tell you about it in a moment.

This technique is called GauGAN2, and yes, this is really happening. In goes a rough drawing, and out comes an image of this quality. That is incredible. But there is something here that is even more incredible. What is it? Well, drawing is an iterative process. Once we are committed to an idea, we need to refine it over and over, which takes quite a bit of time, and, let’s be honest, sometimes things come out differently than we imagined. But this is different. Here, you can change things as quickly as you can think of the change. You can even request a bunch of variations on the same theme and get them right away.

But that’s not all, not even close. Get this: with this, you can draw even without drawing. Yes, really. How is that even possible? Well, if we don’t feel like drawing, we can just type what we wish to see, and, my goodness, it not only generates images according to the written description, but this description can get pretty elaborate.

For instance, we can get ocean waves, that’s great, but now let’s add some rocks, and a beach too. And there we go!

We can also use an image as a starting point, then just delete the undesirable parts and have them inpainted by the algorithm. Now, okay, this is nothing new; computer graphics researchers have been able to do this for more than 10 years. But hold on to your papers, because they couldn’t do this: we can fill in these gaps with a written description. Couldn’t witness the northern lights in person? No worries, here you go.

And, wait a second, did you see that? There are two really cool things to see here.
Thing number one: it even redraws the reflections on the water, even if we haven’t highlighted that part for inpainting. We don’t need to say anything; it updates the whole environment to reflect the new changes by itself. That is amazing. Now, I am a light transport researcher by trade, and this makes me very, very happy.

Thing number two: I don’t know if you caught this, but this is so fast that it doesn’t even wait for your full request; it updates after every single keystroke. Look. Drawing is an inherently iterative process, and iterating with this is an absolute breeze. Not “will be” a breeze. It is a breeze.

Now, after nearly every Two Minute Papers episode where we showcase an amazing paper, I get a question saying something like “okay, but when do I get to see or use this in the real world?”. And rightfully so; that is a good question. The previous GauGAN paper was published in 2019, and here we are, just a bit more than 2 years later, and it has been transferred into a real product. Not only that, but the resolution has improved a great deal, to about 4 times what it was before, and the new version also supports more materials.

And we are at the point where this is finally not just a cool tech demo, but a tool that is useful for real artists. What a time to be alive!

Now, note that earlier I did not say that iterating with this will be a breeze; it already is a breeze. Why? Well, great news: you can try it right now in two different ways.

One, it is now part of a desktop application called NVIDIA Canvas. With this, you can even export the layers to Photoshop and continue your work there. This will require a relatively recent NVIDIA graphics card.

And two, there is a web app too that you can try right now! The link is available in the video description, and if you try it, please scroll down and make sure to read the instructions and watch the tutorial video so you don’t get lost.
And remember, all this tech transfer from  paper to product took place in a matter   of two years. Bravo NVIDIA! The pace of  progress in AI research is absolutely amazing.
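The paper behind GauGAN, “Semantic Image Synthesis with Spatially-Adaptive Normalization” (linked above), builds photorealistic images from segmentation maps using SPADE layers: the feature activations are normalized, then re-scaled and re-shifted with values predicted from the label map at every pixel, so the semantic layout is re-injected at each layer. Here is a minimal NumPy sketch of that core idea; the per-class 1×1 modulation weights `w_gamma` and `w_beta` are a deliberate simplification of the paper’s small convolutional network, introduced here for illustration only.

```python
import numpy as np

def spade(x, seg, w_gamma, w_beta, eps=1e-5):
    """Minimal SPADE-style spatially-adaptive normalization (illustrative).

    x:       feature map, shape (C, H, W)
    seg:     one-hot segmentation map, shape (K, H, W)
    w_gamma, w_beta: per-class modulation weights, shape (C, K)
             (a 1x1 "convolution" standing in for the paper's conv net)
    """
    # Normalize each channel to zero mean / unit variance.
    mean = x.mean(axis=(1, 2), keepdims=True)
    std = x.std(axis=(1, 2), keepdims=True)
    x_norm = (x - mean) / (std + eps)

    # Predict a spatially varying scale and shift from the label map.
    gamma = np.einsum('ck,khw->chw', w_gamma, seg)
    beta = np.einsum('ck,khw->chw', w_beta, seg)

    # Denormalize: the segmentation map re-injects the semantic layout.
    return x_norm * (1.0 + gamma) + beta

# Tiny demo: 2 feature channels, 3 semantic classes, a 4x4 image.
rng = np.random.default_rng(0)
features = rng.standard_normal((2, 4, 4))
labels = np.zeros((3, 4, 4))
labels[0, :, :2] = 1.0   # left half: class 0 (e.g. sky)
labels[1, :, 2:] = 1.0   # right half: class 1 (e.g. water)
out = spade(features, labels,
            rng.standard_normal((2, 3)), rng.standard_normal((2, 3)))
print(out.shape)  # (2, 4, 4)
```

Because `gamma` and `beta` vary per pixel with the class labels, two regions with identical feature statistics can still be rendered differently, which is what lets a rough label drawing steer the generated image.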

### [5:00](https://www.youtube.com/watch?v=Wbid5rvCGos&t=300s) Segment 2 (05:00 - 05:54)

Thanks for watching and for your generous  support, and I'll see you next time!

---
*Source: https://ekstraktznaniy.ru/video/13692*