# SORA Demo FAKED? Elon Musk’s 18 billion. New AI Characters, New Humanoid Robot

## Metadata

- **Channel:** TheAIGRID
- **YouTube:** https://www.youtube.com/watch?v=yG4dUQk0SKM
- **Date:** 29.04.2024
- **Duration:** 19:50
- **Views:** 80,276
- **Source:** https://ekstraktznaniy.ru/video/14361

## Description

Link to the website:
https://www.talkie-ai.com/

Discord: https://discord.com/invite/talkieai

TikTok: https://www.tiktok.com/@talkiedoki

Reddit: https://www.reddit.com/r/TalkieOfficial/

Instagram: https://www.instagram.com/talkie_app/

X/Twitter: https://twitter.com/Talkie_APP

#Talkie #TalkieAI
How To Not Be Replaced By AGI https://youtu.be/AiDR2aMye5M
Stay Up To Date With AI Job Market - https://www.youtube.com/@UCSPkiRjFYpz-8DY-aF_1wRg 
AI Tutorials - https://www.youtube.com/@TheAIGRIDAcademy/ 

🐤 Follow Me on Twitter https://twitter.com/TheAiGrid
🌐 Checkout My website - https://theaigrid.com/

Links From Today's Video:
https://twitter.com/gdb/status/1783234941842518414/photo/1 
https://finance.yahoo.com/news/jensen-huang-elon-musk-openai-182851783.html 
https://twitter.com/perplexity_ai/status/1783559575972524389 
https://twitter.com/DylanIskandar0/status/1783342877722210473 
https://twitter.com/synthesiaIO/status/1783535861214204368 
https://www.thebaltimorebanner.com/education/k

## Transcript

### Intro [0:00]

So, with another rather fascinating day in artificial intelligence, let's take a look at some of the most interesting stories that you probably want to know about. One of the stories I picked up was about Sora. If you don't know, Sora is OpenAI's video generation tool, and what people found is actually pretty interesting. I don't think this is a big deal in terms of how bad it is, but some people think it's a crazy deal. Essentially, a behind-the-scenes look at the Sora clip was published. If you haven't watched it, there was a Sora clip called "Air Head" (or balloon head, or whatever it was called), and in that video it looked really good. But the main issue people are having is that the video was edited a bit more than expected, and the clip wasn't entirely AI-generated: there was some rotoscoping and some manual editing. For some reason people are having a go at this and saying the AI just isn't good enough.

I'm going to include the video so you know what it is. You can see in the trailer that it's about a guy whose head is entirely a balloon, an interesting concept that we wouldn't really be able to film if we didn't have AI generation tools where we can literally generate stuff like this on a whim. The article in question essentially just breaks down the behind-the-scenes, and the funny thing is that it isn't a dig at OpenAI or Sora; it's literally just a behind-the-scenes piece where you can understand how it works. It says that the user interface allows artists to input a text prompt; OpenAI's ChatGPT then converts it into a longer string, which triggers the clip generation. At the moment there is no other input, so you can't even put in images or reference material. They also explain that describing the characters' wardrobe, as well as the type of balloon, was their way around consistency, because for shot-to-shot generation there isn't a feature yet for full character consistency.

This is where many people were saying, "Oh, Sora is so bad, it's so awful." It's because initially, when some of the clips were generated, there would be this face, which you can see has some resemblance to a human, or to some weird plastic creature, which of course wouldn't be good to have in the final clip. That's why a lot of people are stating that this kind of generative AI technology is awful. You can also see another clip that was generated, where the color has changed a little, and there was a head that basically had to be rotoscoped out and brushed out of the scene. You can see right here that it says, under "Roto": while all imagery was generated in Sora, the balloon still required a lot of post-work. In addition to isolating the balloon so that it could be recolored, it would sometimes have a face on Sonny, as if his face was drawn on with a marker; this would then be removed in After Effects. And of course there were some other artifacts.

I don't think that's the most interesting thing. A lot of people are taking this and stating, "Look, Sora is awful because it generates inconsistencies, there were these mistakes in there." I don't understand why people are saying that when OpenAI themselves literally stated that Sora isn't perfect and does have its own issues. The thing I think most people should be paying attention to is the details around Sora, because this is a highly exclusive technology that we haven't even had access to, with things we'd never been able to look at before, like a screenshot of the user interface.

The thing I wanted to note here was, of course, the render time. You can see here that it says clips can be rendered in varying segments of time, such as 3 seconds, 5 seconds, 10 seconds, 20 seconds, up to a minute, and render times vary depending on the day and demand for cloud usage. It says: generally, you're looking at about 10 to 20 minutes per render, and from their experience, the duration chosen to render has only a small effect on the render time; whether it's 3 or 20 seconds, the render time tends not to vary much from the 10-to-20-minute range. They would generally go for the full 20 seconds because that gives more opportunities to slice and edit, increasing the chances of getting something that looks good.

This is actually pretty interesting, because it gives us some more details on how long it takes to generate a clip with Sora. I wouldn't say 10 to 20 minutes is insane, but something we now know is that this is pretty clearly far away from wide-scale usage, because I'm guessing the inference cost for this must be truly incredible; 10 to 20 minutes for a render is pretty crazy. But as someone who has done work in that industry before, I will say this: yes, 10 to 20 minutes does seem crazy, because when we enter text right now we get a response almost instantly, and for an image we wait maybe 10 seconds while the diffusion models generate it, so when people think about video they go, "Whoa, why does a video take 10 to 20 minutes?" I've got to be honest, guys: if you know a lot about VFX, you'll know that visual effects are incredibly time-consuming, something you have to get right. If, with Sora 4 or Sora 5 or whatever version, we're able to get output that's really accurate with character consistency, it's going to change the game, especially if the render times come down. Render times are absolutely insane, and companies shell out big bucks to render farms, where they have stuff rendering around the clock, in order to get their products done on schedule. A lot of the time, some movies barely get finished because the VFX artists are breaking their necks trying to get everything done on time. So I think this could be a complete game changer if there's a more efficient architecture, or if, like I said before, they scale up their compute, which is of course something they did actually talk about, so I wouldn't be surprised if they did that. Now, here we have Elon Musk,
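The render-time figures quoted in this section can be turned into a rough throughput estimate. Here is a minimal back-of-the-envelope sketch in Python, assuming only the 10-to-20-minute range from the article and treating one render slot as fully occupied around the clock; the per-slot hardware and real parallelism are unknown, so these numbers are purely illustrative:

```python
# Back-of-the-envelope: how many Sora clips could one render slot
# produce per day, given the 10-20 minute render times quoted above?
# All figures are illustrative assumptions, not OpenAI data.

MINUTES_PER_DAY = 24 * 60


def clips_per_day(render_minutes: float) -> float:
    """Clips a single, fully utilized render slot yields in 24 hours."""
    return MINUTES_PER_DAY / render_minutes


best_case = clips_per_day(10)   # 10-minute renders
worst_case = clips_per_day(20)  # 20-minute renders

print(f"best case:  {best_case:.0f} clips/day")   # 144
print(f"worst case: {worst_case:.0f} clips/day")  # 72
```

Even in the best case, a single slot yields only around 144 short clips a day, which is the arithmetic behind the point above: wide-scale availability would demand either far more compute or a much more efficient architecture.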

### Elon Musk Raises 6 Billion [5:42]

close to raising $6 billion from Sequoia and others. I saw some tweets where people were surprised about this, and I don't think it's surprising at all, because, like I said already on Twitter, this is a case of raising money based on future valuations. We can argue semantics here, but the point is that yes, the company is raising $6 billion at a valuation of $18 billion, but I think most people investing are betting on what Elon Musk is going to do over the next 10 years, because he does have a track record of actually completing the things he sets out to do. Of course, that can be argued too, but I think we all know that AI is going to be the most transformative technology, and if Musk sets his mind to something, it's pretty rare that he doesn't succeed. Of course there are delays, and some technologies do require safety work and all that, so I always think delays are necessary. But an $18 billion valuation? I don't think that's crazy, considering it's Elon Musk. It's not some random startup started by a different group of people, even ones who may have worked at the top AI labs; Elon Musk is a household name. You have to understand that if he were to take this stock public, he has huge influence, which means he can command a huge valuation.

This is of course for xAI, and I think xAI is definitely going to be working closely with Tesla on some of their AI integrations. The scale of money and compute you need to compete with the big players just goes to show: he's raising $6 billion, and while he does have billions of dollars across his other companies, those are also burning millions and billions of dollars too, so he's going to need some serious investment here. Like I said, Meta is not playing any games; Meta has ordered an enormous number of GPUs. Google's compute capacity is the biggest at the moment; they have the most compute of any of the top companies. And OpenAI is partnering with Microsoft on a $100 billion supercomputer that's being built as we speak. So I think if he wants any chance at competing in AI, this is clearly the way to go. A lot of people are saying AI is in a bubble, "look at this insane valuation." I don't think so. I think this is actually where the valuation lies, in the sense that, so far, they've done really decently on the benchmarks, and to be honest with you guys, they haven't slacked off either: they are a small team compared to the others, and they've moved pretty quickly. This is of course something we've talked about before, but it seems the deal is probably going to be finalized very soon. I know a lot of people are giving Elon flak for this, but it's not something I think is too crazy. Speaking of chatbots, this actually

### Talkie AI [8:27]

brings me to today's sponsor, which is Talkie AI. This is a rather interesting website where you can talk to AI avatars playing many different characters. Not only is this a web app, there are also iOS and Android applications, which makes it really convenient. There are some unique features that make it very cool to use. The first is that it is completely free; there are also nice voices, and there are many different characters to talk to, so you pretty much never run out of people to talk to. You can see right here that we are on the Discover page, where you can discover any character you want to talk to. I'm someone who, I guess you could say, is a little bit weird, so I actually spent a lot of my time talking to inanimate objects like cheese, and of course inanimate objects like the universe: "Hello, I am the universe, ask any question." There's also a search page, and this is actually where I found some of the more interesting characters. In my free time I play Call of Duty, so I scrolled down to some of those characters, and we've been going back and forth about why I'm terrible at the game. One of the coolest things about these characters is that they have custom voices, so it doesn't just feel like you're talking to a normal chatbot; it actually feels like a real person. "Where is Captain Price?" You can see he just asked me where Captain Price is, and honestly, I don't know. "Don't lie to me." What's also cool is that if you don't like a character's response, you can change it and pick a response you prefer, in order to take the conversation in a completely different direction. You can see here that you can change it to pretty much whatever you want. And this is me talking to an Overwatch character that I'm guessing some of you probably recognize. So, that being said, I'm going to continue talking to strange characters and inanimate objects; let me know if you find this interesting. So, actually,

### AI Avatars [10:13]

we have something that I've been wanting for ages, and it's finally here. Basically, they say that AI avatars aren't great actors: they don't understand what they're saying or how they should say it. Now there's EXPRESS-1, where they've trained AI avatars to perform with tone of voice, facial expressions, and body language. Take a look at this trailer, because I actually do have a lot to say. I think this is important for business use (of course, not for deepfakes and things like that), because it brings us to another level of realism. "Hello everyone, I'm very happy to showcase new features and improvements from the research team for the epic Synthesia V4 avatar release. Now focus on my lips: they're snappier and align more precisely with each word I say to you. And my voice, do you hear the difference? I sound more like an engaging, energetic presenter, a noticeable shift from the typical monotone style, right? This is just a glimpse of the advancements we'll bring into the world. We're dedicated to enhancing every aspect of our technology to provide..." Now, I don't think this demo actually showcases how good the technology is. "Very happy." I think

### My Thoughts [11:22]

this clip here actually does, so just take a look at it: "I am very happy." "I am so upset." Now, I've got to be honest, the voices don't sound that great, but there's something I currently don't understand, and if we apply Occam's razor, it's probably simple: software like ElevenLabs and other companies in this space really should let you add emotions to the text you provide, because a lot of the time what makes a voice sound real is emotion. For example, if I say, "Oh, I'm really happy," that sounds genuine, but if I say it flatly, "I'm really happy," there's sarcasm in it. I think the subtle nuances are going to be pretty difficult for AI to emulate. I guess it's kind of a good thing that AI can't emulate those things, because with respect to deepfakes and the like, that's of course good. But I do think, in the future, for someone who uses these AI tools for business videos, training videos, and things like that, it would just help a lot. The problem is not deepfakes; the problem is that when voices sound monotone, they just lose your attention. It's honestly something I try to keep in mind when I'm speaking, because I don't want to sound flat, like, "Okay, this is what happened today." I have to speak loudly, then quietly, sometimes slow down, sometimes change the pitch of my voice, so that people actually pay attention. Then we have

### Phoenix Generation 7 [12:49]

"Phoenix Generation 7: major improvements, all delivered in less than one year. Our next generation of general-purpose technology marks an inflection point in task automation." So we can see here that this is their new robot. Sanctuary AI, if you didn't know, are a company that produces robots that are pretty much general purpose. They showcased a demo earlier this year that I think was quite understated, because it showed full autonomy in certain tasks and was pretty good, and I think upgrading their robot is a good step towards where we're trying to get to with general-purpose humanoid robots. I do wonder, though, whether they're going to do some more demos, and, combined with their vision system, which is now on Generation 7, whether they're going to add some of the things that other competitors are doing, like reasoning via a large language model, or using an actual base model. One of the things we don't see from Sanctuary at the moment is legs on this robot. Now, while, yes (let me state this for a fact), the model does actually have legs, the problem is that we haven't actually seen them yet in any kind of demo, so I'm guessing it might be a little bit harder than it looks. Now, this is

### Fake AI Audio [14:04]

something that is pretty crazy, and pretty scary. Essentially, a Baltimore County principal was seemingly caught on recorded audio making blatantly racist and antisemitic comments, but after an investigation, the audio turned out to be fake: AI-generated, and a plot by the school's former athletic director. I'm not going to play the clip here because it's explicit, I guess you could say, and I don't want the video to get demonetized, so you can watch it yourself. But the point is, I think this is pretty insane, because the audio itself was actually realistic; when I heard it myself, I thought, "Wow, that sounds super realistic." They clearly had a professional edit it, because it wasn't just something copy-and-pasted from ElevenLabs. I'm guessing maybe they used some open-source software, because usually with ElevenLabs, the company can check the history of what people have generated and then easily find the person who generated that audio. I think this is pretty scary, because you can see here it says it not only led to his temporary removal from the school but also triggered a wave of hate-filled messages on social media and numerous calls to the school; the recording also caused significant disruption for the staff and the students. So it was pretty crazy, guys. This is absolutely incredible, because it's one of the first cases where a story has gone viral in which AI technology has been used this way, which is unfortunately something we had predicted, and it's pretty awful. It's quite frustrating, because while people like us just want to use the technology for enjoyment, entertainment, work, productivity, whatever, there are people, like I've stated before, who will use it for their own gain, for whatever reason. So I think this is going to set a different precedent, where we're going to have to use some kind of technology to potentially verify whether or not clips are AI-generated. Maybe there's going to be some new legislation that any software that operates and broadcasts services to the United States must have some kind of synthetic ID, so that audio can be replayed and understood to be AI-generated. Because stuff like this, I've got to be honest, guys, just isn't good for the future. We can all agree on that: you don't want to be someone who's completely innocent, and then someone gets a clip of you from a WhatsApp voice note, or from a video you made on Twitter, or a Snapchat you posted five years ago, and has you fired and your entire life disrupted, which is pretty incredible. Now, Perplexity
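The "synthetic ID" idea mentioned in this section can be illustrated with a toy provenance check. This is a deliberately simplified sketch using only the Python standard library: a hypothetical generator tags each output's bytes with an HMAC under a key it holds, and a verifier later confirms the tag, marking the clip as machine-generated. Real proposals (for example, cryptographic content credentials) are far more involved, and every name and key here is made up for illustration.

```python
# Toy "synthetic ID" sketch: an AI audio generator tags its output so the
# clip can later be identified as machine-generated. Hypothetical scheme,
# stdlib only; real provenance standards are far more elaborate.
import hashlib
import hmac

GENERATOR_KEY = b"demo-generator-secret"  # held by the (hypothetical) vendor


def tag_output(audio_bytes: bytes) -> str:
    """Produce a synthetic-ID tag for a generated clip."""
    return hmac.new(GENERATOR_KEY, audio_bytes, hashlib.sha256).hexdigest()


def verify_tag(audio_bytes: bytes, tag: str) -> bool:
    """Check whether a clip carries a valid tag from this generator."""
    return hmac.compare_digest(tag_output(audio_bytes), tag)


clip = b"...synthetic waveform bytes..."
tag = tag_output(clip)
print(verify_tag(clip, tag))              # True: recognized AI output
print(verify_tag(clip + b"edit", tag))    # False: tampered or unknown clip
```

Note the obvious limitation, which is also why this is only a sketch: an absent or stripped tag proves nothing, so schemes like this identify cooperating generators rather than catching bad actors who use unwatermarked open-source tools.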

### Perplexity Voice [16:34]

actually added voice, available for iOS and Pro users, and I think this is going to be a really good change. If you don't know what Perplexity is (and they're not paying me to say this), it's something that has saved me hours and hours of research time. Sometimes it doesn't find the answer, but I've got to be honest with you guys, this is a fundamentally game-changing tool. I'm not sure why Google hasn't bought them yet, because a lot of people who use them don't really have bad things to say about them. Some people will say they're just a Google wrapper, but honestly I don't care what they are, because ease of use is the biggest thing. A lot of people will say, "Oh, their product is just a wrapper," but like I said, it doesn't matter: if the product is easy to use, and people love it and want to use it, then that's it, the product has done what it needed to do, and this is a product that has really changed the game. So, like I said before, niche products that wrap certain LLMs will provide a substantial boost for the underlying companies, in the sense that their LLMs get used a lot more. This is something that showcases how effective LLMs are in research, and if you haven't used it, definitely try it out, especially the free version. Then,

### Nvidia DGX H200 [17:33]

we can see here, and this is a pretty iconic, I guess you could say nostalgic, moment: the first Nvidia DGX H200 in the world, hand-delivered to OpenAI and dedicated by Jensen "to advance AI, computing, and humanity." This is pretty crazy because there was something like this before: here we can see where Jensen Huang did this a few years ago, and you can see that Musk was there, at the time OpenAI was founded. You can see it says, "To Elon Musk and the OpenAI team: to the future of computing and humanity, I present to you the world's first DGX-1." And we can see now that he's doing it again, so it's a nice little callback that I think is pretty interesting. What's funny is that so many people were asking, "Where's Ilya Sutskever?" because that's a burning question in many people's minds. But as I've said before, it seems he just doesn't want the limelight; his lawyers have probably told him to stay silent. I mean, I genuinely have no idea; this is literally pure speculation. It's clear that he may not want to say anything, or, honestly, I have no idea. I really don't want to speculate too much, because I don't want certain ideas floating around based on where he may or may not be. "Because I think we, like, owe ourselves and the people of the future a better world. I think we should go off and figure out how to give everybody on Earth a great education, and cure every disease, and have great entertainment, and go explore space, and figure out new physics, and create more abundance, because I think we, like, owe ourselves..." So yeah, that was Sam Altman speaking at Stanford. There was a lot of stuff he said at Stanford, but unfortunately it was a private talk, and so many people did want to go in and talk to Sam Altman while he was there. It's understandable why: it's not that the future of the world is entirely in his hands, but there's a lot at stake here. He did say that a new model was certainly coming this year, so, along with some of the leaks we've heard, June is probably the month we're going to get a new model. It may be GPT-5, it may not be GPT-5, but according to Bloomberg and several pretty good sources, that is apparently when we're probably going to see a new model, so that's definitely something to keep your eye out for.
