OPEN-AI'S FIRST PHYSICAL ROBOT SHOCKS The Entire Industry! (FINALLY ANNOUNCED!)
10:43

TheAIGRID · 04.04.2023 · 378,874 views · 2,053 likes


Video description
OPEN-AI'S FIRST PHYSICAL ROBOT SHOCKS The Entire Industry! (FINALLY ANNOUNCED!) Welcome to our channel where we bring you the latest breakthroughs in AI. From deep learning to robotics, we cover it all. Our videos offer valuable insights and perspectives that will expand your knowledge and understanding of this rapidly evolving field. Be sure to subscribe and stay updated on our latest videos. https://www.1x.tech https://twitter.com/1x__tech Was there anything we missed? (For Business Enquiries) contact@theaigrid.com #LLM #Largelanguagemodel #chatgpt #AI #ArtificialIntelligence #MachineLearning #DeepLearning #NeuralNetworks #Robotics #DataScience #IntelligentSystems #Automation #TechInnovation

Contents (3 segments)

Segment 1 (00:00 - 05:00)

Note that the title isn't clickbait: OpenAI's first physical robot is coming to you very soon. If you take a look at this, you can see that Norway's 1X raised $23.5 million in an OpenAI-led round. Essentially, OpenAI has just led an investment round for a robotics company that previously focused on security and is now building a custom robot aimed at integrating AI technology into real physical hardware. In the top right-hand corner you can see that in summer 2023, which is only around three to four months away, we're going to get the first demo/reveal of this physical robot with AI-related features. This is absolutely insane news, because I thought we were years away from this. The web page also says "AI embodiment," and additionally: "Our newest android iteration, NEO, will explore how artificial intelligence can take form in a humanlike body."

Looking further at the web page, four companies are listed as partners with this robotics company: ADT Commercial, which is a security company; Tiger Global, which is an investment firm; NVIDIA, which of course everybody knows makes the GPUs that run the software; and last but not least OpenAI, the only company on the partner list that makes large language models. So when OpenAI does come to choose a robot to deploy its large language model into, it's clear they have chosen this company. I might be jumping the gun by saying this, but I'm pretty sure this is exactly what OpenAI is going to do when they want to deploy ChatGPT, or GPT-4 or GPT-5, into physical robots that can interact with the real world. I have to be honest: I didn't expect the AI landscape to develop this quickly, from large language models to physical robots just this summer. That is definitely pretty fast.

You can see the statement right here: "1X is thrilled to have OpenAI lead this round, because we're aligned in our missions: thoughtfully integrating emerging technology into people's daily lives. With the support of our investors, we will continue to make significant strides in the field of robotics and augment the global labor market." That's what the CEO, the founder of the company, said. Later in this video I'm going to show you what their previous robots were like, and to be honest they were pretty good. They also talked about why they need to deploy these androids in the real world: "Deploying our wheeled android EVE at an unprecedented commercial scale gives us a unique understanding of the challenges and opportunities the robotics community has yet to address. If androids are going to work in our world, they need to experience our world." And we already know that with GPT-4 and its multimodal capabilities, this is something that can definitely happen and can definitely power these robots.

Now let's look at what these robots could actually look like, based on some of the previous models the company created and some of the emerging technology we've seen from the top tech companies working on AI robots. On their YouTube channel you can see robots that were very good at what they do being released two, three, even five years ago. These robots were moving very effectively in many different modes, helping people in many different scenarios, and performing their daily functions with a very decent amount of accuracy. That's actually pretty shocking, because even for someone who hadn't previously looked into the AI space that much, it shows what these robots are truly capable of.

Now remember, these robots are powered by a different system that is unknown to us right now, but I'm guessing that since ChatGPT is so much more powerful, these robots, if they are powered by ChatGPT, could be some of the best robots on the planet. I mean, imagine this robot right here, which has the ability to precisely move and interact with things in the physical world, combined with the multimodal capabilities of GPT-4, where it can literally look at an image and understand exactly what's going on. It's going to be really interesting to see exactly what these robots can do. What's also going to be interesting is when this release date is actually going to be. We know the landing page says they're going to debut this in summer 2023. I think it will just be a live stage demo, where the robot is on stage and they show it off, like in many similar situations before, but it will be really interesting to see whether it's powered by GPT-3, GPT-3.5 or GPT-4. I really couldn't see any other company powering these robots, especially since OpenAI is now looking to deploy that multimodal capability as the AI race heats up. And this is why, guys, you have to understand that other companies are in this race too. I mean, let's take a look at Tesla's robot: this is the Tesla Bot, which was recently announced

Segment 2 (05:00 - 10:00)

at Tesla's AI Day. This is a robot that is definitely functional, and apparently it is powered by AI too, so this is some steep competition between the likes of Tesla, OpenAI, and also Google, which we're going to talk about later, because they also have a multimodal AI powered by their largest language model to date, and it does things we honestly didn't think we'd see for many years. What's also very interesting are the uncanny resemblances between these robots. Take a look at this model: it looks very similar, and it's another walking robot that is also powered by AI. This robot runs on NVIDIA's Jetson Xavier platform, with multiple different sensors and GPS modules for interacting with the environment, and it's really interesting because it shows us exactly what we could expect when OpenAI finally reveals this robot with the company called 1X. Now, I'm not sure they're going to make summer; there might be some delays, based on the reports we're hearing, with so many people calling for a slowdown of AI. But it would definitely be interesting to see GPT-5 in its first physical robot. It looks so similar to the 1X robot and the Tesla Bot; they all pretty much share the same head shape, and it's definitely interesting to see how these companies are going to out-compete one another. It seems like this AI race is not slowing down despite the recent report and despite the recent AI petition.

Now, what's interesting is that I honestly thought OpenAI would take the same route that Google did. You can see right here a Google robot interacting with the environment. If you're wondering what it's powered by: Google's unreleased PaLM-E large language model, which interacts with the environment in a very interesting way. It actually understands exactly what's going on, can do multiple different tasks, and can respond even if the given task is interrupted by another person, which would actually happen in real life, so this is a really smart language model. What was really cool about it is that it could even do visual tasks it had never been trained on before, and still perform them with a very high degree of accuracy. Now, the task it was doing in the video was simply getting the person a bag of chips, but know that these robots are capable of much more, because right here you can see that PaLM-E is able to perform many different tasks.

Now, like I said, this is what I thought OpenAI was actually going to do, but they've shocked me this time, because they are not going down Google's route. They don't just want these kinds of one-armed robots that perform specific functions; it seems as if they really want to deploy real robots in the real world that can interact with real people, and it seems they want it powered by GPT-4 or GPT-5, considering OpenAI is the only company in the partnership area that has a large language model. So if this does release in summer 2023, it could be one of the biggest game changers in the entire AI industry, because it will combine not only the world's most powerful language model but arguably one of the fastest, most agile robots we've ever seen. Now, you've seen what the robots looked like a couple of years ago; just imagine the kind of technology they could be working on now, especially with backing from OpenAI, NVIDIA and Tiger Global. This is going to be absolutely huge. Remember, those older clips show robots doing things without ChatGPT's multimodal input, so while you're watching them, understand that they were most likely powered by a less powerful model, and if we have something more powerful, it's going to be interesting to see exactly how precise and accurate it is. Now, there could be some fails, but I highly doubt it, given OpenAI's track record of just absolutely destroying the competition in recent months. This is going to truly shake up the industry, because not only will they have the smartest robot, they might have the most agile one on the market too. So it seems like a market leader is emerging and being placed at the top, and it seems OpenAI might just become one of the most powerful companies in the next couple of years.

Now, did you expect AI to move this quickly, this fast? I know I certainly didn't expect it to transform this quickly. I mean, people were scared when they saw Boston Dynamics perform many different crazy tasks, but those robots weren't powered by such a powerful large language model; Boston Dynamics had to train on each specific task many different times, and many times it would consistently fail, whereas ChatGPT and GPT-4 actually excel far more consistently. So it's going to be interesting to see in summer 2023 exactly how they demo this and what it's going to be powered by. Maybe it won't be powered by ChatGPT, maybe it will be a different version, but I just can't believe they wouldn't use the multimodal capability they've been developing for so long on anything else

Segment 3 (10:00 - 10:43)

other than this robot. There is one thing I forgot to add: on their Twitter account, which doesn't even have many followers (only around 2,000) and barely gets any retweets, they actually retweeted this tweet right here, where someone said: "Lots of great hype about the large multimodal models right now, but when do we get to trillions of tokens for embodied actions?" This basically just means: when do we get actions that are embodied in the real physical world? And they retweeted this talking about multimodal models, which is of course referencing ChatGPT's new multimodal model, so this is definitely a hint towards what we could be seeing in the summer of 2023, which was teased by them on their official webpage.
