# Boston Dynamics New Autonomous Update Is Incredible (Boston Dynamics Atlas 2024)

## Metadata

- **Channel:** TheAIGRID
- **YouTube:** https://www.youtube.com/watch?v=VRTyLzTtcqQ
- **Date:** 31.10.2024
- **Duration:** 15:27
- **Views:** 32,701
- **Source:** https://ekstraktznaniy.ru/video/13878

## Description

Prepare for AGI with me - https://www.skool.com/postagiprepardness 
🐤 Follow Me on Twitter https://twitter.com/TheAiGrid
🌐 Check out my website - https://theaigrid.com/


Links From Today's Video:
https://www.youtube.com/watch?v=F_7IPm7f1vI&t=1s&pp=ygUPYm9zdG9uIGR5YW5taWNz

Welcome to my channel, where I bring you the latest breakthroughs in AI. From deep learning to robotics, I cover it all. My videos offer valuable insights and perspectives that will expand your knowledge and understanding of this rapidly evolving field. Be sure to subscribe and stay updated on my latest videos.

LEMMiNO - Music
https://www.youtube.com/watch?v=b0q5PR1xpA0
https://www.youtube.com/watch?v=xdwWCl_5x2s 
https://www.youtube.com/watch?v=rlaG7gF7qeI 
CC BY-SA 4.0

Was there anything I missed?

(For Business Enquiries)  contact@theaigrid.com

#LLM #Largelanguagemodel #chatgpt
#AI
#ArtificialIntelligence
#MachineLearning
#DeepLearning
#NeuralNetworks
#Robotics
#DataScience

## Transcript

### Intro []

So today Boston Dynamics finally unveiled their Atlas robot, but this time the unveiling showed the robot doing actual manipulation tasks. The video is from the Boston Dynamics YouTube channel, and in it you can see their humanoid robot performing a range of different tasks related to loading and unloading bins. I think this is one of the most impressive demos we've seen recently, because this isn't a polished or extremely curated demo; this is a robot working in the kind of environment you would expect a standard employee to work in, across many different companies and settings. Now, the craziest thing about all of this is the statement that Atlas is autonomously moving engine covers between supply containers and a mobile sequencing dolly, and the autonomy is the feature I think most people are underrating.

### Autonomous [1:00]

Atlas performs the task without the need for human intervention or remote control, and the entire clip is 2 minutes and 30 seconds long. It's doing this completely independently, using pre-programmed logic and real-time sensor inputs to decide its actions, and the autonomy here highlights Atlas's ability to carry out repetitive and potentially dangerous tasks safely and efficiently on its own. As for the engine covers, the components Atlas is tasked with moving, these are often heavy and bulky.
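The autonomy described here, pre-programmed logic driven by real-time sensor inputs, is at its core a sense-decide-act loop. Below is a minimal sketch of that pattern; the readings, thresholds, and action names are invented for illustration and have nothing to do with Boston Dynamics' actual software:

```python
# Minimal sense-decide-act loop: hypothetical illustration of
# "pre-programmed logic + real-time sensor inputs" autonomy.

def decide(sensor_reading: float) -> str:
    """Pre-programmed logic: map a proximity reading to an action."""
    if sensor_reading > 0.8:
        return "stop"        # e.g. obstacle too close
    elif sensor_reading > 0.3:
        return "slow_down"
    return "proceed"

def control_loop(readings):
    """Each cycle: sense, decide, act, with no human in the loop."""
    actions = []
    for reading in readings:  # stand-in for a real-time sensor stream
        actions.append(decide(reading))
    return actions

print(control_loop([0.1, 0.5, 0.9]))  # ['proceed', 'slow_down', 'stop']
```

The point of the sketch is only that every action is derived from current sensor data by fixed logic, rather than being issued by an operator.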

### Components [1:40]

They are also awkwardly shaped, which makes them challenging to handle, especially with precision, and this highlights Atlas's capability to manipulate large, unwieldy objects with care, which is a critical skill in industrial environments. The supply containers are essentially just the storage units that likely hold the engine covers, while the mobile sequencing dolly is a movable cart or platform that helps organize or position these engine covers for the next step in a production process. Atlas's role is to move the covers between these two points, indicating its involvement in an assembly or manufacturing line.

I think that when you really start to understand exactly what this demo showed us, you understand that the future of robotics is at Boston Dynamics. The description states that the robot receives as input a list of bin locations to move parts between, so Atlas is essentially provided with a list containing the locations of all the bins where the parts are stored and where they need to be moved. The list is mapped out in advance and is used by the robot to perform its tasks accurately, and by having predefined locations, Atlas is able to navigate its workspace efficiently without needing new instructions for every single action. We can see these points on screen right now. Atlas's task involves moving these items between the bin locations in a systematic and repetitive way, which indicates that Atlas can handle sequence-based tasks. These are common in logistics and production lines, where moving parts to and from specific locations is crucial for a streamlined process, which is why many people on Twitter have been joking about it.
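That "list of bin locations" input can be pictured as a pre-mapped task queue the robot works through move by move. Here is a hypothetical sketch of that idea; all names and coordinates are invented, not taken from the demo:

```python
# Hypothetical sketch: a pre-mapped list of bin locations drives a
# sequence of pick-and-place moves, with no new instructions per action.

# Bin locations mapped out in advance (x, y in metres; invented values).
BINS = {
    "supply_container_1": (0.0, 1.5),
    "supply_container_2": (0.0, 2.5),
    "sequencing_dolly_a": (3.0, 1.0),
    "sequencing_dolly_b": (3.0, 2.0),
}

# The task input: move a part from one named bin to another.
TASKS = [
    ("supply_container_1", "sequencing_dolly_a"),
    ("supply_container_2", "sequencing_dolly_b"),
]

def run_tasks(bins, tasks):
    """Work through the queue systematically, one move at a time."""
    log = []
    for src, dst in tasks:
        log.append(f"pick at {bins[src]} -> place at {bins[dst]}")
    return log

for step in run_tasks(BINS, TASKS):
    print(step)
```

The predefined map is what lets the whole sequence run without per-action instructions, which matches the "mapped out in advance" phrasing in the description.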

### Machine Learning [3:40]

The running joke is that this product is going to replace a lot of Amazon warehouse workers, but if you've ever paid attention, you know they already have a significant number of robots in their workforce.

Atlas actually uses a machine learning vision model to detect and localize the environment fixtures and individual bins, which is what we can see on screen as the red dots. This model has been trained on a large dataset of images to recognize and understand the different components of the environment visually, and it gives Atlas the ability to differentiate between objects such as bins, containers, and other environmental elements. "Detect" essentially means identifying objects like bins or containers within its environment, and "localize", as they put it in the description, means determining their exact positions in three-dimensional space. This is really critical because Atlas must not only know what the objects are but also understand where they are in relation to itself, so it can move toward them and interact with them effectively. The environment fixtures and individual bins are essentially structural elements like racks or support frames that Atlas must navigate around.
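"Detect and localize" splits naturally into two pieces of information per object: a class label (what it is) and a position in the robot's frame (where it is). A hypothetical sketch, with hard-coded stand-ins for real vision-model output:

```python
# Hypothetical sketch of "detect and localize": each detection carries
# a label (detection) and a 3D position relative to the robot
# (localization). These tuples are invented stand-ins for model output.
import math

# Pretend vision-model output: (label, x, y, z) in the robot's frame.
detections = [
    ("bin", 1.2, 0.4, 0.0),
    ("engine_cover", 0.8, -0.2, 0.3),
    ("rack", 2.5, 1.0, 0.0),
]

def nearest(dets, label):
    """Pick the closest detected object of a given class."""
    candidates = [d for d in dets if d[0] == label]
    return min(candidates,
               key=lambda d: math.sqrt(d[1]**2 + d[2]**2 + d[3]**2))

print(nearest(detections, "bin"))  # ('bin', 1.2, 0.4, 0.0)
```

Knowing the position relative to itself is what lets the robot plan a reach or a step toward the object, which is why localization matters as much as detection.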

### grasping policy [5:00]

Atlas must accurately identify and locate these elements to interact with them properly during its task. One of the things they also state is that the robot uses a specialized grasping policy and continuously estimates the state of manipulated objects to achieve the task. The specialized grasping policy is essentially a specific set of rules or approaches Atlas has for picking up different types of objects. The policy is designed around the unique attributes of the item being handled; it might include factors like the optimal grip strength, the best angle for approaching the object, and how to adapt to irregularities, which we see later on. This helps ensure that Atlas can pick up and carry objects securely without damaging or dropping them.

After Atlas picks up an object, it doesn't just assume everything is fine. Instead, it constantly monitors the object while it is being moved, checking whether the object is secure, whether it is starting to slip, and whether it has become misaligned. This continuous estimation is essential for ensuring that the robot can adapt to unexpected conditions and successfully complete the task, which is exactly what we saw when it recovered from a mistake. The manipulated objects here are, of course, the engine covers, which Atlas handles in an incredible way, and Atlas must always be aware of the state of the objects it is handling so it can make on-the-fly adjustments if it senses that something isn't right. Of course, there are no prescribed or teleoperated movements; all of these motions are generated autonomously online.
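The two ideas in this section, a grasping policy keyed to object attributes and continuous monitoring of the grasp, can be sketched separately. Everything below is invented for illustration (thresholds, attributes, readings); it is a toy, not the robot's actual policy:

```python
# Hypothetical sketch: (1) a grasping policy chooses grip parameters
# from object attributes; (2) a monitor re-checks the grasp in flight.

def grasp_policy(weight_kg: float, fragile: bool) -> dict:
    """Pick grip strength and approach angle from the object's attributes."""
    strength = round(min(0.3 + 0.1 * weight_kg, 1.0), 2)  # heavier -> firmer
    if fragile:
        strength = min(strength, 0.5)   # cap the force on fragile parts
    return {"strength": strength, "approach": "top" if fragile else "side"}

def monitor(slip_readings, threshold=0.2):
    """Continuous state estimation: flag the first sign of slipping."""
    for t, slip in enumerate(slip_readings):
        if slip > threshold:
            return f"adjust grip at step {t}"
    return "grasp stable"

print(grasp_policy(4.0, fragile=False))  # {'strength': 0.7, 'approach': 'side'}
print(monitor([0.05, 0.1, 0.3]))         # adjust grip at step 2
```

The monitor is the part that corresponds to "continuously estimates the state of manipulated objects": the grasp is never assumed to be fine after pickup; it is re-verified every cycle.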

### no prescribed movements [7:00]

When they say no prescribed movements, this basically means that, unlike robots that follow pre-recorded demos or strictly pre-programmed movement paths, Atlas does not use fixed movement sequences. Instead, its movements are dynamically determined based on the task and the environmental conditions it encounters in real time. "No teleoperation" refers to the absence of a human controlling the robot from a remote location; in Atlas's case there is no human operator actively controlling its actions, so it is acting independently and making decisions as needed. "Generated autonomously online" means that these actions are produced in real time rather than relying on pre-planned sequences: Atlas continuously analyzes the current state of its environment and uses that information to generate the appropriate movement on the fly. This real-time adaptation allows Atlas to handle dynamic and unpredictable changes in its surroundings.

I think one of the really important things we saw in this video was the robot's ability to detect and react to changes such as action failures, for example failing to insert the cover, tripping, or colliding with the environment. Atlas is equipped with the ability to detect unexpected changes in its environment and respond accordingly. This is really important because many of the factory workplaces where these robots will operate are pretty dynamic: you've got tools, containers, or even other workers that could move unexpectedly. As for moving fixtures, if one of the containers we see here shifts position, Atlas is able to detect this movement and adjust its approach to match the new position. This adaptability ensures that Atlas can continue working effectively even if the workspace isn't perfectly stable. And with the failure to insert the cover that we see here, it's really incredible to watch the combination of vision, force, and proprioceptive sensors allow Atlas to fix it quickly. The vision sensors are the cameras or other imaging devices that help Atlas see its environment, detect objects, and determine their positions. The force sensors measure the physical forces Atlas encounters during its tasks, such as how much pressure it applies while gripping an object, or whether there is unexpected resistance while moving. And the proprioceptive sensors help Atlas understand its own body state.
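The recovery behavior described above, detecting a failed insertion and retrying with an adjusted approach instead of halting, can be sketched as a simple retry loop. The success condition and the re-alignment step below are invented placeholders for what the real sensors and controllers would do:

```python
# Hypothetical sketch of failure detection and recovery: an insertion
# attempt is checked (as if by vision + force feedback), and the robot
# re-aligns and retries rather than stopping. All numbers are invented.

def try_insert(offset_mm: float) -> bool:
    """Pretend the insert succeeds only if alignment is within 2 mm."""
    return abs(offset_mm) <= 2.0

def insert_with_recovery(initial_offset_mm: float, max_attempts: int = 3):
    """Detect a failed insert, re-align, and retry up to a limit."""
    offset = initial_offset_mm
    for attempt in range(1, max_attempts + 1):
        if try_insert(offset):
            return f"inserted on attempt {attempt}"
        offset /= 2.0   # stand-in for a sensor-guided re-alignment
    return "failed: needs human attention"

print(insert_with_recovery(6.0))  # inserted on attempt 3
```

The key design point is that failure is an expected, detectable outcome with a programmed response, which is exactly what makes the dropped-cover moment in the demo notable.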

### versatility [9:40]

That body state includes things like the angles of its joints, the positions of its limbs, and its balance; this is basically similar to how humans have a sense of their own body parts without looking at them.

One of the key things I actually missed in this demo was something rather subtle, but it shows that Atlas is a versatile robot capable of many different dynamic tasks and use cases. One of the things pointed out by The Humanoid Hub was that, if we look at this in slow motion, we can see that Atlas's three-finger hand/gripper doubles as an opposable thumb. This allows Atlas to use its hand in multiple different ways: we can see it grip this part, flip it over, and then pull this out, like some really advanced tongs or tweezers. In robotics, this means it's a lot more adaptable to many different scenarios; it's like having an arm you could reconfigure for many different dynamic situations. I think this kind of thing did get overlooked. It was something I didn't pick up the first time watching, but when I saw it in slow-mo via The Humanoid Hub, which is a great Twitter account for all things humanoid robots, my mind was blown once again. This is where they're potentially using different areas of dynamic control to let these robots grasp different things and adapt to different scenarios.

Another subtle thing I noticed is that there have been two demos of the Atlas robot so far. In one of the earlier iterations, the design and presentation of the prior Atlas robot appeared rather polished, while the demo you can see on the right looks a lot more rough and rugged, which potentially indicates that the version of Atlas we saw today has been doing a lot more demonstrations. The left side basically shows an Atlas in what seems to be a controlled indoor environment, possibly a display area where it hasn't been exposed to the rigors of real-world operation, which makes it easier to keep the robot looking pristine. The robot on the right-hand side looks a lot less polished, but of course it's handling heavy objects and encountering a lot of dust, dirt, and probably a lot of wear and tear from these repetitive tasks. Overall, though, these do seem to be very similar iterations.

I also want to show you guys something else. You remember I talked about how this robot is able to switch its hands to do many different things. One thing I recently saw that was absolutely outstanding is that, in this short demo, Atlas is equipped with different hands. The hands we just talked about enable it to grip certain things, but in this specific demo the hands are quite different: there's a rubber half-ball shape, and that is what allows it to perform these push-ups so well. So essentially what we are seeing here is that the Atlas robot can have dynamic body-part changes that allow it to interact with certain environments and be more flexible and more useful. I think this is quite underrated, because a lot of robots don't have that modularity where you can swap parts in and out, but Atlas does, which means you can simply swap out the hands and then set it off to do different things. I think this feature is remarkably underrated, as most people didn't see that video, but it was one they showcased when demonstrating the flexibility of this robot, and that robot even looks a little more polished than the one working long hours in the factory.

Lastly, I do want to mention the robot's movement.

### movement [13:45]

The movement is one of the most uncanny I've ever seen. One of the things I've seen people continually complain about is that these robots are remarkably too human; it's almost as if we're anthropomorphizing these robots and making them less efficient as a result. What if we just made these robots entirely efficient for the task? How would they look and perform? I think we're starting to see that with the Atlas platform. Oftentimes, when this robot is moving to and from these different locations, we can see it rotate its torso a full 180° and move its legs in ways that humans simply wouldn't be able to. I think this does give us an unnerving, Terminator-robots feeling, but it just makes the robot a lot more effective. Like right there, you can see it turned its head on a swivel, then turned its body around and was able to immediately get into the right position. It looks unnatural, but you have to remember that this is not a human; it's a complete artificial intelligence robot that doesn't have anything to do with us at all. So while it might look a little weird, it's completely natural and normal, and I think we need to expect that when we're looking at future robot releases. Like right here, when it manages to do a complete 360 with its head, then move its legs around and start walking, it's something that will never get old. So if you did find this interesting, let me know what you thought about the video, and I'll see you guys in the next video.
