# The Latest Humanoid Robotics Breakthroughs You Need to See

## Metadata

- **Channel:** TheAIGRID
- **YouTube:** https://www.youtube.com/watch?v=RHYYC97ir5w
- **Date:** 07.12.2025
- **Duration:** 42:48
- **Views:** 25,301
- **Source:** https://ekstraktznaniy.ru/video/12597

## Description

Check out my newsletter: https://aigrid.beehiiv.com/subscribe
🐤 Follow Me on Twitter https://twitter.com/TheAiGrid
🌐 Learn AI With Me : https://www.skool.com/postagiprepardness/about

Links From Today's Video:
https://x.com/tonyzzhao/status/1991204839578300813 
https://x.com/sourccey/status/1990903761187828199 
https://x.com/tangiblerobots/status/1990467217452843022 (eggie)
https://x.com/svlevine/status/1990574912407539756 (physical intelligence)
https://x.com/Realnitesh945/status/1996180277979472036 (hanzou robotics ai powered traffic cop)
https://x.com/Robo_Tuo/status/1995314021281771644 (sanitation robots)
https://x.com/XPengMotors/status/1987837648958828994 (xpeng)
https://x.com/XPengMotors/status/1988517894217449572 (xpeng )
https://x.com/1x_tech/status/1983233494575952138 (1x neo)
https://x.com/TheHumanoidHub/status/1989364406850044284 (mindon - unitree)


Welcome to my channel, where I bring you the latest breakthroughs in AI. From deep learning to robotics, I cover it all.

## Transcript

### Segment 1 (00:00 - 05:00)

So, there have been a ton of different robotics breakthroughs and product releases in the last month. So, let's take a look at some of the most important ones, and some that you probably missed. Starting with Sunday Robotics. Sunday Robotics is a US-based startup building a general-purpose home robot called Memo. This robot focuses on real domestic chores like cleaning tables, loading dishwashers, folding laundry, rather than humanoid locomotion. Now, the company describes itself as a helpful robotics company aiming to give people back their time by automating repetitive household tasks. Led by Tony Zhao, it has attracted a huge amount of venture funding from firms including Conviction and Benchmark. Now, I think one of the most interesting things about this robot, and remember this is at five times speed, so whilst yes, this is impressive, the demo is sped up. And as I was saying, one of the most impressive things is that this robot completely ditches bipedal motion. It opts for a wheeled base, which in many home environments is more than adequate. And I guess they're opting to focus their time and attention on the torso, which is where you could say the value is provided. I don't think most humans would care whether their humanoid can walk up or down stairs, considering that most places don't have stairs. But if you can afford this robot, I'm guessing you probably will have stairs, considering the house might be a bit more expensive. I know that is quite the speculation, but I think this goes to show that different approaches to the same problem from different companies can lead to different outcomes. Now, this tall wheeled domestic robot has a really cool vertical lift that you can see. So, it's able to go all the way up and all the way down. It's able to shift that base vertically and move forwards and backwards.
This is really, really effective for navigating the whole home. You can see that they've got their ACT-1 model, which has zero-shot generalization, meaning ideally it can just look at a task and then complete it. Now, what's really interesting is that this robot can apparently perform long-horizon tasks such as clearing a messy dinner table, sorting items, loading a dishwasher, handling fragile glassware, and folding piles of socks, all with smooth autonomous motions. Now, the training approach, which is the glove behind ACT-1: Sunday actually avoids traditional teleoperation and instead ships a skill-capture glove to "memory developers", users who perform the chores in their own homes while the system records the motion and force data. This data then feeds the Sunday ACT-1 robot foundation model, which they claim is trained on zero robot teleop data and can generalize to new homes and layouts for complex multi-step tasks. Now, the current status and availability of these robots, in case you want one: Sunday reports having shipped over 2,000 gloves and collecting data from around 500 homes to expand Memo's skill library. The company has announced an invite-only beta program targeting deployment of Memo into about 50 households in 2026, with broader consumer availability likely following that initial trial phase. Now, how does this compare to the landscape? Well, unlike bipedal efforts like Tesla's Optimus or Figure's humanoid, Sunday prioritizes a wheeled semi-humanoid form factor tailored to kitchens and living spaces. They essentially trade stair-climbing ability for stability and runtime. Next, we have Sourccey. Now, Sourccey is an open-source home robot, a personal home robot designed to assist with household chores while serving as an educational tool for learning robotics and AI.
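The glove-based pipeline described for Sunday above is, at its core, imitation learning: record (observation, action) pairs from human demonstrations, then fit a policy to reproduce them, with no robot teleoperation involved. Here is a minimal, purely illustrative sketch; the data shapes, the pooling across homes, and the linear least-squares "policy" are assumptions for clarity, not Sunday's actual system, which uses a large neural foundation model.

```python
import numpy as np

rng = np.random.default_rng(0)

OBS_DIM, ACT_DIM = 6, 3
# Stand-in for the "true" human skill that the glove captures.
TRUE_W = rng.normal(size=(OBS_DIM, ACT_DIM))

def record_demo(n_steps=100):
    """One glove-recorded chore demonstration: observations -> actions."""
    obs = rng.normal(size=(n_steps, OBS_DIM))
    # Recorded human actions = skill applied to observations, plus sensor noise.
    acts = obs @ TRUE_W + rng.normal(scale=0.01, size=(n_steps, ACT_DIM))
    return obs, acts

# Pool demonstrations collected across many homes ("memory developers").
demos = [record_demo() for _ in range(5)]
X = np.vstack([obs for obs, _ in demos])
Y = np.vstack([acts for _, acts in demos])

# Behavior cloning reduced to its simplest form: a least-squares fit.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def policy(observation):
    """Imitated policy: maps an observation to an action."""
    return observation @ W
```

The point of the sketch is the data flow, not the model class: demonstrations in, policy out, and never a human driving the robot directly.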
Now, this was launched as a customizable, teachable companion which allows its users to teach it tasks through demonstration, adapting to specific home environments over time. It's built to be approachable and friendly, bringing a sense of joy to daily routines rather than just functionality. Now, this robot features some key things. It's got trainable chores, so users can actually demonstrate tasks like cleaning, organizing, or other household activities, and Sourccey uses AI to learn and improve over more and more training sessions, leveraging compatibility with the LeRobot framework, an open-source

### Segment 2 (05:00 - 10:00)

robotics library. Now, what's cool about this robot, unlike the previous one, is that it's got a completely open-source design. And this means you have full access to the source code, the APIs, and documentation. This is all provided via GitHub, enabling customization, extensions, and community-driven improvements. Now, this is ideal for beginners in robotics, programming, and AI, as it doubles as a hands-on learning platform without requiring advanced technical skills. Now, what's really cool is that it's actually available via the official Sourccey site. And this robot is actually standing out in the growing field of domestic AI assistants like Neo or Memo by emphasizing open-source accessibility and learning by doing, making it particularly appealing for hobbyists, educators, and tech enthusiasts. So, of course, if you're interested in building on or modifying it, the GitHub repo is a great starting place. Now, I think this is going to be super interesting for those of you who want to focus on your own personal style of robotics, because there are many different humanoid robots out there, but one of the key barriers to entry is very high cost. The Unitree G1, I believe, costs around $16,000 to $20,000, which is simply not feasible for the average person. And I do believe that the Neo robot costs, once again, around $20,000, or $500 a month, for a home robot. So overall, with this open-source framework, I wasn't entirely sure how much this robot costs, and in fact it literally just came up: it starts at around $1,500. It's a lot cheaper. It may not be the most aesthetic, but I guess you could say this is more like a developer platform where you can use it to learn quite a lot about robotics and use it in your home, provided you do the teaching via demonstrations. Now, there was also this, which I found super cool: the sanitation robot competition in Shenzhen.
This is basically a real-world robot contest where dozens of companies deploy autonomous cleaning robots on actual city streets and public spaces to demonstrate and benchmark their capabilities for urban sanitation work. It's essentially part of a tech showcase aimed at accelerating the rollout of AI sanitation fleets for Chinese cities. So, the competition is formally known as the 2025 Shenzhen International AI Sanitation Robot Competition, with related exhibition segments. It is organized by the Shenzhen Municipal Government and the Lyang District together with national media and industry associations as a flagship AI and sanitation initiative. Now, reports describe more than 40 sanitation robot enterprises participating, with over 200 autonomous robots from 40-plus companies in some segments competing or demoing in over 100 individual sessions. Urban management departments from over 200 cities and more than 500 enterprises were invited as observers and potential buyers. Now, robots are evaluated in real-scenario tracks such as parks, squares, sidewalks, cycle lanes, and service/auxiliary roads rather than in pristine lab environments. And the tasks include sweeping, vacuuming, spraying, trash pickup, bin handling, climbing, and navigating complex terrain while avoiding pedestrians and traffic. Now, a big emphasis is on fleet-level autonomy. Many robots run fully autonomous navigation on open sidewalks and roads, coordinated over 5G and IoT infrastructure. Some segments highlight integration with OpenHarmony, robot operating systems, and smart city systems to show end-to-end urban cleaning workflows, meaning complete autonomy of cleaning. Now, this competition is explicitly tied to procurement and standard setting, with draft national technical requirements for AI cleaning robots presented, and on-site matchmaking for government orders.
Shenzhen's Yongyang district has committed to large volumes of future robot procurement and pitches this as part of building a 100-billion-yuan-plus intelligent sanitation market and a full robotic supply chain. Now, if we're talking about China, we can talk about the Chinese startup Mindon, which actually trained a Unitree G1 to do house chores. Interestingly, there was no speed-up and no teleoperation. This is a small new robotics company coming out of Shenzhen, and there wasn't any information prior to this. They just

### Segment 3 (10:00 - 15:00)

came out of the woodwork and said, "Look, we're going to drop this robot demo." Now, this one shocked everyone, because the robot moves fast, fluidly, and naturally, and it's way more lively than what we normally see from big companies like Figure or even Tesla. Look at it. It's using its knees to get on top of the bed and iron out some creases. Incredible. Now, why is this demo so impressive? Because the robot opens blinds quickly. Fast, contact-rich motion. The curtains are soft and hard to simulate, so that smoothness, when you think about it, is rather impressive. It's able to water plants while carrying a heavy watering can, balancing while holding weight, and stepping onto a platform to reach even more plants on uneven terrain. This is a hard robotics problem. I don't think people understand just how hard that problem is for robots. These actions are very humanlike. I mean, carrying a gift box to kids. This shows good whole-body control even whilst holding an object, and it's similar to recent research papers such as ResMimic. And once again, crawling onto a bed and ironing it is such a weird task, but it's technically impressive. Climbing and crawling motions are usually hard. They require advanced retargeting techniques such as OmniRetarget. And cleaning a room while kids' toys are everywhere proves that this is not just entirely scripted. It means that the robot somehow is seeing objects and reacting. Now, it's incredible, because someone wrote a Twitter thread on how they probably did this. They said that the behaviors of this robot likely came from policies trained on human videos or mocap, not teleop, with modern imitation learning techniques such as ResMimic, HDMI, or OmniRetarget, and possibly UMI for grasping. And this is actually aligned with the latest big research breakthroughs in robotics.
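Retargeting, mentioned above, is essentially the problem of mapping recorded human motion onto a robot body with different limb proportions and joint limits. This toy sketch shows only the simplest kinematic part, per-joint scaling and clipping; the joint names, limits, and uniform scale are made-up values, and real systems like OmniRetarget solve a much harder whole-body optimization.

```python
import numpy as np

# Assumed per-joint limits for a hypothetical robot, in radians.
ROBOT_LIMITS = {
    "knee": (0.0, 2.3),
    "elbow": (0.0, 2.6),
}

def retarget_frame(human_angles, scale=0.9):
    """Map one mocap frame of human joint angles onto the robot.

    `scale` crudely accounts for limb-proportion differences; the clip
    keeps every command inside the robot's mechanical joint limits.
    """
    robot_angles = {}
    for joint, angle in human_angles.items():
        lo, hi = ROBOT_LIMITS[joint]
        robot_angles[joint] = float(np.clip(angle * scale, lo, hi))
    return robot_angles

# A human knee can hyperflex past what this robot allows, so it gets clipped.
frame = retarget_frame({"knee": 3.0, "elbow": 1.0})
```

A tracking policy is then trained (usually in simulation) to follow these retargeted trajectories, which is where techniques like ResMimic come in.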
And clearly Mindon has stitched all of these recent breakthroughs together and performed this demo. Now, some people were even speculating that this demo is fake, but everything in it is actually achievable with today's methods. The motions look exactly like the state-of-the-art academic demos, just executed much better. Now, this is a huge deal. This is a brand-new startup, founded in May 2025 by ex-Tencent researchers, and they're already showing fast, useful actions, and these aren't slow, staged demos, meaning that the race for embodied AI is wide open. If a relatively small company can suddenly leap forward, it means new players are emerging, and Mindon is one of those companies to watch. It'll certainly be very interesting to see where things go, because this is one of the most impressive recent demos I've seen of the Unitree G1 in action, doing multiple different tasks with humanlike motion. Completely smooth, fluid motion, with no speed-up and no teleoperation. Very impressive stuff. Even with teleoperation, some of these tasks, I believe, would be hard, so for this to be done already in this impressive demo is remarkable. Next, we had a UK-based firm, Humanoid, unveiling a legged version of their HMND 01, Alpha. This is a European company releasing their new humanoid, a UK-based startup unveiling Alpha. Now, this is a biped that mastered stable locomotion only 48 hours after assembly, something that normally takes weeks or months. And the shortcut, if you're wondering: Humanoid ran 19 months' worth of locomotion training in simulation using ultra-precise virtual replicas, and then transferred the policy to the physical robot. Now, Alpha can walk in curved paths, sidestep, hop, recover from pushes, run, and lift up to 15 kg.
The company is following a familiar road map: industry first, then services, and then moving into the home. Now, they've already tested early prototypes doing bin picking with a company called Schaeffler, and there are future deployments possibly planned for hospitality, healthcare, and elderly care. And by 2031, HMND, the company called Humanoid, wants Alpha-class robots doing routine household tasks and filling labor gaps in sectors where staffing is hard and the work is physically punishing. So, this is probably one of the first times I've seen a UK-based company, other than Google DeepMind, focusing purely on robotics.
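The "19 months of training in 48 hours" figure above is less mysterious once you work out the arithmetic of massively parallel simulation: many simulated robots gathering experience at once trade wall-clock time for aggregate experience. A quick back-of-the-envelope, where the environment count is an assumed, illustrative number rather than anything Humanoid has disclosed:

```python
# Total simulated walking experience the policy accumulates.
sim_experience_hours = 19 * 30 * 24   # ~19 months, approximating a month as 30 days
wall_clock_hours = 48                 # real time available for training

# Aggregate simulation throughput needed, as a multiple of real time.
required_speedup = sim_experience_hours / wall_clock_hours

# With parallel GPU simulation, the per-environment rate can even be
# slower than real time and still hit the target in aggregate.
num_envs = 1024                       # assumed number of parallel simulated robots
per_env_rt_factor = required_speedup / num_envs

print(f"aggregate speedup needed: ~{required_speedup:.0f}x real time")
print(f"per-environment rate: ~{per_env_rt_factor:.2f}x real time")
```

So roughly a 285x aggregate speedup suffices, which is well within reach of modern GPU-parallel physics simulators running thousands of environments.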

### Segment 4 (15:00 - 20:00)

Super interesting. Then we had Unitree deciding to show off their H2 fighting robots. Despite saying the H2 is not designed for combat, they showed robots doing exactly that: the G1 fighting a human, and then another H2. So, the H2 is the bigger one on the left, and the G1 is the one on the right with the broken arm. You can see here that the H2 is being pushed around by a human. It's kicking. It's trying to move. And I think two years ago, this would probably have been CGI. But the fact that we now accept this as normality is pretty interesting, because seeing these robots move in such a human-like way has stark implications for the future. What happens when these robots get incredibly good, even better than humans, at these fluid motions? Are we going to see robot fighting championships? Are we going to see Real Steel actually happen? If you know, you know. But it's going to be super interesting to see how robots navigate these truly dynamic environments in the future. Sometimes these robots are just kicking from pure instructions; there are controllers behind them. But what happens when you have a robot that can completely autonomously engage with its environment, move forwards and backwards, judge an opponent, attack, and box in a way that's really humanlike? I think those demos are going to be completely surprising. Now, if we're talking less about fighting robots and more about movement, then we have to talk about the Tesla bot, Optimus Generation 3, setting a new personal record at 8.5 mph. This demonstration doesn't seem that impressive if you haven't been paying attention to the area of robotics, because it just looks like a robot running across the screen, until you realize that it's not CGI. This is a full humanoid.
And I would argue it's one of the first times we're seeing a humanoid that looks incredibly realistic in full autonomy. Now, previously we'd seen demos of the Tesla bot doing different things at Elon's events, where they showcase the full road map of what Tesla has to offer. But most people didn't realize that those were fully teleoperated, meaning the demonstrations weren't as impressive as they seemed. But now we have something rather interesting. We're seeing the Gen 3 actually move autonomously across flat ground, and it looks exactly like a human jogging. I would argue it even looks more graceful than many humans jogging. This is really surprising, because in order to get a humanoid robot to jog that fluidly, you have to succeed on a remarkable number of individual components that all have to work together well. You have to ensure that the humanoid can support its weight as it flies through the air. Though it doesn't seem like it's flying, at certain points, maybe for 0.1 or 0.2 seconds, it is literally airborne, and it's able to do that with a stable gait. Now, this isn't to say the robot never crashed behind the scenes; I don't know how replicable this is, but it's pretty clear that we've crossed the threshold for robots to look incredibly human. And for those of you who don't believe me, what's so interesting is that when we got the Tesla Optimus Gen 3 video, we actually got another demonstration of the Figure 3 robot doing the exact same thing. And I would argue that this robot looks really, really impressive, considering this wasn't a scheduled release. It was literally just a response to the Tesla tweet. Someone called out Brett Adcock and said, "Look, this is your time to shine." And he then tweeted this.
So, it's quite likely that Figure is much further ahead than we think, because this robot demo wasn't even scheduled. It was only shown to us because people had the perception that Tesla had moved forward and stolen the limelight. But Figure was like, "Look, guys, we're still on the same pace. We're just moving and keeping our heads down." Now, I would argue that this one looks even more impressive. There's just something a little bit more fluid about it, the way it moves forward and then goes backwards, whereas the Tesla one just runs completely off screen. Here we can see the robot jog, stop, perform a 360-degree turn, and then do it again the exact same way, exactly like a human would. I mean, it's uncanny, with the subtleness of how the robot moves like a human. It really freaks me out, to the point where I almost believe there's some human teleop, or some human somehow, somewhere inside that machine. But of course, as you

### Segment 5 (20:00 - 25:00)

know, this is not the case. This is a fully autonomous robot. Now, I think once again I need to show you guys where the previous generation of models was. Take a look at Figure 2. This was the robot just a few months ago: Figure 2, the last generation from the exact same company. Now, what you have to understand here is that this in and of itself is still remarkably impressive. You've got a robot that is able to walk up and down uneven terrain which it wasn't trained on, meaning that the robot is likely analyzing and doing this completely autonomously. So whilst most people joke that the robot walks like, quote-unquote, Joe Biden, as many have memed, this is remarkably hard for robotics. This is uneven terrain, but it is, of course, in pristine lab conditions. Now, by no means am I dismissing the capabilities of these robots. I'm saying this is much more impressive than you might think considering what we had just two years ago, and if we look at things exponentially, you can only imagine how quickly robotics is moving. Remember, it was only, I think, six to eight months ago that we saw Boston Dynamics perform similar leaps in agility with their Spot robot, or, I honestly can't remember the name of that robot, but it was only when we saw that one that we realized what the true potential was. Six to eight months later, we have other humanoid robots catching up. I do wonder if Boston Dynamics is going to get leapfrogged, although I highly doubt that, as they are quite the innovators in the space. But this makes me extremely intrigued about what the future holds once these robots are deployed at scale and able to do a variety of different things. Now, we also had a new demonstration from MagicLab, and this is the MagicLab Z1.
And I mean, this is particularly incredible. I don't know about you, but I don't remember robots being able to flip like this, let alone with such agility and dexterity. So, we have the MagicLab Z1, often branded as the MagicBot Z1. This is a small, highly agile humanoid robot from a Chinese company called MagicLab, designed as a general-purpose mini biped for research, education, service, and companionship scenarios. Now, this is roughly 1.36 to 1.4 m tall and around 40 kg, and it's optimized for very dynamic movement, fall recovery, and expressive interaction rather than heavy industrial payloads. Now, the core concept here is that the Z1 is built as a compact bipedal humanoid with 24 core degrees of freedom, expandable to around 49 to 50 degrees of freedom using optional modules in addition to its hands. Its main pitch is agile and explosive motion in human spaces, such as fast walking, running, and acrobatic bending, combined with relatively low cost compared to full-size humanoid competitors. Now, the robot uses MagicLab's self-developed high-performance smart joint modules, with joint torque on key axes exceeding about 130 newton-meters, allowing it to withstand hard shoves, repeatedly fall and stand up, perform large-range poses, and jog at around 2.5 to 3 m/s. And it's able to step over obstacles about 15 cm high, targeting uneven floors, grass, gravel, and small steps. And it carries a 360-degree perception stack, including stereo depth cameras, 3D lidar, and fisheye or binocular vision, plus MagicLab's Atom positioning and navigation system for autonomous navigation in cluttered spaces.
Now, public materials mention use of embedded compute, for example Jetson-class modules, for AI navigation and interaction, but most of the marketing I found focuses on the joint tech and sensor fusion rather than specific CPU or GPU SKUs. And the base robot can actually be fitted with the MagicHand S01, an 11-degree-of-freedom five-finger dexterous hand that lets each hand grasp delicate objects or lift a few kilograms, around 3 to 5 kg give or take. And the Z1 supports voice interaction, gestures, and anthropomorphic emotional expression, making it suitable for roles like exhibition guides, demo platforms, or educational companions. Now, I couldn't do the video without this one: the XPeng Iron. This is the full-size humanoid robot developed by the Chinese EV and tech company XPeng, designed to operate in human environments such as factories, shops, offices, and public venues. And this is

### Segment 6 (25:00 - 30:00)

positioned as humanlike physical AI. And this is the platform that XPeng hopes to mass-produce around 2026 for commercial deployment. Now, they've built this as a general-purpose human-shaped worker that can walk, manipulate objects, and interact with people using onboard AI, rather than being a fixed industrial arm or a warehouse bot. XPeng is basically tying it to the same AI stack as its autonomous cars and robotaxis, so the perception, the planning, and the control share a common architecture. Now, the robot is roughly human-sized and incredibly human-looking, with dozens of joints across the body and highly articulated hands with around 22 degrees of freedom for fine manipulation. And, incredibly, it uses a bionic bone-muscle-skin structure, with a flexible spine and soft synthetic outer skin, to make the motion more humanlike and safer for contact with people. Now, it's powered by XPeng's in-house Turing chips. But I actually want to show you guys that flexible spine and soft synthetic outer skin I was talking about. Take a look at this. You can see right here, they're cutting open the robot's spine and skin just to show you the details. In fact, they didn't do this to show the details. They cut it open because individuals, upon seeing the XPeng demo, could not believe that what they were seeing was an actual robot. They believed they were seeing a human in a suit acting out the demo. In all fairness, we've done that in the past. In the Tesla demos in 2021 or 2022, it was literally a person in a suit. But now, we can see that this is not the case. From ideation to reality: with XPeng, we now have the complete road map of what exactly happened. We've gone from people in suits to robots that actually look like humans in suits.
And honestly, this is one of those moments where we start to realize we have crossed into the uncanny valley. Now, they're not the only company in the uncanny valley. Take a look at the EngineAI T800. This is a full-sized humanoid robot by a company called EngineAI, built for high agility, high torque, and quote-unquote combat capability, and also for industrial tasks. Now, this is roughly human-sized, it uses aviation-grade metal structures, and it's marketed as a powerful dynamic platform for logistics, service, and potentially security or competition scenarios rather than just a demo. Now, the T800 is designed as a general-purpose humanoid that can perform practical work and execute very dynamic motions like kicks, spins, and rapid movement, with a focus on strength, endurance, and multi-role deployment in real environments. Now, for the size, mechanics, and mobility: recent specs describe the T800 at about 1.7 to 1.75 m tall and around 60 to 75 kg depending on the configuration, with around 29 body degrees of freedom, plus dexterous hands bringing the total into around the low-40s degrees-of-freedom range. Now, joints can deliver up to 450 newton-meters of torque, enabling sprints of around 3 m/s and highly explosive motions, as you can see right here, such as the flying kicks, the punches, and the capoeira-style moves, backed by two to four hours of runtime from a modular solid-state battery. Now, the robot uses an embedded compute stack that, in higher-end variants, can include NVIDIA Jetson Thor-class modules, providing on the order of 2,000 TOPS for embodied AI workloads. Its perception suite typically combines 360-degree lidar with depth/RGB cameras to support navigation, obstacle avoidance, and high-speed motion in cluttered spaces.
Now, the T800 can also be equipped with EngineAI's own multi-sensor dexterous hands, usually quoted at around 7 degrees of freedom per hand, with tactile sensing and per-arm payloads in the 5 to 15 kg range, depending on the source. This lets it handle both core strength tasks and finer manipulation like sorting or precise placement. Now, EngineAI markets itself for industrial logistics work, for moving goods, for service and hospitality roles, and, as it says, combat. So, it's going to be interesting to see if a robot is going to be flying-kicking people in the future. Honestly, I really hope we're not moving towards a dystopian future. Maybe just a robot fighting competition. I really do hope that is the extent of things. But I

### Segment 7 (30:00 - 35:00)

got to say, it looks super cool, and these robotic demos just look more and more human day by day. I mean, it's one thing to see these robots on YouTube, but I haven't actually seen any of them in person, and I think maybe my mindset might shift once I do see these robots running around and doing a variety of human-level tasks. So it should be super interesting to see what that looks like in the near future. We also had the flagship humanoid LimX Oli, also referred to as the LimX CL-1 in earlier models, or Oli in some demos. This is a full-size modular humanoid with 31 active degrees of freedom, supporting dynamic stair climbing, heavy payload transport, and complex navigation in warehouses or homes. And we can see just how stable this robot is as it navigates some really rough terrain. I mean, it trips and recovers completely. And I can't say I always do this myself. I'm not saying I fall over, guys. I'm just saying that I've seen some very ungraceful falls in my time. So, these robots are becoming extremely proficient at how they walk and carry themselves, and we can see this evolving at 1x speed. It's important to note that the demo right there is deliberately noisy in terms of the environment, because that is the real world. You want to test the robot in unsexy environments so you can really see how well it performs in the most difficult areas, because a lab where everything's perfect and pristine and you've got the yellow box it's supposed to walk around, that isn't how the real world works. Sometimes there might be a slew of different materials lying around, as in this video, and it's important that the robot is able to navigate that environment successfully without additional help, and to do it completely autonomously. And that's what we see here. So, it's really impressive that we're seeing so many companies solve these issues that once plagued robotics.
And another humanoid robotics startup is Norway-based Physical Robotics. This company is finally out of stealth. It was founded by Phong Nguyen, a co-founder of Halodi Robotics, which has since rebranded as 1X Technologies, of course the company that created the Neo robot. Now, he was chief science officer at Halodi, now 1X Technologies, for 8 years, leading the development of the Eve robot. And he's started this new company, and their mission is to create a generation of robots that live in harmony with humans in the physical world, enhancing the quality of human life. And last week, the company announced the closing of a $4 million round. We've also got this, which is super interesting. So, Kyber Labs, often referred to in coverage as Kyber Lab Robotics, is a Brooklyn, New York-based robotics startup building AI-native manipulation platforms, with a particular focus on highly dexterous hands and artificial muscle actuators. Now, this company was founded in 2022 by a small team including a former SpaceX and Machina Labs engineer and an industrial designer from the electric motorcycle startup Tarform, and it positions itself squarely in the embodied AI manipulation space rather than classic humanoid walking robots. Now, they're basically building bimanual robotic manipulation platforms, two arms and hands designed from the ground up for AI-based control and reinforcement-style training. Now, their core hardware uses low-cost artificial muscle fiber actuators that mimic human muscle, giving the system backdrivable motion that can handle both delicate tasks like the threading or nut-turning you're seeing and high-impact events like being hit with a tool, without fragile gearboxes. Now, their focus is on robotic hands. Much of the public demos we've seen so far show high-speed, real-time control that feels contact through motor currents rather than relying on fingertip tactile sensors.
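Feeling contact through motor currents, as described above, boils down to comparing the current a motor model predicts for free-space motion against what is actually measured: if the motor is working noticeably harder than expected, something is pushing back. The constants and function names below are made-up illustrative values, not Kyber Labs' real system.

```python
# Assumed motor parameters for a hypothetical finger joint.
TORQUE_CONSTANT = 0.05   # N*m of torque per amp of current (assumed)
CURRENT_MARGIN = 0.5     # amps of slack before declaring contact (assumed)

def expected_current(commanded_torque):
    """Current the motor model predicts when moving in free space."""
    return commanded_torque / TORQUE_CONSTANT

def contact_detected(commanded_torque, measured_current):
    """True when measured current exceeds the free-space prediction by a margin."""
    return measured_current > expected_current(commanded_torque) + CURRENT_MARGIN

# 0.02 N*m commanded -> ~0.4 A expected in free space.
free_space = contact_detected(0.02, 0.45)  # small excess: no contact declared
touching = contact_detected(0.02, 1.2)     # large excess: contact declared
```

The appeal of this scheme is that it needs no extra sensors on the fingertips; the trade-off is that friction and acceleration also raise current, which is why backdrivable, low-friction actuators make it workable.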
And the hand emphasizes backdrivability and torque transparency, so that external forces move the fingers naturally, making it better suited for learning-based control and safe physical interaction with objects and people. Now, they're explicitly targeting high-mix, low-volume manufacturing and assembly work that is hard to automate with traditional fixed jigs and rigid industrial arms. The idea for this company is to provide a general-purpose, AI-ready manipulation platform that can be dropped into real factories and warehouses to handle varied tasks in unstructured environments. They see themselves as building robots designed for AI, arguing that current industrial robots are too stiff and too brittle for large-scale reinforcement learning and AI-enabled training loops. Rather than prioritizing walking humanoids, they are emphasizing that the economic value of humanoids is mostly in the hands, and their road map centers on

### Segment 8 (35:00 - 40:00) [35:00]

dual-arm, human-level dexterity with strong integration of learning-based control. Now, we also have Eggy, a new competitor in the humanoid robot race. A new startup called Tangible has come out of stealth with Eggy, a wheeled humanoid robot launching the same day the industry was reacting to the Sunday robot that you saw first. Eggy is a sleek, friendly-looking wheeled humanoid robot designed for everyday home use, and this demo video shows Eggy wiping a kitchen counter, signaling real-world manipulation ability. Now, this was built by a team led by MIT PhDs focused on robotics and AI. A lot of people were comparing Eggy and the Sunday robot on Twitter, and the big differentiator is the hands: Eggy has five-fingered hands, while Sunday's robot has a very simple gripper. Eggy's philosophy is full-stack touch. Tangible is vertically integrating everything, the hardware, the software, the control, the data, and they're focusing on dexterity, compliance, and whole-body control. Their goal is basically robots that thrive in unpredictable, cluttered home environments, not just structured labs. They basically say that robots aren't just embodied AGI, they are the truest form of AGI. What is AGI without touch? So, you know, they collected data.
Engineers were wearing sensor-heavy exoskeletons to record tactile data, like how hard to press and how hard to grip fragile versus rigid objects, and this is aimed at capturing the subtle physical intelligence needed for home tasks. So Tangible joins the rapidly growing home robotics field. I think there are, you know, two fields in robotics: the robots that are going to be in factories, and the ones that are going to be in our homes, personal for us, and the home is where Tangible is bringing this robot. Now, there's also Agile Robots. This is a robotics company from Germany, and they are stepping into the humanoid big leagues. Agile Robots is a Munich-based robotics unicorn with deep German Aerospace Center (DLR) roots, and it has officially entered the humanoid race with Agile One, a bipedal, factory-focused humanoid built for real industry work, not demos. This thing is 174 cm tall and 69 kg, with a payload of 20 kg. It is designed specifically for warehouses and factories, navigating narrow aisles, and working directly alongside human staff. The big selling point is that they claim the world's most dexterous hands. They are going all-in on dexterity: sensors in every joint and fingertip force-torque sensing, aimed at delicate assembly tasks that normal robots struggle with. And this places Agile Robots right in the humanoid hand arms race. Other companies are taking different paths: Tesla is going for 50-actuator, superhuman-precision hands, while Boston Dynamics uses a simple three-fingered gripper. If you've ever seen it, it's pretty cool. And, you know, this company is taking a completely different approach. Super interesting stuff, because we also have them saying that they're only focusing on premium physical AI, the intelligence layer.
Agile One is, you know, trained on one of Europe's largest industrial data sets, and it's actually part of a broader ecosystem called Agile Core, integrating humanoids, robotic arms, and autonomous mobile robots. The goal here is an entire intelligent production system, not just one robot. So there's their DLR legacy, and the key thing here is that Agile One will be fully manufactured in Bavaria starting in early 2026, meaning tight quality control and less outsourcing than competitors. Germany is quietly becoming a powerhouse in cognitive robotics, and they're entering a market where Neura Robotics already exists, so there's going to be some very interesting competition. Now, lastly, we do have Physical Intelligence (PI), which is incredible. This demo is just insane. Please watch this. — Hi, I'm Laura. I'm a researcher at PI. We wanted to see whether we could get a robot to actually help us operate an in-office coffee bar. In order to start this experiment, we went to a local coffee shop and worked with professional baristas in order to understand what the robot could do alongside a person to be helpful. One of the aspects that makes this really difficult from the low-level control side is tool use with a lot of precision. So, picking up that portafilter, inserting it, aligning it, and locking it in. In addition to that is the liquid handling for this task. So the robot needs to be able to pour without

### Segment 9 (40:00 - 42:00) [40:00]

spilling, and that requires a really smooth and steady hand. Now, the robot can make all sorts of drinks like Americanos, lattes, and double-shot espressos, and it's keeping the office all very caffeinated. — Here's your latte. — Thank you. — To understand how to make our models truly useful, we partnered with Dandelion Chocolate to explore automating some of the repetitive tasks in production. So, the reason we chose box building is that it's not just something that's valuable to Dandelion, but it's a really great test bed for our method. In the past, automating a task like this would require specialized, expensive machinery that often doesn't make economic sense. What makes box building challenging is that for it to be useful, it needs to run for many hours without human intervention. Any mistake along the way could mess up the box, mess up the label, and send the robot right back to the start. By training merely on teleoperated demonstrations, we tend to produce policies that fail fast and struggle to recover. By learning from experience and guided corrections, we are able to teach the policy how to recover from difficult situations and avoid taking risky actions that might increase the chance of failure. Hey, what's up? To make robots useful in the real world, they need to be able to go outside of the lab. People need to be able to bring them into their homes and have them actually do useful things. And in this experiment, we wanted to test exactly that: take it into a home it's never been in before, give it clothing items it's never seen before, and see if it can fold them effectively. A lot of our prior models learn from supervised data or demonstration data. With our new approach, we wanted to see if we could have a model that could learn from its own experience. And doing this allows it to see its own mistakes and learn how to correct them.
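The "learning from experience and guided corrections" idea described here is reminiscent of interactive imitation learning (in the spirit of DAgger): the robot runs its own imperfect policy, a supervisor labels the states it actually visits with corrective actions, and retraining on that data teaches recovery from the robot's own mistakes. A minimal toy sketch of that loop, with a made-up 1-D policy, environment, and expert; none of this is PI's actual code:

```python
# Hypothetical toy sketch of "learning from guided corrections"
# (DAgger-style interactive imitation learning). Everything here is an
# illustrative stand-in, not PI's actual system.
import random

class ToyPolicy:
    """Learns one proportional gain from (state, correction) pairs."""
    def __init__(self):
        self.gain = 0.0
    def act(self, state):
        return -self.gain * state
    def fit(self, data):
        # Least-squares fit of correction ≈ -gain * state
        num = sum(-c * s for s, c in data)
        den = sum(s * s for s, _ in data) or 1.0
        self.gain = num / den

class ToyEnv:
    """A 1-D point that drifts randomly unless pushed back toward zero."""
    def reset(self):
        self.x = random.uniform(-1.0, 1.0)
        return self.x
    def step(self, action):
        self.x += action + random.uniform(-0.05, 0.05)
        return self.x

class ToyExpert:
    """Stand-in supervisor: always nudges the state halfway to zero."""
    def correct(self, state):
        return -0.5 * state

def train_with_corrections(policy, env, expert, rounds=3, horizon=50):
    dataset = []
    for _ in range(rounds):
        state = env.reset()
        for _ in range(horizon):
            action = policy.act(state)                      # robot acts on its own...
            dataset.append((state, expert.correct(state)))  # ...expert labels the visited state
            state = env.step(action)                        # mistakes stay in the data,
        policy.fit(dataset)                                 # so recovery gets learned
    return policy

random.seed(0)
policy = train_with_corrections(ToyPolicy(), ToyEnv(), ToyExpert())
print(round(policy.gain, 3))  # → 0.5
```

The key design point matching the transcript: because the expert labels the states the policy itself reaches (including bad ones), the retrained policy sees its own failure modes in the training distribution, which is exactly what pure teleoperated demonstrations miss.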
And we found that for this laundry task, these models that are trained on their own experience tend to be more decisive and have more successful outcomes as a result. We're super excited about this experiment that we ran. We found that we could just bring a robot into a home it hasn't been in before, clamp it onto the table, and it did a very reasonable job of folding a lot of different unseen laundry items. Things are still far from perfect, but we're really optimistic about the progress we've made so far. — And then something I found really funny was the world's first robot dunk. Well, not a dunk, but someone blocking the shot of a robot. It was crazy. There's just no other way to describe this. I don't know how he taught a robot to, you know, use a basketball and dunk. If you go onto his page, all of the simulations he's been doing are super interesting; he's actually been working on this for quite some time. And then he actually blocks the robot from getting a free throw. So, I don't know, this is just pretty funny. Let me know if you guys enjoyed this video. I'll see you next
