So CES 2026 is here, and in today's video I'll be showing you the top AI and robotics announcements from the past two days. Coming in at number one is the Boston Dynamics humanoid robot. At CES 2026, Boston Dynamics officially graduated its famous Atlas robot from viral video star to real-world worker. We've seen Atlas prototypes before, but this was the full reveal of the official product version, the one companies can actually buy to work in factories. The first reason this matters is superhuman movement. If you've ever seen Atlas move, it doesn't move like a human; it moves like a video game character with cheat codes. One of the cool things it has is 360° rotating joints: most of the joints can spin in full circles, so if it needs to pick something up or look behind itself, it doesn't have to turn its feet. It just spins its waist and head 180°. It looks strange, but continuous joint rotation is genuinely useful, because it makes many tasks remarkably more efficient and smoother. It also has a self-swappable battery for continuous operation, meaning these things will be able to get a ton of work done in our economy. This robot has 56 degrees of freedom, which is basically saying it has 56 different hinges or ways to move. For comparison, a human arm and hand have about 27. The new version of Atlas also has tactile sensing hands, which means it can actually feel how hard it's gripping something. It's gentle enough to pick up small car parts but strong enough to lift around 50 kg (110 lb).
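To make that tactile-gripping idea concrete, here's a minimal sketch of force-limited grasping: close a gripper a little at a time until the fingertip sensors report a target squeeze. Every name and number here is invented for illustration; this is not a Boston Dynamics API.

```python
def grip_until(target_force_n, read_force, close_step, max_steps=200):
    """Close in small increments until the tactile sensors hit the target
    force, so a delicate part is held firmly without being crushed."""
    for _ in range(max_steps):
        force = read_force()
        if force >= target_force_n:
            return force            # firm enough: hold here
        close_step()                # otherwise close a little more
    return read_force()

# Simulated object: each closing step adds about 0.5 N of contact force.
held = {"force": 0.0}
result = grip_until(5.0, lambda: held["force"],
                    lambda: held.update(force=held["force"] + 0.5))
print(result)                       # 5.0
```

The point of the feedback loop is that the same code handles a glass part and a steel one: the sensor reading, not a hard-coded motor position, decides when to stop.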
Now, the biggest news about this robot is that it's getting a brain upgrade. Most of you know Google Gemini by now, and at CES 2026 it was unveiled that Boston Dynamics has partnered with Google DeepMind. Because of this Gemini brain upgrade, instead of a programmer writing thousands of lines of code, a manager can simply talk to the robot and say, "Find the blue crates that are out of place and stack them by the door," and the robot will use its onboard AI to figure out which crates are blue and where the door is. Since it's running Gemini Robotics AI, it can also solve problems on the fly: as you can see, when a person walks into its path, it doesn't just stop; it recalculates a new route or waits for them to pass safely. Unlike the older, more fragile versions, the product version is built to work 24/7. Remember, this is a robot that is fundamentally going to change factories, and it never has to sit and charge while it could be working. To my surprise, it also has an IP67 rating, meaning it's dust-tight and waterproof, so you could literally hose it off if it gets dirty in a factory, or it could work in the rain. And beyond how good it looks and moves, it can work in extreme temperatures from -4°F to 104°F (-20°C to 40°C), which makes it perfect for cold warehouses or hot factories. You probably won't see one in your house for a long time, since these are aimed mainly at factories, and all the robots made this year have already been sold to Hyundai and Google. So if you're wondering whether this is like the other home robots: it's not like those at all.

Staying on the topic of robotics, LG introduced its humanoid CLOi (pronounced "KLOH-ee") as part of its vision for a "zero labor home", its idea of a future where you never have to do chores again. Imagine a mix between a personal assistant, a butler, and a very smart table on wheels. Unlike the other humanoid robots, including the Boston Dynamics one we just saw, this is designed to look friendly and fit inside a normal house, and you can see the body shape is a little different from traditional robots. The impressive part is the arms and hands: it has two arms that move much like ours, with seven degrees of freedom, so it can reach, bend, and twist just like we can, and each hand has five fingers, just like we do. The fingers can move independently, allowing it to pick up delicate things like a glass or a folded towel.
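Going back to the Atlas demo for a second: an instruction like "stack the blue crates by the door" boils down to grounding language in what the robot's cameras detect. Here's a toy sketch of that grounding step, assuming a vision model has already produced a list of detections. The detection format and planner are invented for illustration; this is not the Gemini Robotics API.

```python
def plan_stack(detections, color, drop_zone):
    """Pick out crates of the requested color and emit pick/place steps."""
    crates = [d for d in detections
              if d["kind"] == "crate" and d["color"] == color]
    plan = []
    for crate in crates:
        plan.append(("pick", crate["id"]))     # grasp the matching crate
        plan.append(("place", drop_zone))      # carry it to the target
    return plan

scene = [
    {"id": 1, "kind": "crate", "color": "blue"},
    {"id": 2, "kind": "crate", "color": "red"},
    {"id": 3, "kind": "crate", "color": "blue"},
]
print(plan_stack(scene, "blue", "door"))
# [('pick', 1), ('place', 'door'), ('pick', 3), ('place', 'door')]
```

The hard parts (detecting crates, finding the door, moving safely) live in the models; the appeal of the Gemini pitch is that the manager only supplies the sentence, not this code.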
Now, the robot's midsection can tilt and adjust its height, which lets it reach higher counters or lower shelves. Most interestingly, and this really fascinates me because a lot of robots don't opt for it, it rolls on wheels, which are easier to balance. The wheelbase makes it much more stable, so it won't tip over if a dog runs into it or a kid pulls on it, and I think that design choice for home humanoid robots is becoming more and more common as I see more robots pick this form factor. The face is basically a high-tech screen that displays digital eyes to show emotions or information, which is really cool, and it also houses the cameras and sensors it uses to see. And if you look closely at the hands, you can see some onboard vision cameras that help it handle tasks precisely. LG calls the brain of this humanoid "affectionate intelligence", because it's meant to be helpful and caring, not just a cold machine. Apparently CLOi doesn't just follow orders; it watches and learns. If it sees that you usually go for a run at 4 p.m. but it starts raining, it might roll up to you and say, "Hey, it's raining outside. Want to do an indoor workout instead?" It uses something called physical AI, so it can look at a messy room, identify what's a shirt versus a plate, and figure out the exact movements needed to pick them up without breaking anything. And since it's an LG product, it connects to all your other appliances: it can tell the oven to preheat, check whether the laundry is dry, or see if you're out of milk by talking to the fridge. I think that's probably the biggest selling point of this device, that it links to your other devices; otherwise I'm not sure the whole humanoid thing would be as useful as we'd hoped. A humanoid robot that can talk to all of your smart home appliances really could make you remarkably more efficient. As for actual demos, we saw the LG humanoid taking clothes out of the dryer, folding them, and stacking them, and in other demos it was shown fetching milk from the fridge and putting a croissant into the oven to start breakfast. The only bad news is that this is still a prototype, meaning you can't go and buy one at the store tomorrow. LG is still testing it to make sure it's 100% safe to leave alone in the house, and LG has partnered with Nvidia to build the AI brain inside it using some of the most powerful computer chips in the world. So it'll be super interesting to see what the final version looks like, whether they release this version or a second one, but still very interesting.

Now, moving into AI healthcare, there were a few things that were really interesting. This is a concept product from Omnia: a futuristic 360° body-scanning health mirror. Here's the breakdown of how it works and why it's kind of a big deal. The thing is, this is not just a piece of glass.
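Before we dig into the mirror: that LG appliance link-up, the part I called the biggest selling point, amounts to routing a recognized intent to the right device. A minimal sketch of the idea follows; the device names, intents, and commands are all invented for illustration, and LG's real ThinQ integration will look nothing like this internally.

```python
# Hypothetical registry of connected appliances and what they accept.
APPLIANCES = {
    "oven":   lambda cmd: f"oven ack: {cmd}",
    "fridge": lambda cmd: f"fridge ack: {cmd}",
    "dryer":  lambda cmd: f"dryer ack: {cmd}",
}

# Hypothetical mapping from a recognized intent to (device, command).
INTENTS = {
    "start breakfast":  ("oven",   "preheat to 180C"),
    "check milk":       ("fridge", "report milk level"),
    "is laundry done?": ("dryer",  "report cycle status"),
}

def handle(intent):
    """Route one spoken intent to the appliance that can act on it."""
    device, command = INTENTS[intent]
    return APPLIANCES[device](command)

print(handle("start breakfast"))    # oven ack: preheat to 180C
```

The value of the robot here is exactly this dispatcher role: one conversational front-end in front of many dumb, single-purpose devices.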
The Omnia system has two main parts. You've got the mirror, a full-length high-tech mirror that doubles as a giant computer screen, and you've got the base, a floor mat/scale you stand on that's packed with advanced sensors. Together, they perform a deep health screening in about a minute while you're just standing there getting ready for the day. If you're wondering how this works: when you stand on the base, it sends tiny, safe electrical signals through your body, and you won't feel a thing. This is called bioimpedance, and it's how the mirror "sees" inside your body. The mirror then shows a 3D version of you on screen, kind of like a video game character version of yourself, where you can see where you're gaining muscle or losing fat, or even how your posture looks. So it doesn't just track your weight; it measures things like your vascular age (how old your heart and arteries are), your ECG, and even how much water is in your cells. The mirror also has an AI voice companion. Think of it as a much smarter, more empathetic version of Siri or Alexa that actually talks and knows your medical history. You can ask it, "Hey, why am I so tired today?" and it might look at your sleep data from your watch and your heart rate from the mirror and say, "Your recovery heart rate was low last night. Maybe take it easy today." And it gives you daily goals based on your stats, so if your metabolic health is a little low, it might suggest a specific workout.
an extension of the phone intended to reduce screen fatigue. So for those of you who want more time in nature but still want that connectivity, maybe this will be something useful. One thing they also said is that by offloading much of the heavy processing to the paired phone or to the cloud, the device stays cooler and smaller than standalone AI hardware, and it could potentially roll out in late 2026. Interestingly, Motorola has not yet announced a consumer price or a specific release date, but if this does come out, I'll review it and let you guys know how it does. Now, this one is super interesting: Razer's AI Project Ava. This is essentially Razer's AI desktop companion, and it's pretty hilarious and pretty memeable, but at the end of the day it's a product I think people may actually buy. It's a small device you plug into your Windows PC, and it shows you a 5.5-inch animated avatar that looks like it's standing inside the device. You can talk to it, and it talks back; it's basically an assistant for gamers. Razer has positioned it as a gaming coach that can help you improve, learn mechanics, and get tips; a general life/work assistant for schedule planning and brainstorming; and a desk companion with personality, so you can pick different avatars and attitudes. As for why they're calling this a hologram: it's not a Star Wars floating-in-midair hologram. It's a special kind of display inside a little enclosure that makes the 3D characters look 3D-ish from the outside. From the public info, what we can confidently say is that it's described as an animated 5.5-inch 3D holographic avatar on your desk.
Now, what's not clear about this product yet is the exact display tech they're using; the effect is real and visible, but "hologram" here is more marketing shorthand for a 3D-looking avatar in a little display. So how does it know what's going on in your environment? This is the part that makes it feel like a smart companion instead of a little talking toy. It's described as using microphones, a camera, and vision features to understand context. The Verge's demo mentions a built-in webcam that can watch you and your screen, and a mode Razer calls PC Vision Mode, which analyzes what's on your screen with very low delay. It connects over wired USB to a Windows PC, and Razer says the wire is for high-bandwidth data, so it can see and respond quickly. The pitch is that it's not only answering questions; it's also, in theory, reacting to what you're doing on your PC. One of the most interesting things is how this product is marketed, because a lot of people are asking what it can even do. Razer markets it as a do-everything assistant, but here's the realistic breakdown. Number one, its main thing is gaming help. Razer says it's a coach/trainer: the idea is that you ask for help ("How do I get better at X?") and it gives you tips, strategy, and training suggestions, and it's meant to stay within the game's rules; Razer explicitly addressed the cheating concerns. It's also supposed to help you at work and in life, organizing things like dinner plans and helping with professional tasks. And the personality angle, which I think is probably the biggest angle, is the avatars. They've got Kira and Zayn, and the avatars do eye tracking, facial expressions, and lip sync to seem alive, which is why it's getting called an AI assistant. In the demos, people talk to it, and there's a key bind that makes it come alive. Right now, if you want this product, you can reserve it for $20, fully refundable, which goes toward the final price. It's expected to come out in the second half of 2026, and we don't know yet how much it will cost. Once again, if you guys want me to get this product and review it, I'll be happy to, because it does seem super interesting, and the Razer products I already have have done really well so far. This thing also runs on xAI's Grok, so the characters and avatars you've seen in xAI's Grok will be powering these little assistants. It's going to be super interesting to see how people's relationships with these assistants evolve; we don't really know how humans and AI will evolve together just yet, and there are many different characters from many different companies. Now, Razer didn't just give us one AI gadget/gizmo. They also gave us Project Mokco: headphones with microphones, cameras, a Snapdragon chip, and onboard AI models, so you can basically talk to it hands-free and it can respond based on what's happening around you.
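Since Razer hasn't published Ava's actual API, here is only a rough sketch of what a "PC Vision Mode" loop could look like in principle: grab a screen frame, hand it to a vision model, surface one short tip. Both the capture and the model call are stubs I invented for illustration.

```python
def capture_screen():
    """Stand-in for a real screen grab (e.g. over the USB link)."""
    return b"<screenshot bytes>"

def vision_model(frame, prompt):
    """Stand-in for a real vision-model call; always returns one tip."""
    return "Low HP: back off and heal."

def coach(frames):
    """Run the capture -> analyze -> advise loop for a fixed frame count."""
    tips = []
    for _ in range(frames):
        tips.append(vision_model(capture_screen(),
                                 "Give one short gameplay tip."))
    return tips

print(coach(2))
# ['Low HP: back off and heal.', 'Low HP: back off and heal.']
```

The low-delay claim matters precisely because this loop runs continuously: the wired USB link is there so each capture-and-respond round trip stays fast.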
So, Razer decided to make this a headset and not glasses, which is super interesting. Razer's pitch is basically that headphones are already normal: tons of people wear them daily, they're less in-your-face than smart glasses, and they're a very practical place to hide the sensors and mics an AI assistant needs. What makes this AI is the cameras in the ears. Yes, really: Mokco has two first-person cameras around the ear cups, so it can recognize objects, and as you can see right now, it's able to understand what you're looking at. Basically, think of it as your headphones getting eyes. Razer also describes multiple microphones, so it can hear your voice commands, pick up nearby dialogue, and use environmental cues to understand exactly what's going on around you. It's powered by a Qualcomm Snapdragon chip (we don't know which one), and it's meant to be an always-on assistant, so you can say, "Hey, what's that? Summarize this," and it replies in your ears. Razer is essentially marketing it as a wireless AI vision headset that's context-aware and helps you be more effective. Razer also says it's meant to interact with popular AI systems like OpenAI, Google, and Grok; it's basically a front-end wearable that can talk to many different AIs, not locked into one assistant. So depending on how the product is configured, the vision and audio could enable things like translating a sign you're looking at, summarizing a document, identifying an object and explaining it, and coaching you through certain tasks. It's going to be super interesting to see how this works. There will probably be some privacy concerns, but I'll be intrigued to see how this product performs going forward and whether it's widely adopted.

Next, we're taking a look at another AI product: the AI TV. At CES 2026, Samsung didn't just release a smarter TV; they basically turned the TV into a Vision AI Companion, or VAC for short. Think of this less like a screen you watch and more like a massive iPad that lives on your wall and understands your life. The coolest part isn't the screen, it's the VAC software, with partner AI models like Microsoft Copilot and Perplexity integrated directly into the TV. It has a conversational assistant: you don't just search for "action movies", you can say, "I'm in the mood for something like Stranger Things, but set in space," and it will actually think and give you specific suggestions. It also has contextual awareness, so if you're watching a cooking show and see a sandwich you like, you can ask, "Okay, how can I make that?" and the TV will find the recipe and send it directly to your smartphone or your fridge. And this is a mega screen: the flagship model is 130 inches, which is completely crazy, around 11 feet diagonally, enough to cover almost an entire bedroom wall. Now, to compare this to traditional TVs: traditional TVs use a backlight that shines through color filters, while this TV uses micro RGB technology, where millions of microscopic red, green, and blue LEDs are the light source themselves.
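A toy calculation shows why emissive pixels beat a backlight for black levels: an LCD leaks a little backlight through pixels that are supposed to be "off", while a micro RGB emitter can switch fully off. The brightness and leakage figures below are illustrative, not Samsung specs.

```python
def black_level(peak_nits, leakage_fraction):
    """Luminance of a 'black' pixel: panel peak brightness times the
    fraction of light that still escapes when the pixel is off."""
    return peak_nits * leakage_fraction

lcd_black   = black_level(1000, 0.001)  # backlight leaks through: 1.0 nit
micro_black = black_level(4000, 0.0)    # emitter fully off: 0.0 nits

print(lcd_black, micro_black)           # 1.0 0.0
```

With a true 0-nit black, contrast ratio is effectively unlimited, which is how one panel can claim both OLED-style blacks and stadium-screen brightness.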
Now, this is better because it gets the perfect blacks of an OLED, since it can turn individual pixels completely off, combined with the insane brightness of a stadium screen: bright enough to see clearly even if the sun is hitting it directly. There's also a specific feature for sports fans. If you watch soccer or football, the AI can identify the ball, the players, and the crowd noise separately, so if you've ever hated the guy commentating a game, you can tell the TV, "Mute the commentator," and the AI will filter out his voice while keeping the stadium cheering and the sound of the ball perfectly clear. It also uses a new system called Eclipsa Audio to make it feel like sound is coming from all around you, not just from the bottom of the TV. And if you're watching a YouTube video or a show in a different language, the TV can apparently translate it in real time on screen using AI, similar to how live translate works on high-end smartphones. The TV also supports 4K at 165Hz, which is crazy: for a gamer, that means your PS5 or PC will look incredibly smooth with minimal lag and motion blur, and it even uses AI to upscale old, blurry games to make them look like they were made in 2026. And crazily, this thing is future-proof: you're getting 7 years of updates with this product. Usually TVs get dumb after a few years because software updates stop, but Samsung has promised 7 years of upgrades, which means that in 2033 your TV will still be getting the latest updates and features, just like an iPhone or Galaxy does today.

Now, this next one isn't actually a new CES release, but it was going pretty viral on Twitter, and a lot of people were talking about it as if it had been released at CES. The reason they're talking about it is, number one, it rivals something OpenAI is actively trying to create, and it has done very well at previous CES shows. The product is called the Nuwa Pen, and essentially this is what OpenAI is trying to build. Remember how I made a video a few days ago (maybe you saw it, maybe you didn't) about how OpenAI is trying to build an AI-powered pen that captures your writing? Instead of scanning the page afterwards, the Nuwa Pen tries to capture your writing as you write, using a combination of cameras near the tip, motion sensors, and pressure sensing. Think of it as your pen doing some really cool calculations to figure out what you wrote and then uploading that to the Nuwa app. So after you write something on a traditional piece of paper, the Nuwa Plus app can immediately transcribe it, and you can search through it and organize it. It also has an AI assistant feature, so you can ask questions about your notes, depending on the current feature set. Instead of losing your notebooks or forgetting where they are, you basically get a digital brain of your handwritten stuff. I think this is super interesting. The pen weighs around 28 g, measures roughly 143 by 10 mm, and takes a standard D1 ballpoint ink cartridge, so it's basically like a normal pen, not some weird, strange pen.
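The capture idea behind a pen like this can be sketched as dead reckoning: integrate pen-tip motion samples into a stroke, but only while the pressure sensor says the tip is on paper. Real devices fuse camera and IMU data with much more care; this toy just shows the shape of the problem, and the sample format is invented.

```python
def strokes_from_samples(samples):
    """samples: (dx, dy, pressure) tuples from hypothetical motion and
    pressure sensors; returns a list of strokes, each a list of absolute
    (x, y) points. A pressure of 0 means the tip has lifted."""
    strokes, current = [], []
    x = y = 0.0
    for dx, dy, pressure in samples:
        x, y = x + dx, y + dy          # integrate motion into position
        if pressure > 0:
            current.append((x, y))     # tip down: extend the stroke
        elif current:
            strokes.append(current)    # tip lifted: close the stroke
            current = []
    if current:
        strokes.append(current)
    return strokes

data = [(1, 0, 1), (1, 0, 1), (0, 0, 0), (0, 1, 1)]
print(strokes_from_samples(data))
# [[(1.0, 0.0), (2.0, 0.0)], [(2.0, 1.0)]]
```

Once writing is broken into strokes like this, handwriting recognition and search in the companion app become standard sequence-modeling problems.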
Crazily, the price of this is around $299 to $349, and I think it's going to be either a hit or a miss depending on the person. For those of you who love writing but hate losing notes, and who want that paper freedom plus the ability to search through everything, this is probably going to be one of those niche products for you. The reason I think this is so relevant, like I said, is that we know OpenAI is really trying to create a device almost exactly like this. So if you want to know what OpenAI is building, maybe you should even buy this product, look at it and the reviews, and you'll get a sense of what OpenAI will probably release sometime later this year. It'll be super interesting. Now, we also got Nvidia Rubin, which is the next big data center deal for AI after Blackwell. It's basically the engine, memory, and networking stack that Nvidia wants the world to use to train and run the newest, most expensive AI models. It's named after the astronomer Vera Rubin, and the platform is often called Vera Rubin. Think of it like this: Nvidia describes Rubin as a platform made of six major chips designed to work together as an AI supercomputer. You've got the Vera CPU, the Rubin GPU, Nvidia NVLink, and a bunch of other pieces that make the whole Rubin package come together. So when people say Rubin, they mean the whole data center AI