Meta's NEW Secret Robotics Project Is Finally Here!
8:41


TheAIGRID · 08.11.2024 · 13,160 views · 355 likes


Video description
Prepare for AGI with me - https://www.skool.com/postagiprepardness 🐤 Follow Me on Twitter https://twitter.com/TheAiGrid 🌐 Checkout My website - https://theaigrid.com/ Links From Today's Video: https://ai.meta.com/blog/fair-robotics-open-source/ Welcome to my channel, where I bring you the latest breakthroughs in AI. From deep learning to robotics, I cover it all. My videos offer valuable insights and perspectives that will expand your knowledge and understanding of this rapidly evolving field. Be sure to subscribe and stay updated on my latest videos. Was there anything I missed? (For Business Enquiries) contact@theaigrid.com #LLM #Largelanguagemodel #chatgpt #AI #ArtificialIntelligence #MachineLearning #DeepLearning #NeuralNetworks #Robotics #DataScience

Table of contents (2 segments)

Segment 1 (00:00 - 05:00)

So you know that company Meta, the one that did Llama 3 and was absolutely amazing? They've actually done something even more amazing. They're releasing new research artifacts that support their goal, which is essentially AGI, but they're calling it advanced machine intelligence, or AMI for short. This is basically about how their new division is going to be working on robotics, and it is pretty cool.

"Hi, I'm Mike." "And I'm Mustafa, and we're working on touch perception at Meta. Our Meta FAIR team is excited to release several cutting-edge developments in robotics and touch perception that mark a significant step forward in our understanding of the physical world and our ability to interact with it in meaningful ways. Let's begin with Sparsh, the first general-purpose touch representation that works across any tactile sensor and any task. To achieve this, we trained Sparsh using 460,000 tactile images with self-supervised learning. In benchmarks, we find that Sparsh outperforms task- and sensor-specific models by an average of over 95%. Sparsh unlocks pre-trained backbones that allow us to measure properties like forces and contact, which are imperceptible through vision. Our aim is to empower the community to build and scale these models towards novel applications, which leads us to Digit 360, a breakthrough tactile sensor that perceives the world with human-level touch sensing capabilities. With on-device AI, Digit 360 processes information locally and makes decisions inspired by the reflex function in people and animals. This allows the sensor to respond quickly to various inputs, such as the poke of a needle or the flex of a tennis ball, mimicking the performance of a human finger and making it an innovative solution for multiple applications. We've also paired Digit 360 with Digit Plexus, a platform that standardizes robotic sensor connections and interactions to mimic the human hand's touch-information processing for embodied AI. Digit Plexus provides hardware and software components for easy integration of touch-sensing technology into robotic systems. Finally, we are partnering with GelSight and Wonik Robotics, who are leaders in the industry, to develop and commercialize these touch-sensing innovations. Our hope is to bring the community along and work together in driving progress in AI and robotics. Beyond robotics, these advancements can help bring the physical world and the digital world closer together: imagine picking up an object in a game or scanning a piece of fabric during online shopping. This could even lead to advancements in prosthetics. Our hope is to build a thriving community and ultimately make touch sensing practical out there in the real world. This work brings us one step closer to understanding what it means to digitize touch, and that future is almost within reach."

This is actually pretty fascinating, because for the first time we got to see one of these companies focus on something that isn't a mainstream robotics topic: touch perception. It doesn't seem like much at a glance; you might not think it has wide-scale industry applications, but this is one of the things that I think is going to connect us to robotics in a completely different way, and I would say that Meta always focuses on areas that most people simply overlook. The touch-perception work is absolutely crazy because it's going to allow next-level robot-human interaction that understands exactly what it's interacting with. A lot of humanoids we see today have nice hands and are able to touch things, but they don't know exactly what it is they're engaging with unless they're using a vision system. Human hands, by contrast, can be pricked by a needle, grab things, and feel all kinds of sensitivities, which is really important for engaging with the physical world. It's all well and good having a robot arm that can grab things, but what if it doesn't understand how gentle it should be, how much strength or pressure it needs to use, or how hot something is? When we touch something that is very hot, we immediately pull our hands away. Do robots do that? I would argue that isn't something that's currently baked in.

Now, it's pretty crazy what Meta is focusing on; they're working on this as well as a lot of other areas when it comes to AI, and I'm truly excited about this stuff. One of the things they spoke about in their blog post, which is something I didn't even know Meta had, is a sign that they are going to be moving their robotics department forward in a way we haven't seen before. In the video they talked about how they're going to be partnering with some companies, and they also spoke about a benchmark called PARTNR. It says: as we move closer to a future with intelligent robots and advanced AI models capable of performing everyday household chores, it's important to consider their
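The reflex-style behavior described above, a sensor reacting locally to a sharp input before any higher-level planner gets involved, can be sketched in a few lines. To be clear, this is not Meta's Digit 360 firmware or API: the `read_taxels` stub, the threshold values, and the action names are all illustrative assumptions.

```python
# Illustrative sketch of a reflex-style loop for a tactile sensor.
# NOT the real Digit 360 API: read_taxels() and the thresholds below
# are hypothetical stand-ins for on-device sensing and tuning.

SHARP_PRESSURE = 50.0   # per-taxel pressure (arbitrary units) suggesting a needle-like poke
MAX_GRIP_FORCE = 20.0   # average force above which the grip should loosen

def read_taxels():
    """Stand-in for reading the taxel array; returns per-taxel pressures."""
    return [1.2, 0.8, 55.0, 2.1]  # one taxel spiking, e.g. a needle poke

def reflex_step(taxels):
    """Local, low-latency decision made on raw readings, before any planner runs."""
    peak = max(taxels)
    mean = sum(taxels) / len(taxels)
    if peak > SHARP_PRESSURE:
        return "retract"        # sharp localized contact -> pull away
    if mean > MAX_GRIP_FORCE:
        return "loosen_grip"    # overall squeeze too strong -> ease off
    return "hold"

action = reflex_step(read_taxels())
print(action)  # -> "retract" for the sample readings above
```

The point of the sketch is latency: the decision uses only the current taxel frame and two constants, so it can run on-device at sensor rate, which is the "reflex" idea the video describes.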

Segment 2 (05:00 - 08:00)

interaction with humans, and that's why we're releasing PARTNR, a benchmark for planning and reasoning tasks in human-robot collaboration, designed to study human-robot collaboration in household activities. Basically, what they said is that they're going to be training and testing social embodied agents on physical hardware with actual human partners in this environment. Of course, training robots is very hard, and a lot of the time what we do is train them in a simulated environment; that's exactly what they're going to be doing here. They're going to be using something called Habitat 3.0, a highly realistic real-time simulator that supports both robots and humanoid avatars and allows for human-robot collaboration in home-like environments, with the goal of testing physical-world scenarios, which is really incredible. Habitat 3.0 may seem a little bit weird, but this is an essential scenario for companies to place themselves in, because if you're going to deploy humanoid robots at scale, you need a system that lets you test and ensure that these robots are safe. These environments are basically similar to real households, and the researchers are going to use this software to study embodied AI agents that can navigate, interact, and assist in the same environment as humans while addressing major real-world challenges, which I think is really important. Once again, having Meta work on realistic embodied agents in virtual environments alongside humans is going to be essential for us to advance in this area.

There's also something else this kind of technology enables: prosthetics. Most people aren't going to be thinking about prosthetics, because the majority of us aren't disabled; we have all of our limbs and are fortunate enough to be able-bodied. But for those who do have to wear a prosthetic, one of the things that would definitely help is having Digit 360 baked into the prosthetic limb, because it would allow for realistic touch sensation. With over 8 million taxels capturing detailed tactile information, Digit 360 can provide these individuals with nuanced touch sensation, which means a prosthetic limb can relay information about pressure, texture, and even the subtle contours of an object directly to the user. It would also offer temperature detection, the ability to sense and perceive temperature changes, which is crucial for safety and for performing tasks that require temperature sensitivity. You would get vibration awareness too: detecting vibrations helps users interpret environmental cues, such as the operation of machinery or the texture of a surface as they run a hand over it. This would also allow for improved motor control and dexterity: with on-device AI baked in, the prosthetic can process sensory information and react in real time, giving immediate reaction speeds much like humans have and allowing immediate adjustments in grip and movement when interacting with objects. It also enables fine manipulation: detailed tactile feedback enhances the user's ability to perform delicate tasks, such as picking up small objects, typing, or handling tools, by providing precise control over these movements. And think about it like this: right now we're not in the craziest place when it comes to prosthetics or BCIs, which are brain-computer interfaces, but imagine, when combined with BCIs, how Digit 360 could facilitate more natural control of a prosthetic limb by directly interpreting neural signals and providing sensory feedback to the nervous system. Overall, this seems like something that's going to be absolutely incredible, and I can't wait for it to be baked into prosthetics so that individuals can have a better quality of life and, of course, so that we can advance.
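The feedback channels listed above (pressure, texture, vibration, temperature) can be thought of as simple reductions over a stream of taxel frames. The sketch below is purely illustrative: the frame format, the channel mappings, and the 45 °C safety threshold are assumptions, not any real Digit 360 or prosthetics interface.

```python
# Illustrative mapping from raw taxel frames to the feedback channels
# discussed above. Frame format and scaling are hypothetical; no real
# prosthetic or Digit 360 interface is implied.
from statistics import mean, pstdev

def feedback_channels(frame, prev_frame, temp_c):
    """Reduce one taxel frame to pressure/texture/vibration/temperature cues."""
    pressure = mean(frame)                  # overall contact force
    texture = pstdev(frame)                 # spatial variation ~ surface roughness
    vibration = mean(abs(a - b) for a, b in zip(frame, prev_frame))  # frame-to-frame change
    too_hot = temp_c > 45.0                 # safety cue: withdraw threshold (assumed)
    return {"pressure": pressure, "texture": texture,
            "vibration": vibration, "too_hot": too_hot}

cues = feedback_channels([3.0, 5.0, 4.0, 4.0], [3.0, 4.0, 4.0, 4.0], 22.5)
```

Each channel here is just one statistic over the frame; a real system would run learned models per channel, but the decomposition into separate sensory streams is the idea the transcript describes.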
