Meta's New AI App Is Set To Rival ChatGPT - Your New Personal Assistant
11:08


TheAIGRID · 04.05.2025 · 13,082 views · 315 likes


Video description
Join my AI Academy - https://www.skool.com/postagiprepardness
🐤 Follow me on Twitter: https://twitter.com/TheAiGrid
🌐 Check out my website: https://theaigrid.com/
Links from today's video:

Welcome to my channel, where I bring you the latest breakthroughs in AI. From deep learning to robotics, I cover it all. My videos offer valuable insights and perspectives that will expand your knowledge and understanding of this rapidly evolving field. Be sure to subscribe and stay updated on my latest videos. Was there anything I missed?

(For business enquiries) contact@theaigrid.com

Music used:
LEMMiNO - Cipher: https://www.youtube.com/watch?v=b0q5PR1xpA0 (CC BY-SA 4.0)
LEMMiNO - Encounters: https://www.youtube.com/watch?v=xdwWCl_5x2s

#LLM #Largelanguagemodel #chatgpt #AI #ArtificialIntelligence #MachineLearning #DeepLearning #NeuralNetworks #Robotics #DataScience

Contents (7 segments)

Intro

So, Meta just introduced a new AI app that is set to revolutionize the way we interact with AI and social media. In today's video, I'll be giving you guys the complete rundown of everything you need to know. Take a listen to what Mark Zuckerberg says about his own AI app, and then we can dive into the really specific details. All right. Hey

Meta AI

everyone, we built a new thing for you. There's almost a billion people using Meta AI across our apps now, so we made a new standalone Meta AI app for you to check out. Meta AI is designed to be your personal AI. That means, first, it's designed around voice conversations. You open up the app and you can talk to it about whatever you want, from the news to an issue you're dealing with to just anything you want to learn about. It's also designed to be personalized. We're starting that off really basic, with just a little bit of context about your interests, but over time you're going to be able to let Meta AI know a whole lot about you and the people you care about from across our apps, if you want. We also designed a social feed so you can discover all kinds of different ways people are creating stuff with Meta AI. It's really quite fun to check out. And you're also going to be able to use the app to manage your Meta glasses and other kinds of AI devices we're going to be building in the future. So anyway, this is the beginning of what is going to be a long journey to build this out, but go check it out and let us know what you think. So one of the first key things they actually talk about here is the fact that this AI is going to have incredible memory. And I think that's a really important feature for any AI if we're supposed to evolve with these AIs into the future. So we wanted to focus

Memory

on pushing the limits and a fresh take on how people could use AI. We were very focused on the voice experience, the most natural possible interface, so we focused a lot on low-latency, highly expressive voice. The experience is personalized: you can connect your Facebook and Instagram accounts, and the assistant will have a rough idea of your interests based on your interaction history. It will also remember things you tell it, like your kids' names and your wife's birthday and all the other things you want to make sure your assistant doesn't forget. So coming back to the memory thing, I think once again it's remarkably important to have a memory feature, because oftentimes people don't realize what it's like to have a conversation with an AI that knows things about you. The conversations are a lot deeper, a lot more thoughtful, and there's a lot more nuance the AI can bring, which makes the conversation a lot more enjoyable. Now, I do think this memory feature is part of the AI industry's wider push to keep individuals using their tools for longer, and having memory is definitely one of the most effective ways to do that. Especially with memory, you have to understand that Meta is really building an incredible app and ecosystem. When you have memory built into an AI, you are much less likely to leave that AI. So it's really important for these companies to build in that feature, because over the long term it's going to be more of a hassle for you to switch from your base AI, which has all of your memories and preferences, to a completely new AI system. I think it's going to be quite like an operating system, in the sense that some people are iPhone users and some are Android users, and most people probably won't switch over the course of their lifetime. We

Voice

added an experimental mode which is full-duplex voice. This means it's trained on native dialogue between people in speech, so rather than the AI assistant reading out a text response, you're actually getting native audio output. And full duplex means the channels are open both ways, so you can hear interruptions and laughter and an actual dialogue, just like a phone call. So it's pushing the limits of what we think is possible with a natural voice experience. It's early, so the duplex mode doesn't have tool use and it doesn't have web search; you can't ask it about NBA trades or, say, the papal conclave. However, it's a very interesting way to see what's possible at the frontier. So, you can see here Meta have also introduced their own voice mode. Once again, I do think this is going to be the next step in the evolution of how people interact with LLMs. Oftentimes, yes, you're going to be copying and pasting code and articles and all of that stuff, and that is very useful. But for the day-to-day, for the average user to have a great experience, and for the large majority of people who really don't use AI, the main thing they're going to be doing is just talking to the AI like an average person. And I think this form factor is probably going to become one of the most widely adopted ways of interacting. So Meta having this full-duplex mode, where you can talk back and forth with an AI, is something that I think is definitely going to put them on the map, because there are going to be interactions that will probably go viral. It also helps to anthropomorphize the AI, in the sense that you can talk to it like it's a person. Now, I'm not even going to mention the loneliness problem that is sweeping the United States and many Western countries, but I do think this will be increasingly popular as it gets even better over time. There are going to be different personalities and different voices.
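The "full duplex" idea described above, both channels open at once so each side talks while it listens, can be illustrated with a toy concurrency sketch. Everything here is hypothetical: queues of strings stand in for real-time audio streams, and none of it reflects Meta's actual implementation.

```python
# Toy illustration of full-duplex communication: two independent channels
# stay open simultaneously, so both parties "speak" and "listen" at the
# same time instead of taking turns. Strings stand in for audio frames.
import asyncio


async def speaker(outbox: asyncio.Queue, lines: list[str]) -> None:
    """Stream utterances into a channel, yielding so the other side can interleave."""
    for line in lines:
        await outbox.put(line)
        await asyncio.sleep(0)  # give other tasks a chance to run
    await outbox.put(None)  # end-of-stream marker


async def listener(inbox: asyncio.Queue, log: list[str]) -> None:
    """Consume a channel until the end-of-stream marker arrives."""
    while True:
        msg = await inbox.get()
        if msg is None:
            break
        log.append(msg)


async def full_duplex() -> tuple[list[str], list[str]]:
    user_to_ai, ai_to_user = asyncio.Queue(), asyncio.Queue()
    heard_by_ai, heard_by_user = [], []
    # All four tasks run concurrently: each side talks while it listens,
    # which is what distinguishes full duplex from turn-taking (half duplex).
    await asyncio.gather(
        speaker(user_to_ai, ["hello", "what's the weather?"]),
        speaker(ai_to_user, ["hi!", "sunny today"]),
        listener(user_to_ai, heard_by_ai),
        listener(ai_to_user, heard_by_user),
    )
    return heard_by_ai, heard_by_user


heard_by_ai, heard_by_user = asyncio.run(full_duplex())
print(heard_by_ai)   # ["hello", "what's the weather?"]
print(heard_by_user) # ["hi!", "sunny today"]
```

In a real system the queues would carry audio buffers and the "speakers" would be a microphone stream and a speech model emitting native audio, but the structural point is the same: neither side has to wait for the other to finish.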
So, this is going to be something that, of course, Meta are trying to push. We also

Discover

wanted to make this fun. And we know a lot of people getting started with AI have no idea what to do with it; it's not until they see the way others are using it, doing stuff like them, that they get inspired. So whether it's homework or creative writing or AI artwork or code, the community is actually quite mimetic, in that we learn from seeing each other do it. So we put this right in the app. You can share your prompts and art. It's super fun. We've really been enjoying prototyping this, because it makes the experience a lot more creative and a lot more social. And so, with the discover feature, I do think this is game-changing. I've seen a little bit of this with Sora. If you aren't familiar with OpenAI's Sora platform, there's a little bit of a social aspect there, in the sense that you can actually see what other people have created, see what prompts they're using, and use other people's prompts to get the same results. So I think the discover feature is going to be really big, because I think online communities are going to be really important in the future, and one of the things these companies are doing now is allowing people to interact with each other when they have a shared goal. When you generate an image prompt, have a conversation, or make a video in the platform, it's going to be really interesting to see how that evolves. And of course, Meta already have Instagram, Facebook, and WhatsApp, so they already have a billion users across social media. So it really is going to be interesting to see how that becomes integrated, because most people do have Instagram and WhatsApp; it's literally just connected to your phone number. So I do wonder if this kind of social media platform will just evolve as AI does evolve as

Meta Ecosystem

well. The last piece we focused on is pairing with your Ray-Ban glasses. These are incredibly popular; they're the ultimate AI device of today. They're multimodal, so you can ask questions about what's in front of you. Again, it's a voice interface, so the app focuses on taking that and making it coherent, whether you're using the glasses or just your phone. The next thing you can see Meta talk about here is their entire ecosystem, and I think this is probably the most underrated part. One of the largest form factors of the future is going to be glasses, and those glasses are actually really good. I've used them for quite some time now, and I honestly haven't found myself using anything else when it comes to taking pictures and just living in a way that's so connected with AI. Of course, you can have your phone and just message the AI, but the glasses are an entire form factor that's already connected to your whole ecosystem. So I can ask the Meta AI glasses, "Hey, what about this? Can you take a picture of this?" and then my chat is going to be able to reference that picture in later conversations. That is something I think is completely underrated about the glasses, and if you haven't used them, you probably won't realize just how impactful they are. They are a bit on the pricey side, around $300 or so, but I do think they will become the inevitable form factor for people interacting with technology in the future. And Mark Zuckerberg does agree with that. I think

Meta Glasses

glasses are going to be the next major computing platform, but each new platform doesn't tend to just replace the old one, right? So the version of this that I think about is: you probably have this experience often where you're sitting at your desk with your computer there, yet you still pull out your phone to do things. That's true. Okay. So at some point in the last 10 years, mobile really became the primary computing platform. We didn't get rid of our computers; it's just that even when you have one, you still do more things on your phone. So what I think is going to happen with glasses is we're going to get to a point, probably sometime in the 2030s, where you still have your phone with you, but it's going to stay in your pocket more, because you're just going to be doing more and more things on your glasses that today you would do on your phone. You'll reach a point where, just like with your computer, there are probably some things that could be done in a richer way on your phone, but the glasses will be your main computing platform and your default go-to thing. And then maybe over time you get to a point where people just don't bring their phone with them everywhere, but I think that's really far down the line. Now, Meta also have a canvas feature on the web application. So remember, it's not just a mobile application; it is currently a web application too. I'll be releasing a full tutorial diving into exactly how to use this platform, because it is really comprehensive, considering it's basically free for all of the features you would naturally use in many other AI ecosystems like ChatGPT, and oftentimes even a little bit more. And one thing I will say is that oftentimes companies don't really get the design language down.
A lot of times I'm talking with an LLM and things are just designed in a way that isn't user friendly and isn't intuitive; you have to go through so much documentation to find things. But with Meta's platform, having the canvas feature and the other features there definitely shows me that they've spent some time designing this web app to be really user friendly, even for beginners. I mean, when you're publishing something to over 100 million users in America alone and over a billion users globally, you definitely have to make sure people can use it right off the bat. Now, of course, Meta also added image generation into their software, and like I said, the design is really well thought through. Most beginners won't know any design styles, and that's not their fault; unless you're working in the creative industry, you're not going to know all of the different design styles that are possible. So this is something that's really nice: they've got the visual editor right there, and you can literally make images almost immediately. It's once again really useful for those of you who are just getting into the AI space and want to experiment with tools on Meta's platform in a way that's relatively cheap. So overall, I do think Meta have done an outstanding job here, and I'll definitely be exploring the platform later on.
