Apple Finally Unveils "Apple Intelligence" Ai agents On Device and More
Duration: 24:31


TheAIGRID · 10.06.2024 · 12,400 views · 342 likes


Video description
Join My Private Community - https://www.patreon.com/TheAIGRID 🐤 Follow Me on Twitter https://twitter.com/TheAiGrid 🌐 Checkout My website - https://theaigrid.com/ Links From Today's Video: https://www.youtube.com/watch?v=RXeOiIDNNek&pp=ygUKYXBwbGUgd3dkYw%3D%3D Welcome to my channel, where I bring you the latest breakthroughs in AI. From deep learning to robotics, I cover it all. My videos offer valuable insights and perspectives that will expand your knowledge and understanding of this rapidly evolving field. Be sure to subscribe and stay updated on my latest videos. Was there anything I missed? (For Business Enquiries) contact@theaigrid.com #LLM #Largelanguagemodel #chatgpt #AI #ArtificialIntelligence #MachineLearning #DeepLearning #NeuralNetworks #Robotics #DataScience

Table of contents (5 segments)

Segment 1 (00:00 - 05:00)

So Apple have recently done their WWDC event, where for the first time they actually spoke about AI. In this video I'm just going to go through around ten things that I think are probably somewhat important, and then I'll give my opinions on some of them.

One of the things a lot of people were memeing about, and something I found quite hilarious, was the generative text messages. Taking advantage of generative image features, which are now prevalent across many different apps and chatbots, Apple has essentially baked in image generation, and what's actually kind of interesting is that the image you're seeing here was generated on device. It seems Apple's recent advances in machine learning and generative AI have got them to a stage where a lot of things can be generated on device, and they do it this way because it keeps your information a lot more secure. So you can generate images within your texts, based on the image you see above, and you get this image right here; you can basically create images based on what the conversation is about, and of course based on the person you're talking to. It's a very small feature, nothing absolutely crazy. We also have another example where they've generated another image based on that person, and I guess the nice thing is that you get some personalized interaction, which allows for a decent amount of customization. One thing I am wondering about is character consistency: when you generate an image from a reference, for example this image as the reference and this one right here, it actually looks really nice, but sometimes those image references kind of drift between generations, and I'm wondering if they've somehow managed to solve that.

Now, I think one of the biggest things Apple spoke about, and arguably the biggest thing for Apple and many other phone makers going forward (I wouldn't be surprised if other phone makers take a leaf out of Apple's book here), is Apple's actions. This is essentially where Apple goes into their agentic framework. If you were paying attention to Apple's recent research papers, you'll know they've been trying to control the device with an on-screen AI agent and have it do a multitude of different tasks. Apple's actions are essentially where you can combine different apps and different actions with a simple text prompt; well, I say text prompt, it's actually a voice prompt. You can see you could just say: play the song Ray texted me, delete my birthday ideas group tab, create a folder called Sketches, flip the camera, email this presentation, play the podcast my wife sent me the other day. I think this is pretty interesting because, like I said before, it's the first demonstration of what many devices are going to look like in the future. As many have said, the final version of AI we interact with day to day is very unlikely to be a chatbot we just type at; the future is of course agentic, and we're likely to have these things done for us instead of interacting with them in their current state. So I'm wondering if other companies like Samsung and the Android makers are looking at this and thinking, okay, we're going to have to build an agentic framework pretty similar to this so we can get these things done.

I do hope it's actually effective, because a lot of the time the shipped product isn't as good as the presentation. One thing they didn't show is whether you can chain ideas together: every single action they demoed is independent, so you can do one thing and then try another action, but they only showcased a handful of things you could potentially do. This is something with Apple's actions that I guess we'll have to
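The gap the demo leaves open, independent one-shot actions versus chained ones, can be sketched with a toy dispatcher. Everything below is invented for illustration; none of these function or trigger names come from Apple's actual frameworks.

```python
# Hypothetical sketch: one voice prompt maps to exactly one independent
# action, mirroring what the WWDC demo showed. All names are made up.

def play_song(title):
    return f"playing {title}"

def create_folder(name):
    return f"created folder {name}"

def flip_camera():
    return "camera flipped"

ACTIONS = {
    "play the song": play_song,
    "create a folder called": create_folder,
    "flip the camera": flip_camera,
}

def run(prompt):
    # Match the prompt against known triggers; the remainder of the
    # prompt (if any) becomes the action's argument.
    for trigger, action in ACTIONS.items():
        if prompt.startswith(trigger):
            arg = prompt[len(trigger):].strip()
            return action(arg) if arg else action()
    return "no matching action"

def run_chain(prompts):
    # "Chaining" here is just sequencing separate prompts; the demo never
    # showed one action's output feeding the next, which is the gap the
    # commentary above points out.
    return [run(p) for p in prompts]
```

The point of the sketch is that sequencing independent actions is easy; the hard, undemonstrated part is passing results between steps.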

Segment 2 (05:00 - 10:00)

wait until this is actually released, so we can see how good it is, because like I said, a lot of the time what we eventually get isn't as good as the demos make it out to be. Apple's actions will be something, once it takes off in further iteration cycles and they make it a lot better, where you can say, "go ahead and look at these images, then do this, and then do that"; I think chaining is what will really make it effective. This runs on device, or through their Private Cloud Compute when the cloud is needed, and that was something they spoke about: whatever data you're using, whatever you send to Apple's cloud when it can't be handled on device, all of it is going to be remarkably secure. It's a bit boring, but they're basically saying they're taking a different approach from other companies, because they're also allowing the code, I believe, to be inspected by third parties so the security claims can be verified, which I think is a remarkable step for Apple, because they're not usually transparent about these things.

Now, in further news with the Apple actions, they also showed the ability to interact with Siri in a text format. This was quite fascinating, because I didn't think we would get it. It doesn't seem like much, since interacting with an AI by text is something people who use AI systems on a day-to-day basis have done quite extensively, but for a Siri user I think it cuts down the time you'd otherwise spend. That said, they didn't really show anything, if I'm being completely honest with you, that was truly remarkable; you know how certain tech demos make you go "oh, that's actually really cool"? This was honestly things like setting an alarm and checking the weather. So I have to be honest and say that this Apple actions thing, whilst it might be good for a few hands-free tasks you couldn't otherwise do (Siri is becoming a bit more useful than it used to be, because Siri is pretty useless beyond the basics), is something they'll have to build on at next year's WWDC, because right now it seems to be in pretty early stages.

They also showed something where you can ask Siri about your device. I'm guessing this is based on a very large document, hosted either on your iPhone or somewhere in the cloud, where you ask a question and it retrieves the answer from that one long document based on whatever query you have. I'm guessing they mapped out pretty much every question you might have about your iPhone so you can really understand how to do certain things, because honestly, a lot of what we use every day has hidden features we never discover. So this might be relatively useful: once you ask Siri about your device, it pops up a short informational tidbit on whatever you asked. It increases the usability of the device, but it's not so crazy that people will think it's completely game-changing.

Now, there was also something called App Intents, which lets you do cross-app actions, pretty similar to the Apple actions, and App Intents lets you do those things with, I guess you could say, agentic capabilities. Like I said, it's not crazy in its current phase; they didn't show insane demos, it was truly some really basic stuff. But now that they have this platform, where they know they can work with certain apps, I think in the future there will be a lot more support for the apps you use day to day, because they announced that App Intents will be exposed as an API, which means it's something for developers. And with that we have an interesting situation on our hands, because we know that developers, rather than the actual company like
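The "ask Siri about your device" behavior guessed at above reads like retrieval over one long help document. Here is a minimal sketch of that idea, with made-up questions and a naive word-overlap score; this is my assumption about the mechanism, not anything Apple has described.

```python
# Hypothetical sketch: answer a device question by finding the
# best-matching entry in a help document. Data and scoring are
# illustrative only, not Apple's actual pipeline.

HELP_DOC = {
    "How do I schedule a message?":
        "Touch and hold the send button, then pick Send Later.",
    "How do I scan a document?":
        "In Notes, tap the camera icon and choose Scan Documents.",
}

def answer(query):
    # Score each known question by how many words it shares with the
    # query, then return the best match's answer.
    q = set(query.lower().split())

    def score(question):
        return len(q & set(question.lower().split()))

    best = max(HELP_DOC, key=score)
    return HELP_DOC[best] if score(best) > 0 else "No match found."
```

A real system would use embeddings rather than word overlap, but the shape (query in, best passage out) is the same.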

Segment 3 (10:00 - 15:00)

Apple itself, usually make the really useful stuff, because of course they'll try their very best to make sure their app takes advantage of all the iPhone's features to build the best consumer product. With the API, it's likely that a lot of solo developers, and developers outside these giant companies, will use App Intents to build products that let you do things you previously couldn't do on the iPhone.

In addition, there was this small Apple Intelligence icon, which is interesting because it surfaces some basic AI functionality we've already had elsewhere. Here you can see someone writing an email on a MacBook, and on the left-hand side there's a feature called Rewrite. Rewrite is essentially a feature where you can rewrite your emails as friendly, professional, or concise, or summarize key points as a table or list. Like I said, it's not too game-changing; it's basically a wrapper over something we've already seen. You can also describe any change you want made to your text. Nobody was surprised by this, because you could literally have done it with GPT-3.5, and we've seen it in apps that have existed for over a year. So I've got to be honest, this point was pretty underwhelming. There are a million websites, extensions, and pop-ups that already do exactly this, with menus that look much the same, and I guess it's going to be difficult for the owners of those products, because a lot of what we're seeing from these companies is features that already exist on third-party sites being baked in natively, now powered by Apple's on-device AI.

Now, this next thing is probably one of the most useful, and I think it's a little bit innovative because I haven't seen it done before: inbox summaries. On iOS, when you get ten emails or whatever, a lot of the time you won't know exactly what's going on, so it shows a short summary that condenses each email. At a glance, in literally two sentences, you get exactly what the email is about: next major review in two weeks, schedule a meeting this Thursday, retail partner visits confirmed, and so on. I think it's very effective; you don't have to fish through all your emails. And then we also got this little Summarize button: any time you open a relatively long email, you get a button that summarizes it so you can see what the email is about.

Another thing that will probably make you a bit more effective is priority notifications. I don't think it's too crazy, but given that it uses Apple Intelligence, you might get a smarter sense of which notifications actually matter to you right now. I'm guessing it uses their small on-device model to summarize what's happening and to prioritize what's imminent over what's happening tomorrow; you can see one of these is around 10:00 tonight and another is at 10:30 a.m., so I guess it looks at what's going on now, the time on your device, and pushes those notifications to the top of your list. A little bit innovative, but not something developers couldn't have built before.

Now, we also got Genmoji. This just takes advantage of generative AI capabilities and builds on the vast set of emojis Apple already has: using a simple text
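The priority-notification guess above, rank by how imminent each event is relative to the device's current time, can be sketched like this. The heuristic is my assumption for illustration, not Apple's published logic.

```python
# Hypothetical sketch: sort notifications so imminent events bubble to
# the top, and timeless or already-passed items sink to the bottom.
from datetime import datetime, timedelta

def prioritize(notifications, now):
    # notifications: list of (title, event_time or None) pairs.
    def urgency(item):
        _, when = item
        if when is None or when < now:
            return timedelta.max  # no deadline, or already passed
        return when - now        # soonest future event ranks first
    return sorted(notifications, key=urgency)
```

Python's `sorted` is stable, so items with no deadline keep their original relative order at the bottom of the list.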

Segment 4 (15:00 - 20:00)

prompt, you can use Apple Intelligence to describe a new emoji for whatever novel situation you're facing, where an existing emoji just isn't enough to capture the essence of what you're feeling. Here we have a lock bagel and a scroll DJ, and like I said before, it's not going to revolutionize anything; it's a consumer add-on that makes conversations a little more interesting. I wouldn't be surprised if we start to see some viral posts about these Genmoji, but it's something I thought I'd include because it does take advantage of generative AI.

There was also Image Playground, which gives you on-device image creation based on whatever is going on on your device. If you're texting someone, you can make an image really quickly; it generates around three different images based on whatever you're talking about. You can see a party, a chef, and a cat, and it uses these to create an image you can quickly send, and you can edit it in three different styles. Literally just generative AI images for messaging.

Alongside Image Playground there's Image Wand, which I'd say is a pretty niche feature without mass consumer appeal. In the Notes app, if you have a rough sketch, say of a church, a cathedral, or some kind of structure, you can circle it with the wand and have it converted into a proper image. It's able to describe your sketch with certain words and understandings, and the example they gave was lecture notes on architecture in India from the 15th to 18th centuries, where the generated image illustrates the lecture. A pretty interesting way to use generative AI on device, but like I said, nothing too crazy.

Another thing we got was actual natural-language search in photos and videos, so you don't need to search for specific keywords. I thought Apple's search was already really good; the one I have on device now really finds things when I search. But now, if you screenshotted something on your iPhone and know exactly what it is, you can describe it in natural language and it will find it, which takes advantage of generative AI.

And then there was something rather interesting: Apple Intelligence can create custom content from the stuff that exists on your device. If you've had an iPhone, you'll know it already takes your existing photos and turns them into a montage or short film built around your memories from a certain period. Now you can use Apple Intelligence to direct that: you can say "Leo learning to fish and making a big catch", set to a fishing tune, and it creates a custom video from the content on your device. You could do "last summer in our garden", or "everything that happened in November"; for that one you'd have to be a bit more creative, but I'd love to see how those custom videos come out based on the selection it makes. Not really crazy, but I think more and more products will offer this in the future.

Now, we also finally got the ChatGPT integration with Siri. A lot of people ask Siri a billion different questions, as many people do all the time, and I'm guessing that since whatever AI system Siri runs, on device or in the cloud, prioritizes speed over accuracy, it's going to have to sometimes

Segment 5 (20:00 - 24:00)

use ChatGPT to help complete the task. You can see that sometimes it offers to use ChatGPT for a specific request, so it will analyze what kind of task you want, whether you need images or more accuracy, and then it will probably go and use the server. I don't think it will use the latest model by default; I think it will let you sign in here so you can truly take advantage of the latest proprietary models. I'm guessing sign-in will be a feature, and without it you might just have it set to an older model like GPT-3.5. I am wondering, though, whether this will overload OpenAI's servers, because Apple's distribution is huge: this level of integration will bring an enormous number of people onto the software. It will be interesting to see how Apple and OpenAI manage the compute capacity, because previously we've always seen rate limits and caps on how much you can use these AI systems. So the ChatGPT integration wasn't anything crazy; it was no GPT-4o moment where Siri does something wild in the demonstration, it was relatively straightforward. We also got a final ChatGPT feature on the Mac: when you're writing notes, you can simply add images or change your wording.

Overall, on this WWDC Apple Intelligence announcement, I wouldn't say it was overhyped, but people in the community who treated this event as if it would be some kind of crazy announcement maybe didn't understand Apple's philosophy, which is to focus on, I guess you could say, perfectionism rather than being first to market. Apple wouldn't care if Samsung and other Android competitors took a larger market share for a while, because of the distribution and reputation they have. Apple would rather roll out small software iterations more slowly, as long as they work, because they can't afford a Google-style situation where the AI does something unexpected; that would ruin their brand image. It will be interesting to see when Apple finally takes the giant leap on generative AI; I think that leap will probably be next year, because by then a lot of the hallucinations and limitations of today's models may be solved.

So let me know what your favorite feature from the entire event was. Was it the Genmoji? Was it the inbox summaries, which let you stay on top of your priorities and understand exactly what's going on? Are you excited for the Apple actions that will let you build some interesting integrations? I do think that if those integrations were more thorough, they probably would have shown it, because the demos we got were relatively basic; there was one where they said "edit this picture" and it literally just edited the picture in a very small way. I would have included some of the video clips, but for whatever reason Apple sometimes copyright-strikes the footage and I'm not able to upload it, so I had to use these images. If you want to watch the full stream, I'll leave a link in the description. Like I said before, I don't think this update was that crazy, but I do think the actions are probably what developers and individuals will be watching, because if other Android makers see this and are able to do something similar, it could revolutionize how we interact with phones on a day-to-day basis. So that is basically it for WWDC 2024 in terms of the AI segment. I'm sure other companies and individuals will be giving their opinions on what they'll do in the future with regard to Apple, but I think this was relatively interesting. With that being said, if you enjoyed this video, don't forget to check out some of the links down below. I'll see you all in the next one.
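The on-device / Private Cloud Compute / ChatGPT split discussed through this video suggests a routing decision roughly like the sketch below. The thresholds, the permission flag, and the tier names are all invented for illustration; Apple has not published this logic.

```python
# Hypothetical sketch of how a Siri request might be routed between the
# on-device model, Private Cloud Compute, and ChatGPT. Entirely invented;
# not Apple's actual policy.

def route(prompt, needs_world_knowledge=False, user_opted_in=False):
    if needs_world_knowledge and user_opted_in:
        # Per the keynote, Siri asks permission before each ChatGPT hand-off.
        return "chatgpt"
    if len(prompt.split()) > 50:
        # Assumed: larger requests spill over to Apple's own cloud tier.
        return "private_cloud_compute"
    return "on_device"
```

The interesting design point is the explicit opt-in gate: the third-party model is never the silent default, which fits the privacy stance described above.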
