ChatGPT's NEW Voice Mode, Group Chats, Shopping Research & More!

The AI Advantage · 28.11.2025
Video description
Get your new AI development companion Matter by JetBrains today! 👉 https://jb.gg/aiadvantage-Matter

ChatGPT got a bunch of brand new features this week, like an all-new Voice Mode, group chats, shopping research, and more. In this video Igor reviews all the new ChatGPT features, plus he breaks down everything else from the week in AI you need to care about, like Meta SAM 3, Flux.2, Codex Max, and more. Enjoy!

Links:
🔑 Free ChatGPT Prompt Templates: https://bit.ly/newsletter-aia
💼 AI Advantage on LinkedIn: https://bit.ly/AIAonLinkedIn
🧑‍💻 Igor Pogany on LinkedIn: https://bit.ly/IgorLinkedIn
🐦 Twitter/X: https://bit.ly/AIAonTwitter
📸 Instagram: https://bit.ly/AIAinsta
https://x.com/OpenAI/status/1993381101369458763?s=20
https://chatgpt.com/
https://openai.com/index/chatgpt-shopping-research/
https://openai.com/index/group-chats-in-chatgpt/
https://bfl.ai/blog/flux-2
https://playground.bfl.ai/image/generate
https://ai.meta.com/blog/sam-3d/
https://www.aidemos.meta.com/segment-anything/editor/segment-video/?media_id=1507891820422074&template_id=1178631427507494
https://openai.com/index/gpt-5-1-codex-max/
https://www.claude.com/claude-for-excel

Chapters:
0:00 What’s New?
1:04 New ChatGPT Voice Mode
4:38 Matter by JetBrains
7:21 ChatGPT Shopping Research
8:52 ChatGPT Group Chats
10:33 Flux.2
11:41 Meta SAM 3
13:55 GPT-5.1 Codex Max
14:20 Claude For Excel

This video is sponsored by JetBrains. #ai #chatgpt #gemini

Table of contents (9 segments)

  1. 0:00 What’s New? 237 words
  2. 1:04 New ChatGPT Voice Mode 805 words
  3. 4:38 Matter by JetBrains 609 words
  4. 7:21 ChatGPT Shopping Research 340 words
  5. 8:52 ChatGPT Group Chats 376 words
  6. 10:33 Flux.2 260 words
  7. 11:41 Meta SAM 3 445 words
  8. 13:55 GPT-5.1 Codex Max 100 words
  9. 14:20 Claude For Excel 172 words
0:00

What’s New?

Another crazy week in the AI space. And I don't say that every single time, do I? Welcome to another week in AI. This week has been absolutely insane, and it's a very practical one, if I may say so myself. Releases are not slowing down. Seriously, there has been a competition going on between the biggest players to get the consumer's attention, and they're trying to outdo each other. Two weeks ago, we got GPT-5.1 and 5.1 Pro. Then we got Gemini 3 Pro. And this week we got Claude Opus 4.5, which I covered in a separate video. But then ChatGPT launched all of these features inside the app, which is their way of telling consumers: hey, we still have the best platform, and here's even more stuff that you will love. In other words, it's not just about the models. ChatGPT is the product you want, not just the model underneath it. And that's what we'll be looking at today. There are four new features in ChatGPT, there's a brand new video segmentation model from Meta, which is actually incredible and so much fun, and a few more stories. All in this week's episode of AI News You Can Use, the show that pulls together all the releases in the AI space. We select the best ones, and I get to tell you all
1:04

New ChatGPT Voice Mode

about it. ChatGPT finally did something that people maybe weren't even asking for, but it just makes so much sense, and it's an evolution of one of my personal favorite features: the voice mode. Concretely, the advanced voice mode. Now, just as a quick recap for anybody who doesn't know, there were really two voice functions in ChatGPT. One of them was just dictation, where you hit this button, you could speak into it, and it transcribed your speech right here. Just a different way of inputting things. I really love that, because then you can write at the speed of your thoughts. The second one is the voice mode. Traditionally, when you pressed this button, it went into a separate interface where you conversed with it. But none of that text actually went into the chat, and it behaved very differently from just interacting with a new chat. Now they changed this, and I'm going to try it for the first time here in this video. Let's try this. This should be a completely new experience, right? I need headphones, actually. Oh, no. Okay. Hey, what's up? — Hey, Igor. Not much. Just here and ready to help out. Sounds like you're getting settled in with your headphones and all. So, whenever you're good to go, just let me know what's on your mind. — Wait, hold up. Do you know about the headphones? I didn't tell you about the headphones. — Ah, fair question. So, I just picked up on that from the little audio snippet where you mentioned needing headphones. Nothing mysterious, just my ears at work. — But like, look at the conversation. I don't mention headphones. Are you listening in without me pressing the button? — No worries, Igor. I get why that sounds a bit spooky. Just to clarify, I can only pick up on what you say once you start talking to me. Earlier, you did mention needing headphones in the audio clip. — Yeah, earlier, but not right now. What do you mean? Oh my god. Wait, I need to end this. Look at that. It's not in the context. Did it get picked up in memories?
Ah, okay. When I pressed the button and realized I didn't have headphones on and then put them on, it caught that. And because I have the memory settings on, here under personalization, yeah, it references chat history. So that's how it knew. Okay, that was a bit spooky there for a second. But you can see it actually proves the point, and the big improvement here, in a big way. Everything is now transcribed just as if you were dictating, and you can have an interaction with it. Whereas before, if you pressed this voice mode and it was answering with its voice, you didn't get the text in here and you didn't get to continue the conversation. This is a great way to start a conversation and to seed it with context. You can talk into it. You can ask it questions. It responds, and all of it is text, which you can then use in any way, shape, or form. It engages with features like it did here, with memories and chat history enabled, or you can easily copy-paste it somewhere else. You can simply work with it, whereas before it just used to show three dots and you didn't see the full conversation that was actually had. This is amazing. I recommend to a lot of people, especially newer people, and especially people that did not grow up with technology, to start using ChatGPT through the voice mode. I always told them: just press dictate, talk into it, and it will talk back. Sometimes I recommended voice mode, but the thing is, with voice mode, it all got lost in the previous version. Now it doesn't get lost anymore. It's all there. You can work with it interactively using your voice. I think that's actually huge. I cannot underline enough how much of an annoyance it was that none of it was present. I personally haven't used this voice mode in probably a few months now. I use it to demo sometimes, but in everyday usage I never went to it.
I always used the microphone because I wanted the conversation to be explicitly here, so that I'd have a chance to actually make changes, branch it, edit it, whatever it might be. Now I can do that with voice. And honestly, for me, this is going to make more of a difference in my everyday usage of these apps than a coding model performing five or ten percent better. For everyday usage, a feature like this is huge. Let's see what's
4:38

Matter by JetBrains

next. Okay, so next up, I will show you how to literally talk to your codebase, even as a non-developer. And that is with Matter, a new AI development companion for product teams created by JetBrains, the sponsor of today's video. And let me tell you, this tool goes way beyond vibe coding little apps or web pages. It works in a way where you connect your entire repository, and then anyone on, for example, the product team can prototype on the real app that you're building as a company, on the actual codebase, without breaking anything. A lot of existing tools in this category have limited functionality. They only allow you to make little demos or mockups of pages. But Matter has been designed with real teams in mind from the get-go. So let's say you're a product designer or a product manager. The way this works is you can open Matter, connect your GitHub repository, and make changes to the application just by talking to Matter. This changes the currently live code and instantly previews the changes you make in the window on the right side. Let me show you. So, I recently made this project on the channel called the No Machine. It's an app that helps you say no to certain opportunities. And while it works well, I wanted to make changes on the visual design side. In fact, I would really like the color scheme to mirror the design principles on our website. You can see it right here. It's as simple as taking a screenshot of the website and sending it to Matter along with a prompt asking it to align the color scheme of my existing app, the No Machine, with the colors in this screenshot. This is what that would look like. All right. After you send that, Matter analyzes the screenshot and then immediately gets to work applying all of the changes. And there you go. That's already it. As you can see, the preview updates to show you what all the changes will look like in production if you choose to apply them. And by the way, this preview is fully functional.
You can test it all out and iterate on it as you work on the design and layout of this page. Heck, you could even create GitHub pull requests directly from this interface in Matter. And the good news is, all of this happens in an isolated environment, so you playing around is not going to change the live codebase yet. And as I mentioned earlier, the entire product is designed with teams in mind. So if somebody's experimenting and they want to see what a site would look like, they can do that, and they can send that preview to other team members for them to add their two cents. Matter seriously helps with the constant back and forth between the product team, the design team, and the developers. Specifically, both designers and product managers can build what they have in mind with just natural language, and then developers get the code with clear context and all the documentation they need to implement it themselves, without non-technical people interfering with their existing workflows. If that sounds interesting to you, check out Matter today and get ready to build and improve on real web apps and websites with the help of AI. You can check out the link at the top of the video's description to sign up for the recently opened early access. And thanks again to JetBrains for sponsoring this video. And now let's look at the next piece of AI news that
7:21

ChatGPT Shopping Research

you can use. And the next story is another ChatGPT feature. This one is called shopping research, and it's exactly what it sounds like: a new way to research shopping alternatives. You can find it in ChatGPT if you go to the plus menu, here under shopping research. You can now ask, I'll just say, wireless headphones under €200 in Portugal. And then it will open this custom interface that just focuses on finding you alternatives, and it researches products, which is an amazing thing for Black Friday. Whenever you find a deal, I recommend you use this to really research whether it's the deal that they advertise. You know, a lot of the tricks they use, with this you can really research them well. It's a variation of deep research, if you're already familiar with that, but this one is really focused on the shopping aspect, and it works surprisingly well. We tested the shopping features before. At that point in time, I remember we couldn't detect any bias in there toward specific websites or vendors. And while that is kind of hard to guarantee, because we don't see under the hood, it does a great job. And then there's a comparison table with some of your best alternatives. I mean, maybe this is not super comprehensive. I myself use the Bose ones and AirPods Pro, and it doesn't even list them. But to be fair, we told it to find wireless headphones under €200, and, aha, both of the headphones I listed actually cost more than €200. So this actually did a great job. And then if you have a specific product, you can just take that product and look for it with the shopping feature, and it will find you different vendors for it wherever you are, if you specify your country. Really useful right now for Black Friday and Cyber Monday. Hey, if you're enjoying these stories, make sure to subscribe to the channel. It really helps out. And now let's look into the
8:52

ChatGPT Group Chats

next story. Next up, there's a really fun one. They introduced group chats in ChatGPT. This came out late last week, but I didn't get around to showing you yet. So, let's do that. The way it works is basically: if you're in a new chat, you can press here, start a group chat, and then you can share the link with people. I did that with my team, and I started the conversation: generate 10 progressively clickbaity YouTube thumbnails for a video on new ChatGPT features like group chats and the updated voice feature. And then it started generating an image, but it didn't really do that. But Daniel, our production manager, followed up and said, "Hey, ChatGPT, do what he said." And it did it. There's nine progressively clickbaity ones, "mind-blowing updates" of course being the top one. So, how about this? Oh, look at that. Good stuff coming in. Gustaf, what's up? He's just chilling over here. — Hey guys. — And Gustaf is participating in the group chat because I gave him the link to it. And now he said, "Make it more extreme." And this is really the feature: you can share a link around and collaborate on a chat. A few weeks ago, they rolled out collaboration on projects, which is amazing. But this is really fun because it's not just for work purposes. You could also use it to just have fun with some colleagues, get a ChatGPT conversation going, and do something like this. Look at that. That is way more extreme, every single one of them. So I could follow up and now make them all aliens. And see, this way you can just go back and forth. Have fun with this. We could brainstorm ideas in here. Honestly, a great way to collaborate remotely: rather than talking about it in a Slack chat, you can just do this and have it generate images, outputs, and drafts of what you need, while everybody shares the context. And it's actually a really fun one. I was surprised myself. Insane update. Yeah, this is what YouTube thumbnails should look like.
Okay, I think you see the point. Group chats in ChatGPT,
10:33

Flux.2

ladies and gentlemen. So, the next story is rather quick, but it's Flux.2, Flux already being probably the most popular open-source image model out there. And this is version two, which can do things like extensive text. Now, it's still an open-weight model, but if you want to use the images for commercial purposes, you do have to get one of these licenses, depending on which one you want. So, Flux.2 dev costs 2K per month if you want to do 200,000 images. If you just want to try this out and use it yourself, you can go to their playground and generate images in here. And we did this for you and ran some of our test prompts. You can see it right here, Flux.2 Pro compared to some of the other best models. Now, here's the thing. All of the models got so good at these comparisons that this sheet is slowly but surely converging into a grid of images that are very similar. But I will say this: among the open models it stands out, though in terms of benchmarks it comes in a bit lower than Nano Banana Pro. But it has the ability to take multiple images and merge them into one, really making this viable for production workflows rather than just a random image that you want to generate. In other words, it's good at editing and compositing. And on the playground, you get 50 images for free, and it can do text and all of these other things that are really incredible
11:41

Meta SAM 3

actually. Let's see what's next. And the next one, honestly, this could have been a standalone video. This looks so fun. This is Meta's Segment Anything model. They had previous versions of this, but basically you can do a lot of things in here. You can create 3D scenes, bodies, you can do all of these effects. Some of them are just fun, like these bobbleheads. Others are visual, like these contour lines. And others are just useful, like automatically blurring faces and license plates. But at the core of all of this is a model that basically does visual effects work with the use of AI. It can select backgrounds and apply all these effects. Let's just try something. I mean, hm. Yeah, let's do the bobbleheads. Let's see. It has these preset videos. This man, probably dancing on these stairs. Okay. So, that's the raw video. Now, let's apply. Okay. All right. That's kind of a fun ending. So, I really want to try this with a video of my own. So, let's upload something. Okay. So, I have a little phone clip of me this summer doing a wakeboarding trick. It actually took forever to learn this thing. Let's upload it and try this effect on it, and a few more things. Again, you can upload your own videos. This is free to use, and you don't need any video editing knowledge. It's kind of just fun. It's preparing the video. Um, I'm not really doing anything, just really want to highlight that. We're going to apply the template and see what this looks like. Okay. Yeah, that actually tracked it so well. No. What? Even through the flip. I mean, okay, it's not perfect perfect, but come on. That's so hard to do. I'm impressed. Okay, maybe let's try one more effect with this. But I think you're starting to see the point here. Okay, I want to try the clone squad. Finds the person. That's the video. Then let me apply the template. Okay, this seems to have worked. Whoa. What? I mean, that's like a really hard task. No, I mean, I can only imagine if I were to select this myself, like here with the water.
This is really good. It's so hard to select an object with all of these edges. How is it so good? — Mhm, so good. — Keep in mind, this is the real one, so you would need to cut all of this out manually, and it just got it. Okay, not bad. Really fun. Go and play with this. You can upload your own videos and, yeah, share these with friends with the click
13:55

GPT-5.1 Codex Max

of a button. What a great app. And for this week's quick hits, we actually have just two, but they're both great. One of them is GPT-5.1 Codex Max. That is a mouthful. Available through the API, that is their best coding model, competing with Opus. That, again, as I mentioned in the intro, we created a separate video for. So if you want to hear about the world's best coding model, in terms of benchmarks but also vibes, as many people say, you can check out that video. But this is their API version to
14:20

Claude For Excel

compete with that. And then we also have Claude for Excel, coming out of early access for all beta testers. So if you use Excel regularly, you can install it now. I didn't have the chance to really test this yet, but with the paid plans you can access Claude working within your Excel sheets, and you should be able to use the new models in there already. Sonnet 4.5 was the best model to generate pretty and functional Excel sheets with. Now Opus raises that bar, and you get it as a native extension that has just been built for Excel. So, if you work with Excel, I strongly recommend you go check this out. We'll try it out. But yeah, it's finally out of early access. And that's pretty much everything I got for you in this week's episode. I hope you found something that was interesting or inspiring to you. My name is Igor Pogany, and as per usual, I hope you have a wonderful
