WOW! OPENAI RELEASED OPEN SOURCE MODELS! HERE'S WHAT IT MEANS FOR YOU! (GPT-OSS Guide)
Duration: 7:41


Alex Finn · 05.08.2025 · 6,301 views · 281 likes · updated 18.02.2026
Video description
OpenAI finally did it! They released TWO new open weights models. Here's what it means for you and how to set them up! https://ollama.com/ Follow my X: https://x.com/AlexFinnX Sign up for my free newsletter: https://www.alexfinn.ai/subscribe My $300k/yr AI app: https://www.creatorbuddy.io/ 0:00 intro 0:32 What these models are 1:49 Why this is important 4:32 How to set this up

Table of contents (4 segments)

  1. 0:00 intro 91 words
  2. 0:32 What these models are 239 words
  3. 1:49 Why this is important 538 words
  4. 4:32 How to set this up 709 words
0:00

intro

Massive breaking news from OpenAI that will change the industry forever: they have released two new open-weights models. They're finally living up to their name, OpenAI. In this video, I'm going to explain why OpenAI releasing open-weights models is so important and changes everything, what it means for you, and how you can run and customize these models on your own laptop right now. Let's get into it. OpenAI just released these two new open-weights models, gpt-oss-120b and gpt-oss-20b. And this is pretty
0:32

What these models are

incredible. These are actually great models that you can download and run locally on your computer today. They're built for agentic tasks, which means they're great at running tools, so they can search the web very easily. They have a completely open chain of thought, which is incredible. Almost no other models are truly transparent about their chain of thought; they usually censor it. These are completely open chain-of-thought models. They're also fully customizable. All open-weights models can be downloaded to your computer and customized any way you want, which gives you a level of control you've never had with models on the web. From a performance perspective, check this out, it's amazing. Their smaller model, the 20B, which can run on almost any modern device, is roughly comparable with o4-mini, and o4-mini is a fantastic model. Their larger model, the 120B, does need a more modern device, probably something like a 60 GB memory laptop or desktop, and it's comparable with o3, which in my opinion is the greatest AI model I've ever used in my life. So they're not holding back by open-sourcing some old models; these are cutting-edge, modern models you can run locally on your computer right now, which I'll show you how to do in a second. So, why is
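As a rough rule of thumb for picking between the two variants, you can go by available memory. This is just a sketch; the cutoffs below are my own assumptions loosely based on the hardware mentioned in the video, not official guidance from OpenAI or Ollama:

```python
def choose_model(ram_gb: float) -> str:
    """Pick a gpt-oss variant based on available memory.

    The ~16 GB and ~60 GB thresholds are rough assumptions drawn from
    the video: 20B runs on most modern machines, while 120B wants a
    high-memory workstation (e.g. a 64 GB+ Mac Studio).
    """
    if ram_gb >= 60:
        return "gpt-oss:120b"   # comparable to o3, per the video
    if ram_gb >= 16:
        return "gpt-oss:20b"    # comparable to o4-mini
    return "too little memory for either model"

print(choose_model(16))   # a typical modern laptop -> gpt-oss:20b
print(choose_model(96))   # a maxed-out Mac Studio  -> gpt-oss:120b
```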
1:49

Why this is important

this so important? Why is releasing open-weights models such an important thing to do in AI, and why is it going to change the industry forever? Up until now, almost every American-made AI model has been closed source. Meta has released a few open-weights models, but they really weren't that good or cutting edge, so not many people are using them. This is the first time a cutting-edge model you can download and run locally has been released as open weights. Chinese models have been open weight for a long time now, which is great for them. But if you're an American, you want America to win the open-source race. So why is open weight so important? Why is it such a good thing that we're getting open models? Imagine this: you're hungry and you need to eat dinner. With closed-source models, your only option is going to a restaurant, sitting down, and getting handed a meal. You can never see the kitchen. You can never know the ingredients in your meal. And that's the only way you can eat. With open-weights models, you can instead make dinner at home, by yourself. Customize it any way you want. Use any ingredients you like. You understand exactly what you're eating. So, what are the advantages of eating dinner at home? Let me tell you three big ones: cost, privacy, and control. When you're using closed-source models on the internet, you're paying for every single use, because you're using their servers, their GPUs, and making API calls to their models on the web. When you run locally, there is zero cost. Everything runs on your computer. The only cost is the energy your computer uses. So you're saving tons and tons of money by running your models locally, and you basically have unlimited usage. It's also completely private.
When you use closed-source models online, every prompt, every PDF, every document, anything you send is stored on those companies' servers, and they have access to it. In fact, and not many people know this, if the government were to subpoena those records, they could get your complete chat history. That means nothing you do with AI online is private. Everything you say, every weird thing you say to AI (I know you're saying weird things, you freak), the companies and the government can see it. With the model running locally on your computer, everything is private. You don't even have to connect to the internet. And lastly, control. You can customize these models any way you want, even make them uncensored, so they'll say any weird thing you want them to say. You have 100% control over the outputs you get from these models, which is amazing as well. Great for customization if you're into that. So, let's go hands-on. Let's demo it. I'm going to show you how you can download these models now and start using them locally immediately. So, the easiest way to
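The cost argument above is easy to make concrete with some back-of-the-envelope arithmetic. All the prices below are hypothetical placeholders, purely to illustrate the metered-API-versus-local trade-off; real API rates and electricity prices vary:

```python
def cloud_cost(tokens: int, usd_per_million: float = 2.0) -> float:
    """Cost of metered API usage at a hypothetical $/1M-token rate."""
    return tokens / 1_000_000 * usd_per_million

def local_cost(hours: float, watts: float = 100, usd_per_kwh: float = 0.15) -> float:
    """The only real marginal cost of a local model: electricity."""
    return watts / 1000 * hours * usd_per_kwh

# 10M tokens a month through a metered API vs. ~50 hours of local inference
print(f"cloud: ${cloud_cost(10_000_000):.2f}")  # $20.00 at the assumed rate
print(f"local: ${local_cost(50):.2f}")          # $0.75 of electricity
```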
4:32

How to set this up

download and run these models locally is through Ollama. Go to ollama.com. I'll put the link down below so you can get it now. Download it and you're good to go. Once you've installed it, this is what you're going to see: the Ollama window. All you need to do now is go to the drop-down menu with all the models in it and choose gpt-oss:20b. If you're on basically any normal laptop or computer, you want 20B; you'd choose 120B if you're on a sick $10,000 Mac Studio, which I plan on getting soon so I can test this out. You click that, and once you do, you can start sending messages. You can send any message you want, like with a normal AI. So I'd say, "Hey there." I hit enter. The first time you do this, it's going to download the model for you locally. So you choose the model, you send a message, and it starts downloading the model locally. This could take anywhere from 30 seconds to a full minute. After that, your first couple of prompts are going to take a little longer, because it has to actually load the model and get it up and running. But once you do that, you're good to go. All of these conversations are now happening locally on my computer. Nothing is going to the internet. Nothing's being stored on servers. Nothing's being sent to the government. Nothing's being read by any employees of these companies. It's all running locally on your computer, off your own chips. And this is unlimited. You don't have to pay for any of this. If I wanted to hook this up to Cursor, with Cline inside of Cursor, I could do that and have unlimited code generation. So, I asked what the latest AI news is, and this is incredible. Take a look at this. You can see the full chain of thought, everything it thought about to get these answers, which you cannot see on other models.
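The same chat you type into the Ollama window can also be driven from a script, because Ollama serves a local HTTP API while it's running (on port 11434 by default). Here's a minimal sketch using only the standard library; the model name and prompt are just examples, and nothing in this request leaves your machine:

```python
import json
import urllib.request

# Build a request for Ollama's local /api/generate endpoint.
payload = json.dumps({
    "model": "gpt-oss:20b",   # or "gpt-oss:120b" on a high-memory machine
    "prompt": "Hey there",
    "stream": False,          # one complete response instead of chunks
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])
except OSError:
    # Connection refused etc. -- Ollama isn't running yet.
    print("Ollama isn't running locally; start the app first.")
```

Just like in the app, the first request after choosing a model is slow while the weights download and load; after that it responds immediately.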
Other models give you rundowns of what they're thinking, but those are heavily censored. This is the full, uncensored chain of thought. Then it got the full answers, and they're up to date: it's able to get the latest news from the internet and give it to us, and these are really good answers, all done locally. No one has access to this. No other company can see it, which is incredible. There are so many advantages to being able to do this. You're going to save on costs. You're going to be able to customize the kinds of answers you get. And you're going to be able to do things locally that you were never able to do before. If I wanted to set this up so it's running locally and controlling my computer 24/7, writing articles, doing research, I could do that, because that's now economically viable since it's local. And I'm going to do that; I'm going to record another video showing you how to set this up to run your computer. There are so many things you can do now that you have local models running. Hopefully this is something other companies in America follow as well, and they start open-weighting their models. It's going to give so much freedom to people who want to use AI. And on top of that, you now have the open-source community working on your AI models too, making them better. There are so many advantages to being able to do this. Massive day. Shout out to OpenAI for finally doing this. They're putting the open back into OpenAI. In my next videos, I'm going to show you all the cool things you can do with local models. So make sure to leave a like and subscribe to see those, turn on notifications so you get them the moment they come out, and I'll see you in the next one.
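For hooking the local model up to coding tools, Ollama also exposes an OpenAI-compatible endpoint under `/v1`, so anything that speaks the OpenAI chat API can be pointed at `http://localhost:11434/v1` with any placeholder API key. A sketch of what such a tool sends under the hood (the message content is just an example):

```python
import json
import urllib.request

# An OpenAI-style chat completion request aimed at the local Ollama server.
body = json.dumps({
    "model": "gpt-oss:20b",
    "messages": [{"role": "user", "content": "What's the latest AI news?"}],
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer ollama",  # key is ignored locally
    },
)

try:
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
        print(reply["choices"][0]["message"]["content"])
except OSError:
    print("Start Ollama first; the endpoint only exists while it's running.")
```

This is why editor integrations work with zero per-token cost: they're just making the same API calls they'd make to a cloud provider, but to your own machine.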
