# How does AI understand your prompts? #AI #NLP #MachineLearning #ArtificialIntelligence #Embeddings

## Metadata

- **Channel:** Cognitive Class
- **YouTube:** https://www.youtube.com/watch?v=dNXdAazTXf8
- **Source:** https://ekstraktznaniy.ru/video/32700

## Transcript

### Segment 1 (00:00 - 01:00)

Have you ever wondered how AI understands the prompts that you give it, whether in English or one of the 80-plus languages that ChatGPT supports? It starts with something called embeddings. When you type a prompt, the AI breaks it down into tokens. These are small chunks of text, like whole words or even parts of a word. Each token is then converted into a vector, which is a list of numbers that represents its meaning based on how it was used during training. These are called word embeddings. You can imagine embeddings as points on a map: words with similar meanings appear close together, like dog and puppy or king and queen.

But embeddings alone don't capture the full meaning. Let's say you write "dog bites man". No big deal. But if you write "man bites dog", that's unexpected and very weird. Even though the same words are used, just in a different order, the meaning completely changes.

Transformers take those word embeddings and make them more useful. They use positional encoding to keep track of word order and attention to figure out which words matter the most. Models like ChatGPT predict the next word in a sentence, while models like BERT predict a missing word. This helps them understand context, which is especially useful for search engines. For example, if you use the word bank, the model can tell whether you're referring to a financial institution or a riverbank based on the surrounding words.

Once the model understands what you're trying to say, it generates a reply by predicting the most likely next token again and again until you get a full response. And that's how AI understands your prompt in everyday language. If you're curious how AI generates responses and why they might sound a bit different from human-written text, follow along as we'll explore that in an upcoming video. And if you want to learn how embeddings work hands-on, check out Cognitive Class's Guided Projects.
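The tokenize-then-embed step described above can be sketched in a few lines. This is purely illustrative: the vocabulary, the tokenizer (a whitespace split), and the random vectors are all invented here; real models use learned subword tokenizers and trained embedding matrices.

```python
import numpy as np

# Hypothetical token-to-id table; real systems use subword tokenizers (e.g. BPE),
# so a single word may become several tokens.
vocab = {"dog": 0, "bites": 1, "man": 2}

# Hypothetical embedding matrix: one row (a list of numbers) per token.
# In a trained model these rows encode meaning; here they are random.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))

def embed(prompt: str) -> np.ndarray:
    """Break a prompt into tokens, then look up each token's vector."""
    token_ids = [vocab[word] for word in prompt.split()]
    return embeddings[token_ids]

vectors = embed("dog bites man")
print(vectors.shape)  # (3, 4): three tokens, each represented by a 4-number vector
```

Each row of `vectors` is one token's word embedding, ready to be handed to the rest of the model.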
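The "points on a map" idea is usually measured with cosine similarity. Below is a minimal sketch using hand-made 2-D vectors chosen so that dog and puppy point in nearly the same direction; real embeddings have hundreds of learned dimensions, and these toy values are invented for illustration.

```python
import numpy as np

# Invented 2-D "embeddings": dog and puppy deliberately point the same way.
toy = {
    "dog":   np.array([0.9, 0.1]),
    "puppy": np.array([0.8, 0.2]),
    "bank":  np.array([0.1, 0.9]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity: 1.0 means same direction (very similar meaning),
    # values near 0 mean the vectors are unrelated.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(toy["dog"], toy["puppy"]))  # close to 1: near each other on the "map"
print(cosine(toy["dog"], toy["bank"]))   # much smaller: far apart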
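Positional encoding is what lets the model tell "dog bites man" apart from "man bites dog". One classic scheme (the sinusoidal encoding from the original Transformer paper) assigns each position a distinct vector that gets added to the word embedding; the sketch below computes it, with toy sizes chosen for readability.

```python
import numpy as np

def positional_encoding(num_positions: int, dim: int) -> np.ndarray:
    """Sinusoidal positional encoding: one distinct vector per position."""
    pos = np.arange(num_positions)[:, None]          # 0, 1, 2, ...
    i = np.arange(dim)[None, :]                      # dimension index
    angles = pos / np.power(10000.0, (2 * (i // 2)) / dim)
    # Even dimensions use sine, odd dimensions use cosine.
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

pe = positional_encoding(num_positions=3, dim=4)
# The same word embedding plus a different position vector yields a
# different input, so word order is no longer lost.
print(pe[0])  # encoding for position 0
print(pe[1])  # encoding for position 1 (different from position 0)
```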
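"Attention to figure out which words matter the most" can be sketched as scaled dot-product self-attention: every token's output becomes a weighted mix of all token vectors, and the weights say how much each word attends to the others. The shapes and random inputs here are assumptions for illustration, not a real model's values.

```python
import numpy as np

def softmax(x: np.ndarray, axis: int = -1) -> np.ndarray:
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # subtract max for stability
    return e / e.sum(axis=axis, keepdims=True)

def attention(q: np.ndarray, k: np.ndarray, v: np.ndarray):
    """Single-head scaled dot-product attention."""
    scores = q @ k.T / np.sqrt(q.shape[-1])  # how relevant each token is to each other
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ v, weights              # output: weighted mix of value vectors

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))      # three token vectors of size 4
out, w = attention(x, x, x)      # self-attention: tokens attend to each other
print(w.sum(axis=-1))            # each row of attention weights sums to 1
```

This is also how the model resolves the bank example: the vector for "bank" ends up mixed with its surrounding words, so "river" context and "money" context produce different results.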
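Finally, the generation step, predicting the most likely next token again and again, can be sketched as a simple loop. The "model" here is a made-up lookup table standing in for a real language model's probability distribution, so the structure of the loop is the point, not the table.

```python
# Invented next-token table for illustration; a real model would compute
# a probability distribution over its whole vocabulary at each step.
next_token = {
    "<start>": "dog",
    "dog": "bites",
    "bites": "man",
    "man": "<end>",
}

def generate(start: str = "<start>") -> list[str]:
    """Repeatedly pick the most likely next token until a stop token."""
    out = []
    tok = next_token[start]
    while tok != "<end>":
        out.append(tok)
        tok = next_token[tok]
    return out

print(generate())  # ['dog', 'bites', 'man']
```

Swapping the table for a trained model (and sampling instead of always taking the single stored continuation) is essentially how a full response gets produced token by token.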
