DLSS 5 Explained Clearly In 8 Minutes (How It Actually Works)
Duration: 8:04

TheAIGRID · 18.03.2026 · 4,473 views · 175 likes


Video description
🌐 Subscribe To My Newsletter - https://aigrid.beehiiv.com/subscribe
Get your Free AGI Preparedness Guide - https://theaigrid.kit.com/agi
🎓 Learn AI In 10 Minutes A Day - https://www.skool.com/theaigridacademy
🐤 Follow Me on Twitter https://twitter.com/TheAiGrid
Links From Today's Video: https://www.youtube.com/watch?v=MhLWH18vXH4
Welcome to my channel where I bring you the latest breakthroughs in AI. From deep learning to robotics, I cover it all. My videos offer valuable insights and perspectives that will expand your knowledge and understanding of this rapidly evolving field. Be sure to subscribe and stay updated on my latest videos. Was there anything I missed? (For Business Enquiries) contact@theaigrid.com
Music Used:
LEMMiNO - Cipher https://www.youtube.com/watch?v=b0q5PR1xpA0 (CC BY-SA 4.0)
LEMMiNO - Encounters https://www.youtube.com/watch?v=xdwWCl_5x2s
#LLM #Largelanguagemodel #chatgpt #AI #ArtificialIntelligence #MachineLearning #DeepLearning #NeuralNetworks #Robotics #DataScience

Table of contents (2 segments)

Segment 1 (00:00 - 05:00)

On March 7th, 2026, Nvidia showed off a demo of Resident Evil Requiem in which a character named Grace Ashcroft appears on screen, and the internet immediately noticed something weird. Her lips were fuller, her cheekbones were sharper, and her makeup had changed. Nvidia didn't update the game. They'd run it through an AI model, and within minutes the phrase "AI slop filter" was trending. So, let's explain it.

This is DLSS 5, the technology that Nvidia's CEO, Jensen Huang, is calling, and I'm quoting him directly, "the GPT moment for graphics." It was unveiled at GTC 2026. It's supposed to be the most significant breakthrough in computer graphics since real-time ray tracing in 2018, but the internet thinks it's an Instagram beauty filter for video games. So what is DLSS 5 actually doing? Is this genuinely the future of how video games will look, or has Nvidia just built a very expensive yassify button?

DLSS 5 is not just upscaling, and it's not frame generation. It's a neural rendering model that takes each frame your game engine produces, reimagines the lighting and materials, and tries to make them look photorealistic. It's the first time Nvidia has used AI not to boost performance, but to fundamentally change what a game looks like. And that's exactly why it's so controversial.

So how does this actually work? Let's dive into the technology to figure out what Nvidia is doing here. Every previous version of DLSS was about efficiency. DLSS Super Resolution renders at a lower resolution and uses AI to upscale. Frame Generation creates interpolated in-between frames to boost smoothness. Both are about making the same game run faster, but DLSS 5 does something fundamentally different.

Here's the DLSS 5 pipeline. The game engine renders a frame normally, all the geometry, textures, and lighting, the whole deal, and then it outputs two things.
The color buffer, which is the actual image you'd see, and the motion vectors, which track how every pixel is moving between frames. DLSS 5 takes both of these as input, and a neural rendering model trained end to end on Nvidia supercomputers analyzes the frame. And this is the key part: the model doesn't just look at the pixels. Nvidia says it understands the entire scene semantically. It recognizes the characters, the fabrics, the hair, the translucent skin. It reads the environmental lighting, whether the scene is front-lit, back-lit, or overcast, all from a single frame. And then it generates a new version of that frame with what it believes photoreal lighting and materials should look like.

Think of it this way. Path tracing tells you where the light should go: the accurate positions of shadows, reflections, and bounce light. DLSS 5 takes that a step further and tries to make those lighting interactions look the way they would in reality. Subsurface scattering on skin, where light passes through and gives flesh that warm, translucent quality. The delicate sheen on fabric. The way individual strands of hair catch and scatter light. These are effects that even modern ray tracing struggles to do convincingly in real time because they demand enormous ray budgets. DLSS 5 tries to infer them with the AI model instead. The output is anchored to the source 3D content. It's not generating new geometry or replacing textures. It's overlaying a neural rendering pass on top of what the game engine already produced, keeping the structure and motion of the original scene intact while transforming how light behaves across surfaces.

At GTC, Nvidia demoed this across several games, including Hogwarts Legacy, Starfield, and Oblivion Remastered. But the big thing to understand here is that the GTC demo required two RTX 5090 graphics cards: one to actually render the game and one dedicated entirely to running the DLSS 5 neural model.
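The per-frame flow described above can be sketched in a few lines. Everything here is hypothetical: Nvidia has not published a DLSS 5 API, so the function names and buffer shapes below are stand-ins that only illustrate the data flow the transcript describes (the engine renders normally, outputs a color buffer plus motion vectors, and a neural pass re-lights the frame without changing its structure):

```python
import numpy as np

# Hypothetical stand-ins: no public DLSS 5 API exists. These stubs only
# illustrate the data flow, not any real implementation.

def engine_render(h, w):
    """The game engine renders a frame normally and outputs two buffers."""
    color_buffer = np.random.rand(h, w, 3)    # the image you'd actually see
    motion_vectors = np.random.rand(h, w, 2)  # per-pixel motion between frames
    return color_buffer, motion_vectors

def neural_relight(color, motion, prev_state=None):
    """Stand-in for the neural rendering pass: same-shaped output, only the
    lighting/material response changes (here a trivial placeholder tweak)."""
    relit = np.clip(color * 1.05, 0.0, 1.0)   # placeholder "re-lighting"
    state = (color, motion)                   # temporal state for the next frame
    return relit, state

def dlss5_frame(h=4, w=4, prev_state=None):
    """One iteration of the per-frame pipeline described in the transcript."""
    color, motion = engine_render(h, w)
    relit, state = neural_relight(color, motion, prev_state)
    # Output stays anchored to the engine's frame: no new geometry, same shape.
    assert relit.shape == color.shape
    return relit, state
```

The key property the transcript emphasizes is in the final assertion: the neural pass is an overlay on the engine's own frame, not a replacement for it.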
Each of those cards starts at around $2,000, and Nvidia says the plan is to optimize it down to a single GPU by the full launch. But right now, this is a two-GPU technology running on $4,000 worth of hardware.

Now we need to get into the controversy, where the backlash was almost immediate and fierce. Why are people upset about this? PC Gamer's Tyler Wilde watched Nvidia's demo frame by frame and concluded that Grace Ashcroft's face in Resident Evil Requiem wasn't just lit more realistically: the facial features had been structurally altered. She had fuller lips and sharper cheekbones, and he called it an apparent bias for certain beauty standards trained into the AI model. The word that stuck: yassification, as in the meme where AI tools make everyone look like a filtered Instagram model. Will Smith (not the actor, but the co-founder of Tested) posted on Bluesky: "Nvidia: what if we introduce ray tracing so you can have really high quality real-time lighting? Also Nvidia: what if we throw out that lighting to run the yassify filter so everyone looks hot?"

The deeper concern goes beyond individual faces. For years, DLSS was purely additive. It made games run better without changing what they looked like. That was the promise, and DLSS 5 breaks that promise explicitly. Nvidia pushed back fast. They pointed out that developers have full artistic control: intensity sliders, color grading, and per-region masking through Streamline, letting artists decide exactly where and how DLSS 5's effects are applied. Bethesda stated

Segment 2 (05:00 - 08:00)

publicly that DLSS 5 support in Starfield and Oblivion Remastered is entirely under its artists' control. And critically, DLSS 5 is toggleable: if you don't like it, you can simply turn it off. But Digital Foundry raised another concern. Because DLSS 5 integrates through Streamline, the same framework modders already use to swap DLSS versions, it's likely that enthusiasts will force DLSS 5 onto games that were never designed for it. The official developer controls won't matter if modders bypass them entirely.

Now, the backlash was immediate, but you have to understand that this is not an isolated incident. It's part of a much larger shift the entire industry is moving towards; DLSS 5 is not just one experiment. At CES 2026, Nvidia's Jensen Huang said outright, "The future is neural rendering." And Google DeepMind has already demonstrated Genie 3, a world model that generates entire interactive 3D environments from text prompts in real time at 24 frames per second. No game engine, no hand-placed assets, pure neural generation, though it's currently limited to a few minutes of consistency. But if we look at the trajectory, I think it's pretty clear. DLSS 5 sits at the halfway point on that trajectory. It does not replace the game engine: the game still renders every polygon, every texture, every shadow, and DLSS 5 layers a neural enhancement on top. It's a hybrid: hand-crafted rendering plus generative AI. And if Nvidia is right that the model will keep improving the way DLSS upscaling has over the past six years, the early versions we're seeing now could look like DLSS 1.0's blurry mess compared to what ships in two to three years.

The supported game list already tells you the industry is taking this seriously. You've got Bethesda, Capcom, Ubisoft, Tencent, Warner Bros. Games.
All of these companies are on board, and over a dozen titles are announced for DLSS 5 at launch this fall, including Assassin's Creed Shadows, Phantom Blade, Resident Evil, and Starfield. These aren't indie experiments. These are tentpole releases from the biggest publishers in gaming.

Now, if you're wondering why DLSS 5 matters now, it's because we are in March 2026. The AI backlash is at full volume. Artists are fighting generative AI in courts, in studios, and on social media. The gaming community has spent years pushing back on AI-generated content in the games we play. And in this environment, Nvidia walks on stage and announces what many people see as an AI filter that literally changes how game characters look, and calls it the GPT moment for graphics. The timing could not be worse for Nvidia's messaging.

The technology itself is real, and by most hands-on accounts it's genuinely impressive when applied well. PC Mag's reviewer called it "the most lifelike gaming graphics I've ever encountered," and Tom's Hardware called it exciting while cautioning about the face problem. Even the harshest critics acknowledge that the environmental lighting improvements, the rim lighting, the material sheen, the subsurface scattering, all represent a legitimate leap. The real battle isn't over whether neural rendering works. It clearly does. It's over who gets to control what games look like: the developer, the player, or the AI.
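The artist controls discussed earlier, an intensity slider plus per-region masking, reduce conceptually to a weighted blend between the engine's frame and the neural pass. This is a hypothetical sketch (Nvidia has published no such API; the function and parameter names are invented for illustration), but it shows why the controls matter: intensity zero is the off toggle, and a mask with zeros over a character's face excludes the face entirely:

```python
import numpy as np

def apply_dlss5_controls(original, relit, intensity=1.0, region_mask=None):
    """Blend the engine's frame with the neural pass under artist control.

    Hypothetical sketch of the controls described in the transcript:
    `intensity` is the slider (0.0 = effect off, 1.0 = full effect) and
    `region_mask` is a per-pixel weight in [0, 1] (e.g. 0.0 over faces
    to exclude them from the neural re-lighting).
    """
    if region_mask is None:
        region_mask = np.ones(original.shape[:2])  # allow the effect everywhere
    # Combine slider and mask, then broadcast the per-pixel weight over RGB.
    weight = np.clip(intensity * region_mask, 0.0, 1.0)[..., None]
    return (1.0 - weight) * original + weight * relit
```

With `intensity=0.0` the function returns the original frame unchanged, which is the "just turn it off" toggle; a face-shaped zero region in `region_mask` is the per-region control that would have left Grace Ashcroft's features untouched while still re-lighting the environment.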
