What NVIDIA's Neural Rendering Means for PC Graphics

When NVIDIA CEO Jensen Huang took to the stage in Las Vegas’ 12,000-seat Michelob Ultra Arena to lay out the company’s vision for its next-generation GPUs at CES 2025, he proclaimed that “AI is coming home to GeForce.”
He wasn’t kidding. The RTX 50-series will support a buffet of all-new AI features, including DLSS 4 with multi-frame generation, a new model architecture that improves DLSS image quality, and a host of features called Neural Shaders. NVIDIA says that up to 15 out of every 16 pixels rendered by the 50-series will be AI-generated.
That’s a big leap forward for neural rendering, but not a huge surprise. In 2022 I spoke with NVIDIA’s VP of applied deep learning, Bryan Catanzaro, about neural rendering’s future. At that time, NVIDIA cards could already render up to seven out of every eight pixels with AI, but that was just the start. “I think that we’re going to see a lot of neural rendering techniques that are even more radical,” Catanzaro said.
With the RTX 50-series, those “even more radical” techniques are here.
What is neural rendering?
Traditional graphics rendering uses complex algorithms to determine the color of each pixel in every frame you see, and it executes those algorithms in real time. The most common example is the rasterization of 3D graphics, which converts potentially millions of 3D polygons into a flat 2D frame with convincing perspective and lighting.
Neural rendering is a different approach. Instead of converting 3D polygons into a 2D frame and attempting to calculate all the variables that might influence every pixel, a neural network is trained to predict what the finished frame should look like when given a specific input (in this case, a natively rendered frame from your favorite game).
It’s still complex, of course, and training the AI model requires a whole datacenter. But the benefit for gamers is that neural rendering requires less real-time compute than native rendering.
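To make that concrete, here’s a minimal sketch (in PyTorch) of the kind of network a neural upscaler is built around: a model trained offline to turn a low-resolution frame into a plausible higher-resolution one. This is a toy for illustration only; NVIDIA’s actual DLSS models are proprietary, far larger, and consume extra inputs like motion vectors and depth.

```python
# Toy illustration of neural upscaling -- NOT NVIDIA's DLSS architecture,
# which is proprietary. Layer names and sizes are arbitrary choices.
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    """Maps a low-resolution RGB frame to a 2x higher-resolution frame."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),      # extract features
            nn.ReLU(),
            nn.Conv2d(32, 3 * 4, kernel_size=3, padding=1),  # predict 2x2 sub-pixels
            nn.PixelShuffle(2),                              # rearrange into a 2x larger image
        )

    def forward(self, low_res_frame):
        return self.net(low_res_frame)

model = ToyUpscaler()
low_res = torch.rand(1, 3, 540, 960)   # one 960x540 RGB frame
high_res = model(low_res)              # -> shape (1, 3, 1080, 1920)
print(high_res.shape)
```

Once a network like this is trained, producing the high-resolution frame is a single fast inference pass, which is the efficiency win the article describes.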
DLSS 4 turns up the dial on frame generation
This isn’t speculation. DLSS 3’s frame generation proved that a GPU designed for neural rendering can deliver frames more efficiently than one focused on native rendering. The NVIDIA RTX 4070, for example, already delivers excellent frame upscaling and generation.
DLSS 4 on the RTX 50-series will take NVIDIA’s neural rendering advantage up several notches. While DLSS 3 could add one generated frame after each native frame, DLSS 4 can add three generated frames. It also generates frames with a new type of AI model that should deliver even better image quality.
Framerate scaling isn’t perfectly linear with the number of generated frames, but it comes close, and NVIDIA makes some staggering claims. In demos, games run up to eight times faster with DLSS 4 than with no DLSS at all (keep in mind that figure includes both frame generation and upscaling). In games that support DLSS 4, NVIDIA’s RTX 50-series will have a huge advantage.
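The pixel math behind NVIDIA’s claims is simple, assuming it is based on DLSS’s Performance mode, which renders each frame at one-quarter of the output resolution (my assumption about how NVIDIA arrives at its figures):

```python
# Back-of-the-envelope check of NVIDIA's pixel claims. Assumes DLSS
# Performance mode, which renders frames at 1/4 of the output resolution
# (an assumption about how NVIDIA arrives at these figures).
from fractions import Fraction

upscaled = Fraction(1, 4)  # fraction of each frame's pixels rendered natively

# DLSS 3: one generated frame per native frame -> 1 of every 2 frames is native
dlss3_native = upscaled * Fraction(1, 2)
# DLSS 4: three generated frames per native frame -> 1 of every 4 frames is native
dlss4_native = upscaled * Fraction(1, 4)

print(f"DLSS 3: {1 - dlss3_native} of pixels AI-generated")  # 7/8
print(f"DLSS 4: {1 - dlss4_native} of pixels AI-generated")  # 15/16
```

That 15/16 native fraction is exactly where the “15 out of every 16 pixels” figure comes from, and the same math yields the seven-of-eight figure Catanzaro cited for the DLSS 3 era.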
It’s not just about framerates
When I spoke to him back in 2022, NVIDIA's Catanzaro told me that neural networks can learn to render shadows, reflections, and other complex physical interactions without specific instructions on how to do it.
That has huge implications for graphics rendering. In traditional rendering, every new feature represents a huge feat of software engineering, because the code responsible for producing each effect must be explicitly written and tuned by hand.
That’s why most game studios have abandoned in-house engines in favor of Unreal or Unity. Programming computers to create ever-more-accurate graphics is difficult, time-consuming, and expensive.
Neural rendering offers an alternative path. Instead of building or buying sophisticated graphics engines that try to accurately simulate every pixel on the screen, developers can concentrate on natively rendering a smaller amount of what’s displayed and then use neural rendering to pick up the slack.
For NVIDIA, this means developing neural shaders for the RTX 50-series that concentrate on specific areas, like RTX Neural Materials and RTX Neural Faces. RTX Neural Materials, for example, can compress detailed material data, making it possible to deliver extremely high-quality materials at a fraction of the memory previously required. RTX Neural Faces, meanwhile, can generate character faces at a level of fidelity that would otherwise require thousands of hours of painstaking work.
It’s still early days for these features, but as they mature, they could make it possible for smaller teams to achieve results normally reserved for big-budget projects.
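NVIDIA hasn’t published implementation details for these shaders, but the general idea behind neural material compression, as described in graphics research, is to replace a large stack of texture maps with a tiny network queried per pixel. Here’s a toy sketch of that concept; the names, sizes, and output layout are my own illustrative choices, not NVIDIA’s:

```python
# Toy sketch of the concept behind neural material compression: a small
# network stands in for a large stack of texture maps, returning material
# properties for any surface coordinate on demand. Purely illustrative --
# not NVIDIA's RTX Neural Materials implementation, which is unpublished.
import torch
import torch.nn as nn

class ToyNeuralMaterial(nn.Module):
    """Maps a 2D surface coordinate (u, v) to material properties."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 7),  # e.g. RGB albedo + roughness + normal (xyz)
        )

    def forward(self, uv):
        return self.net(uv)

material = ToyNeuralMaterial()
uv = torch.rand(4096, 2)   # a batch of surface coordinates
props = material(uv)       # -> (4096, 7) material properties
# The whole "texture set" is now just the network's weights:
print(sum(p.numel() for p in material.parameters()), "parameters")
```

The payoff is in storage: this toy network weighs in at a few thousand parameters (under 20KB), while the uncompressed texture stack it stands in for could occupy tens of megabytes.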
Games aren't going full AI, yet
Despite NVIDIA’s huge investment in AI, even Jensen Huang says games will never be rendered entirely by AI. Native rendering is still required to give neural rendering the “context” to generate convincing results.
And it will take time for NVIDIA’s aggressive plans for neural rendering to go mainstream. DLSS 4’s multi-frame generation looks impressive, but only 75 games will support it at launch. That’s more games than DLSS 3 had at its debut, yet still a small slice of the PC gaming library. Other features, like RTX Neural Materials and RTX Neural Faces, are even earlier in development.
I think NVIDIA’s push for ray tracing offers a good preview of neural rendering’s timetable. The RTX 20-series brought ray tracing to gamers over six years ago, but it’s still optional in nearly all modern games. It’ll take years for game developers to embrace the latest neural rendering techniques, too. But make no mistake: neural rendering is the future of 3D graphics.