When Was DLSS Introduced?
Content on WhatAnswers is provided "as is" for informational purposes. While we strive for accuracy, we make no guarantees. Content is AI-assisted and should not be used as professional advice.
Last updated: April 17, 2026
Key Facts
- DLSS was officially launched on **February 26, 2019** with the GeForce RTX 2060.
- It leverages **AI and deep learning** on NVIDIA's Tensor Cores for image upscaling.
- The first version supported only a handful of titles, including **Control** and **Wolfenstein: Youngblood**.
- DLSS uses a **super-resolution neural network** trained on high-resolution image data.
- By 2023, DLSS was implemented in over **300 games** and applications.
Overview
DLSS, or Deep Learning Super Sampling, is NVIDIA's proprietary AI rendering technology that enhances gaming performance and image quality. It was first unveiled on February 26, 2019, as a cornerstone feature of the GeForce RTX 20-series GPUs based on the Turing architecture. The technology marked a significant shift in real-time graphics by using machine learning to upscale lower-resolution images to higher resolutions.
Unlike traditional upscaling methods, DLSS leverages deep neural networks trained on high-resolution image data to reconstruct pixels intelligently. This allows games to render at lower internal resolutions while outputting sharp, high-resolution visuals. Over time, DLSS has evolved through multiple iterations, significantly improving performance and visual fidelity across a growing library of supported games.
- DLSS launched on February 26, 2019 with the release of the GeForce RTX 2060, marking the first consumer GPU to support AI-driven upscaling.
- The technology relies on Tensor Cores, specialized hardware units in RTX GPUs that accelerate AI inference tasks like image reconstruction.
- Initial adoption was limited, with only a few titles such as Control and Wolfenstein: Youngblood supporting DLSS at launch.
- DLSS uses a deep neural network trained on 16K-resolution images to learn how to upscale lower-resolution frames accurately.
- Version 1.0 required per-game training and was criticized for artifacts, but later versions improved dramatically with a generalized model and broader compatibility.
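To make the "render low, output high" idea concrete, the sketch below computes the internal render resolution implied by each DLSS quality mode. The per-axis scale factors are commonly reported approximations, not values from this article, and can vary by title and DLSS version:

```python
# Commonly reported per-axis scale factors for DLSS quality modes
# (approximations; actual values can vary per game and DLSS version).
DLSS_SCALE_FACTORS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(output_w, output_h, mode):
    """Return the (width, height) the game renders at before upscaling."""
    s = DLSS_SCALE_FACTORS[mode]
    return round(output_w * s), round(output_h * s)

# Example: internal resolutions when targeting 4K output.
for mode in DLSS_SCALE_FACTORS:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode:>17}: renders {w}x{h} -> outputs 3840x2160")
```

At a 4K target, Performance mode renders at 1920x1080, which is why it is often described as "1080p upscaled to 4K."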
How It Works
DLSS operates by rendering a game at a lower resolution and then using AI to upscale the image to the display’s native resolution. This process happens in real time using NVIDIA’s deep learning algorithms, which predict and fill in missing pixels based on temporal data, motion vectors, and prior frames.
- Input Resolution: The game renders at a lower resolution (e.g., 1080p) to improve frame rates. This base image is fed into the DLSS network for processing.
- Tensor Cores: These specialized units on RTX GPUs perform matrix math at high speed, enabling efficient AI inference needed for real-time upscaling.
- Neural Network Model: A deep learning model trained on NVIDIA’s supercomputers analyzes the low-res frame and reconstructs a high-res output with minimal quality loss.
- Temporal Feedback: DLSS uses data from previous frames and motion vectors to maintain consistency and reduce flickering or ghosting in dynamic scenes.
- Super Resolution: The final step upscales the image to the target resolution (e.g., 1440p or 4K), delivering frame-rate gains of up to 2x in supported titles.
- Dynamic Training: Starting with DLSS 2.0, the model became generalized, eliminating the need for per-game training and enabling faster integration into new games.
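The data flow above can be sketched as a toy temporal upscaler. This is illustrative only: the function names, scale factor, and blend weight are assumptions, and DLSS replaces this hand-written blend with a trained neural network running on Tensor Cores (and also consumes motion vectors, omitted here for brevity):

```python
import numpy as np

def upsample_nearest(frame, factor):
    """Upscale a (H, W) frame by an integer factor (nearest neighbor)."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

def temporal_upscale(low_res, history, factor=2, blend=0.9):
    """Combine the upsampled current frame with the previous high-res
    output, weighting history to stabilize the image across frames."""
    current = upsample_nearest(low_res, factor)
    if history is None:          # first frame: no accumulated history yet
        return current
    return blend * history + (1.0 - blend) * current

# Feed a stream of 2x2 "frames", accumulating a 4x4 output over time.
history = None
for value in (0.0, 1.0, 1.0):
    frame = np.full((2, 2), value)
    history = temporal_upscale(frame, history)

print(history.shape)  # (4, 4): low-res input, high-res accumulated output
```

The heavy history weighting is what suppresses flicker between frames, at the cost of brief ghosting when the scene changes quickly, which is exactly the trade-off temporal feedback introduces.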
Comparison at a Glance
Here's how DLSS compares to other upscaling technologies in key performance and quality metrics:
| Technology | Developer | Launch Year | Performance Boost | Key Requirement |
|---|---|---|---|---|
| DLSS | NVIDIA | 2019 | Up to 2.5x | RTX GPU with Tensor Cores |
| FidelityFX Super Resolution (FSR) | AMD | 2021 | Up to 2.0x | Any GPU (open source) |
| XeSS | Intel | 2022 | Up to 1.8x | Intel Arc GPUs or AI acceleration |
| TAAU | Various (engine-specific) | 2010s | Minimal | Standard rendering pipeline |
| None (native) | N/A | N/A | Baseline | Native resolution rendering |
While DLSS offers the highest image quality and performance gains, it is limited to NVIDIA hardware. Competing technologies like AMD’s FSR aim to provide similar benefits but without requiring dedicated AI hardware, making them more accessible across GPU brands. However, DLSS consistently ranks higher in image clarity and temporal stability due to its AI-driven approach and extensive training data.
Why It Matters
DLSS has become a game-changer in modern gaming, enabling high frame rates at 4K resolution without sacrificing visual fidelity. Its success has pushed competitors to develop similar AI-enhanced upscaling solutions, accelerating innovation across the industry.
- Enables 4K gaming on mid-tier GPUs by rendering at 1080p or 1440p and upscaling intelligently.
- Reduces GPU load, leading to lower power consumption and heat output during gameplay.
- Supports ray tracing effects without crippling performance, making realistic lighting more accessible.
- Used in over 300 games by 2023, including major titles like Cyberpunk 2077 and Spider-Man: Miles Morales.
- DLSS 3 introduced frame generation in 2022, further doubling performance on RTX 40-series GPUs.
- Has influenced game development pipelines, with studios now designing rendering strategies around AI upscaling.
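The first bullet above is easy to verify with back-of-envelope arithmetic: shading cost scales roughly with pixel count (an assumption; real frame time also includes resolution-independent work such as geometry and post-processing):

```python
# Pixel counts: native 4K vs. a 1080p internal render that is upscaled.
def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)  # 8,294,400 pixels shaded per frame
internal  = pixels(1920, 1080)  # 2,073,600 pixels shaded per frame

print(native_4k // internal)    # 4x fewer pixels to shade
```

Shading a quarter of the pixels is where the headroom for higher frame rates (and for expensive effects like ray tracing) comes from; the AI upscaler then reconstructs the remaining detail.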
DLSS has redefined expectations for real-time graphics, proving that AI can enhance both performance and visual quality simultaneously. As machine learning continues to evolve, technologies like DLSS will likely become standard in future gaming ecosystems.
Sources
- Wikipedia (CC BY-SA 4.0)