What is Nvidia DLSS (Deep Learning Super Sampling)? In this article I’ll explain exactly what DLSS is, why it matters, how it works, and some of the drawbacks of using it.
Deep Learning Super Sampling (DLSS) is the new graphics technology on the horizon for GPUs, first announced by Nvidia during GTC last week. DLSS can produce breathtaking visuals in ray-traced games and is a significant improvement over standard super sampling, from which it is derived. It’s a rendering technique that uses AI and deep learning to increase graphical quality in-game without the need for additional hardware. As Nvidia CEO Jensen Huang put it during the keynote, DLSS is “more performance for free”. But what exactly is it?
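To see what DLSS improves on, it helps to recall how traditional super sampling (SSAA) works: render the scene at a higher resolution than the display, then average blocks of pixels down to the target resolution. Here is a minimal sketch in Python with NumPy; the 2×2 box filter, the toy renderer, and the image shapes are illustrative assumptions, not Nvidia’s implementation:

```python
import numpy as np

def supersample(render_fn, width, height, factor=2):
    """Traditional SSAA: render at factor x resolution, then box-filter down."""
    hi = render_fn(width * factor, height * factor)  # the expensive hi-res render
    # Average each factor x factor block down to one output pixel.
    hi = hi.reshape(height, factor, width, factor, -1)
    return hi.mean(axis=(1, 3))

# Toy "renderer": a gradient image standing in for a real rendered frame.
def toy_render(w, h):
    x = np.linspace(0.0, 1.0, w)
    y = np.linspace(0.0, 1.0, h)
    return np.stack(np.meshgrid(x, y), axis=-1)  # shape (h, w, 2)

frame = supersample(toy_render, 320, 180)
print(frame.shape)  # (180, 320, 2)
```

The cost is the catch: rendering at 2x per axis means shading 4x the pixels, which is exactly the brute-force work DLSS tries to avoid.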
Nvidia’s latest driver update, released last week, introduces the feature. DLSS claims to deliver fidelity similar to existing super-sampling techniques (e.g. temporal super sampling) with better performance. So how does it work? At its core, DLSS trains a neural network on a large number of rendered images, then uses what it has learned from the training data to reconstruct an estimate of the final frame at a higher output resolution (e.g. 4K), with far lower computational cost than rendering natively at that resolution.
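In other words, DLSS inverts the cost of super sampling: render fewer pixels, then let a trained network fill in the detail. The sketch below uses simple nearest-neighbour upscaling as a stand-in for the neural network (Nvidia’s actual model and weights are proprietary and not reproduced here); every name and shape in it is an illustrative assumption:

```python
import numpy as np

def learned_upscale(low_res, scale=2):
    """Stand-in for the DLSS network: in the real pipeline a trained model
    reconstructs fine detail; here we just repeat pixels (nearest neighbour)."""
    return low_res.repeat(scale, axis=0).repeat(scale, axis=1)

def dlss_style_frame(render_fn, out_w, out_h, scale=2):
    # Render at a reduced internal resolution -- the cheap part.
    low = render_fn(out_w // scale, out_h // scale)
    # Reconstruct the full-resolution estimate -- the "deep learning" part.
    return learned_upscale(low, scale)

def toy_render(w, h):
    return np.zeros((h, w, 3))  # placeholder for a real rendered frame

frame = dlss_style_frame(toy_render, 3840, 2160)
print(frame.shape)  # (2160, 3840, 3)
```

Only a quarter of the output pixels are actually shaded here (half resolution per axis); the rest are synthesised, which is where the “performance for free” claim comes from.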
Deep Learning Super Sampling, or DLSS, is one of two new technologies supported by the latest iteration of Nvidia’s top-of-the-line graphics cards, headlined by the RTX 2080. The other is ray tracing, which was recently demoed in a cinematic trailer for the upcoming game Metro Exodus. Although ray tracing is getting the headlines, DLSS is arguably far more interesting.
DLSS ON vs Normal Gameplay video – Cyberpunk 2077
Nvidia DLSS 2.0 and DLSS 2.1 Improvements
Nvidia has announced improvements in DLSS 2.0 and DLSS 2.1 over the original version, DLSS 1.0.
Searching through the GeForce forums, I found a recent post by Nvidia detailing some of the advances made in its Deep Learning Super Sampling (DLSS) technology and how it improves upon DLSS 1.0. Overall, DLSS 2.0 and 2.1 offer a noticeable difference from the original release back in March. We tested Battlefield V using five different graphics presets: Ultra, Ultimate, High, Medium, and Low.
That’s right: some of the games we benchmarked saw better optimizations with DLSS 2.1 than with DLSS 2.0. Overall, DLSS 2.1 is a major improvement over DLSS 2.0 and worth a look, even if you’re not sure whether your system supports it.
With the dawn of deep learning AI comes a new approach to graphics: Deep Learning Super Sampling (DLSS). Today’s post is all about comparing DLSS 1.0 with its successors, DLSS 2.0 and 2.1.
Since its debut, Nvidia has introduced improved versions of its AI-based Deep Learning Super Sampling (DLSS) algorithm. This post documents those improvements and how they affect performance.
Following the launch of Deep Learning Super Sampling 2.0 (DLSS 2.0) around October last year, DLSS got a lot of attention, both for the improvements Nvidia brought to the table and for the drawbacks. We first covered DLSS in detail in our blog post, Voxel Super Sampling: A New Approach to Graphics Rendering.
The 2.0 and 2.1 DLSS models are now available as part of the new driver package, DLA 16.50, introduced with the February 7th 2019 GeForce Game Ready drivers for games that support Deep Learning Super Sampling (DLSS). Compared to the 1.0 model, the 2.0 and 2.1 models reconstruct images at higher resolution while using much less memory and bandwidth, which allows for an improved user experience across a wider range of hardware configurations and GPU memory sizes than before.
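The performance headroom comes from how few pixels the GPU renders internally before reconstruction. The per-axis render scales below are the commonly reported values for the DLSS 2.x quality modes (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5, Ultra Performance ≈ 0.333); treat them as approximate assumptions rather than official figures:

```python
# Approximate per-axis internal render scale for DLSS 2.x modes (assumed values).
MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_pixels(out_w, out_h, scale):
    """Pixels actually rendered before the network reconstructs the frame."""
    return int(out_w * scale) * int(out_h * scale)

native = 3840 * 2160  # native 4K output
for mode, scale in MODES.items():
    rendered = internal_pixels(3840, 2160, scale)
    print(f"{mode}: renders about {rendered / native:.0%} of native 4K pixels")
```

Performance mode at 4K, for example, renders a 1920×1080 internal frame, about a quarter of the native pixel count, which is why lower-memory GPUs benefit most.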