At SIGGRAPH a few weeks back, NVIDIA launched its Turing architecture, which was slated to revolutionize the creative process using a technology called real-time ray tracing. At that launch, the focus was on Quadro RTX cards targeted at professional animators, graphic artists, architects and others whose job is to create images, real or imagined. These cards were not a cheap date, with prices ranging from a painful $2,300 to a nosebleed-inspiring $10K, likely putting them outside the budget of most PC buyers. At those prices, if your company doesn't buy the card for you, then you are likely out of luck. But we anticipated a lower-cost line of cards targeting gamers would launch shortly thereafter.
Well, last week NVIDIA announced its GeForce RTX series of GPUs, and they didn't disappoint either. With impressive gamer support and performance roughly double that of the popular GeForce GTX 1080, the new GeForce RTX 2080 set the market back on its heels.
Let’s talk about what makes the GeForce RTX cards unique.
The AI Difference
Now, what I find fascinating about the Turing architecture is how it uses AI to get higher performance, because this isn't intuitive at all. You see, if you don't use the AI, the DLSS (Deep Learning Super Sampling) technology, the performance increase over the earlier cards is about 50 percent (still impressive). They get the extra 50 percent by rendering at a far lower resolution and then using DLSS to upconvert intelligently to the higher resolution. So you see the higher resolution and the far faster performance but, behind the scenes, the card isn't actually rendering at that resolution.
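To make that render-low, display-high idea concrete, here is a minimal sketch in Python. Everything in it is illustrative rather than NVIDIA's actual pipeline: render_frame and upscale_2x are hypothetical stand-ins, and the cheap pixel duplication merely marks where the trained DLSS network, whose internals NVIDIA hasn't published, would run.

```
import numpy as np

def render_frame(height, width):
    """Stand-in for the game's renderer: produce an RGB frame
    at the requested (lower) resolution."""
    y, x = np.mgrid[0:height, 0:width]
    frame = np.stack([x % 256, y % 256, (x + y) % 256], axis=-1)
    return frame.astype(np.float32) / 255.0

def upscale_2x(frame):
    """Stand-in for the trained DLSS network. Simple 2x pixel
    duplication marks where the real network would infer detail
    the low-resolution render never contained."""
    return frame.repeat(2, axis=0).repeat(2, axis=1)

# The display target is 4K, but the GPU only pays to render at
# half resolution; the upscaler supplies the remaining pixels.
low_res = render_frame(1080, 1920)   # rendered at 1920x1080
displayed = upscale_2x(low_res)      # shown at 3840x2160
print(low_res.shape, "->", displayed.shape)
```

The win is that the expensive step, shading every rendered pixel, happens at the small resolution, while the viewer sees the large one.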
The best example I can think of from cars is overdrive. Older cars, and some motorcycles (I used to have a Suzuki with this feature), had what amounted to a second transmission. That second transmission let you effectively change the gear ratios of your existing transmission, making them higher or lower depending on the setting. On my motorcycle this was for on-road and off-road use: lower gears off-road, higher gears on-road. So, with the overdrive in place and shifted up, you got higher speeds and better gas mileage at any speed, but you gave up torque and acceleration.
But the magic of this card is that, unlike a vehicle, you get the higher performance without any tradeoff I have yet found, other than that the card likely pulls more power with DLSS enabled.
DLSS Magic
Now the other really interesting thing about this card is that, in theory, it could upsample anything. For instance, an old movie that wasn't shot in 4K could be made to look good on a 4K screen, or an older game that was never designed to run on even an HD monitor could be upsampled to full resolution on a 4K TV or monitor.
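If that kind of generic upsampling ever ships, the shape of it would be simple: run each frame of the old source through a learned upscaler on its way to the display. A hedged sketch, where the model argument is a hypothetical super-resolution network and the fallback is plain pixel duplication so the code stays runnable:

```
import numpy as np

def upscale_frame(frame, model=None):
    """Run one frame through a learned super-resolution model.
    'model' is hypothetical here; the fallback is simple 2x
    pixel duplication so the sketch remains self-contained."""
    if model is not None:
        return model(frame)  # e.g., a trained SR network
    return frame.repeat(2, axis=0).repeat(2, axis=1)

def upscale_video(frames, model=None):
    """Upscale an old HD (or lower) source frame by frame so it
    fills a 4K display. 'frames' is any iterable of HxWx3 arrays."""
    for frame in frames:
        yield upscale_frame(frame, model)

# Example: a fake three-frame 1080p clip, upscaled to 2160p.
clip = (np.zeros((1080, 1920, 3), dtype=np.float32) for _ in range(3))
for out in upscale_video(clip):
    print(out.shape)  # (2160, 3840, 3)
```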
Potentially this could even selectively change images, say, putting your kids' faces into a movie dynamically without making them look pasted on. Now, I don't expect to see this offered immediately, but it is potentially a capability in the card, and it could be a game changer for how and where we hook up our TVs and PCs. It certainly provides a far more compelling reason to have a PC with this card in it connected to a 4K TV.
Wrapping Up: Creating Unreality
Really, if you think about it, NVIDIA's Turing cards, both GeForce and Quadro, move the ball significantly toward the time when we can have a Ready Player One type of experience: when we can go into a VR game and begin to believe that the rendered reality is real. Granted, there are a lot of other things that need to get done beyond the video part (motion, sensors that let us use our hands as we do in the real world, and a level of immersion that isolates us from the environment around us), but that part of the art is also being advanced. With Turing and GeForce RTX, what is possible on your desktop just changed massively, and you know what? This is just the beginning of that change; the next few years should be amazing.