

Nvidia RTX Ray Tracing Is Incredibly Expensive in Remedy’s Northlight Engine Demo

Ever since Nvidia launched Turing, enthusiasts and the press have been grappling with one question above all others: How fast will the RTX 2080 Ti, 2080, and 2070 be when running games that combine Nvidia’s new RTX ray tracing technology with conventional rasterization? Early reports have not been positive in this regard, with various reviewers noting that the Battlefield V demo was locked to 1080p and struggled to hold 60fps even at that resolution when running on an RTX 2080 Ti with ray tracing enabled. But it’s also early days for the technology, and better driver and game engine support could make a significant difference in overall performance.

That’s still the case, to be sure, but a demo shown at GTC Europe by Remedy highlighted just how heavy the workload from ray tracing can be. Golem.de has details on the presentation, while a video of the demo can be seen below:

Experiments with DirectX Raytracing in Northlight - YouTube

Notice the heavy shimmering in the first part of the video? That’s caused by firing only two rays per pixel. It’s a type of undersampling error we first discussed six years ago, in our write-up on ray tracing and its potential future in gaming graphics. When we tested Nvidia’s iray software in 3DS Max, there was a marked difference between rendering a scene with 300 iterations and 3,000 (in that case, the scene is rendered repeatedly to converge on the final result; the ray tracing methods differ, but the visual effect of too few samples is much the same).
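To get an intuitive feel for why two rays per pixel shimmer while hundreds or thousands of samples settle down, here is a toy sketch in C++ (our own illustration, not Remedy’s or Nvidia’s code; the lighting function and sample counts are made up): it estimates a single pixel’s brightness by averaging N random samples and reports how much the result flickers from frame to frame.

// Toy undersampling demo: estimate one pixel's brightness by averaging N random
// samples of a made-up lighting function. With only a couple of samples per pixel,
// the estimate changes noticeably from frame to frame -- that's the shimmer.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>

// Hypothetical stand-in for "trace one ray and return the light it picks up".
double sampleLighting(std::mt19937& rng) {
    std::uniform_real_distribution<double> dir(0.0, 1.0);
    return 0.6 + 0.4 * std::sin(20.0 * dir(rng));  // arbitrary bumpy lighting term
}

double renderPixel(int samplesPerPixel, std::mt19937& rng) {
    double sum = 0.0;
    for (int i = 0; i < samplesPerPixel; ++i) sum += sampleLighting(rng);
    return sum / samplesPerPixel;  // Monte Carlo estimate; error shrinks ~1/sqrt(N)
}

int main() {
    std::mt19937 rng(42);
    const int sampleCounts[] = {2, 300, 3000};
    for (int n : sampleCounts) {
        // "Render" the same pixel for several frames and measure the flicker.
        double lo = 1e9, hi = -1e9;
        for (int frame = 0; frame < 8; ++frame) {
            double v = renderPixel(n, rng);
            lo = std::min(lo, v);
            hi = std::max(hi, v);
        }
        std::printf("%5d samples/pixel -> frame-to-frame flicker: %.4f\n", n, hi - lo);
    }
    return 0;
}

The flicker at two samples per pixel comes out dozens of times larger than at 3,000, which is exactly the kind of temporal noise the demo’s denoising passes are trying to hide.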

Unfortunately, even firing just two rays per pixel, which is what the Northlight demo uses, carries a heavy performance penalty. Golem.de attended GTC Europe and Remedy’s talk, as noted above, and reports the following performance characteristics for the Northlight demo running on an RTX 2080 Ti:

The contact and sun shadows, calculated with two rays per pixel and including noise rejection, together require 2.3ms per frame, and the reflections a whopping 4.4ms. Denoising the global illumination extends the rendering process by another 2.5ms, for a total of 9.2ms per frame.

There’s an inverse relationship between a GPU’s frame time (the amount of time, in milliseconds, it takes to draw a frame) and its frame rate. Maintaining 60fps requires the GPU to render and display a new frame every 16.6ms; 30fps requires a new frame every 33.3ms. A 9.2ms ray tracing cost therefore consumes roughly 55 percent of the entire 60fps rendering budget. Even with a 30fps target, 9.2ms represents more than a quarter of the GPU’s rendering time. And while some parts of the above demo are gorgeous, the shimmering caused by undersampling detracts from the entire scene. Asking developers to devote 28 to 55 percent of their frame time to RTX alone could prove a very tough ask, especially considering that’s the RTX 2080 Ti’s level of performance. A GPU like the RTX 2070, which has barely half the RTX 2080 Ti’s ray tracing resources, would find this a particular slog.
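For reference, here is the arithmetic behind those percentages as a short sketch (plain frame-budget math; nothing here is Nvidia- or Remedy-specific):

// Frame-budget arithmetic for the Northlight numbers quoted above.
// 2.3ms (shadows) + 4.4ms (reflections) + 2.5ms (GI denoising) = 9.2ms of RTX work.
#include <cstdio>

int main() {
    const double rtCostMs = 2.3 + 4.4 + 2.5;  // 9.2ms per frame on an RTX 2080 Ti
    const double targets[] = {60.0, 30.0};    // target frame rates in fps

    for (double fps : targets) {
        double budgetMs = 1000.0 / fps;              // total time allowed per frame
        double share = 100.0 * rtCostMs / budgetMs;  // slice of that budget spent on RTX
        std::printf("%2.0f fps target: %.1f ms budget, ray tracing consumes %.1f%%\n",
                    fps, budgetMs, share);
    }
    // Output: ~55% of a 16.7ms (60fps) budget, ~28% of a 33.3ms (30fps) budget.
    return 0;
}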

There are some caveats to keep in mind. This is a demo, not a shipping title, and it was built to show off Nvidia’s ray tracing technology, not to maximize performance. But the fact that it opens with an undersampled, shimmering scene suggests some kind of compromise was already in play: if Remedy had been able to remove the effect without hurting performance further, it surely would have done so, if only to make the demo that much more impressive. Meanwhile, Battlefield V seems to demonstrate that at least the RTX 2080 Ti can integrate impressive ray traced effects without tanking the frame rate, even if 60-ish fps at 1080p isn’t much to write home about by conventional rasterization standards. Clearly, developers have at least some flexibility to tailor their RTX implementations to the frame rates and experiences they wish to target.

At the same time, however, the little evidence we do have is all stacking up on the wrong side of the equation. Nvidia is the only player in this game that could clear up the confusion, but the company has released no performance estimates or clarifying remarks. We don’t know how much fine-grained control gamers will have to apply ray tracing selectively, or even which GPU to recommend to buyers who want to take advantage of the feature.

If you’re wondering why many in the tech press are so lukewarm on Turing, it’s not just because Nvidia launched a GPU before any games that take advantage of its features were available. It’s also because the company has denied the press and public any opportunity to objectively evaluate the value of those features. It’s been two months since the RTX family debuted. It is not unreasonable, at this point, for Nvidia to issue a clarifying statement along the lines of: “We expect the RTX 2070 to be capable of delivering the next-generation visual effects our customers expect while maintaining smooth, playable frame rates, and we will work with developers to ensure all RTX customers can take advantage of these features.” That doesn’t lock the company into a specific frame rate. It doesn’t guarantee 60fps. It just says, “You’re going to get to use the features we’re asking you to pay at least $500 for.”

Nvidia has asked customers to accept significant price increases because, to hear the company tell it, the RTX 2070, 2080, and 2080 Ti will deliver ray traced gaming experiences that justify the premium. Under the circumstances, it’s reasonable to ask the company to demonstrate or pledge that the RTX 2070 will be capable of both ray tracing and reasonable frame rates.

Nvidia RTX Ray Tracing Is Incredibly Expensive in Remedy's Northlight Engine Demo - ExtremeTech


I think it's crazy to expect -just- 1080p60 from an $800 card. nVidia's dug themselves into a massive hole, but without AMD to compete against the RTX series until at least 2020, they don't have to dig themselves out.

I am skeptical of the 1080p60 claim. I swear I read that even the 2080 Ti struggles, hitting the upper 30s or something, or maybe that was about the 2080. Either way, ray tracing is a paper feature at launch.

I saw that. Not a fan of SLI or Crossfire. Just another reason to never use it. Between the two, I thought Crossfire was much more useful and compatible. In my experience, SLI wasn't supported by many games, and when it wasn't supported, it was a matter of googling which special bits to turn on. Conversely, Crossfire was a simple matter of seeing how two or three rendering options worked. That was a long time ago, though, with 78xx-series cards.

I've owned two dual-GPU cards in the past, the 4850x2 and the 5970, and I'm never going to consider Crossfire again. When it works it's great, but dealing with the inconsistent scaling and performance, as well as the issues that can result, such as microstuttering, is just not worth it. Not to mention the extra heat, noise, and cost...

My aging R9 290 recently gave up the ghost and I got a good deal on an R9 295X2. In less than a week, I traded it to my son-in-law for a fairly new R9 290X. I've never had so much trouble with an AMD card... I should have remembered/taken my own advice: buy one card that can do what you need it to do, and never buy anything that would require Crossfire.

The downside is that with prices so inflated, both due to memory shortages and to no competition, the "one card" option that would guarantee 1080p60 full details is either $550 for Vega64, $500 for GTX 1080, or $700 for GTX 1080Ti, well over twice the price of two RX 580 ($450), despite Vega64 only being 60% faster than an RX 580.

"well over twice the price of two RX 580" ...with twice the hassle.

And makes you feel like Porky Pig by 10 o'clock.

LOL...I'm slow...took 6 days

Many of those SLI and Crossfire woes were due to the driver-side implementation. The graphics vendors had to design profiles for each game in order for the functionality to be there. That was slated to improve in the MultiGPU future with DX12, since developers could then provide the functionality directly in their engines. The thinking was that by adding it as an API feature, support would become much more widely available. The truth, however, is that developers have been pretty slow to enable that functionality. The feature was also missing from the Vulkan API, so Linux releases went unsupported, as did consoles and mobile, which field a single GPU anyway.

Recently (March 2018), Vulkan also added MultiGPU support at the API level.  So, I think moving forward, as developers refresh their engines, it will be easier to add in MultiGPU support regardless of platform.  Then the functionality will simply work if the user has more than one GPU.
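For anyone curious what that looks like in code, here is a minimal sketch of the Vulkan 1.1 path (illustrative only; error handling is trimmed and no real engine is this terse): the application enumerates device groups itself, and any split of the rendering work happens in the engine rather than in a per-game driver profile.

// Minimal sketch: enumerating Vulkan 1.1 device groups, the explicit multi-GPU
// mechanism added to the API in March 2018. Error handling omitted for brevity.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;  // device groups are core in 1.1

    VkInstanceCreateInfo instanceInfo{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    instanceInfo.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&instanceInfo, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);

    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, VkPhysicalDeviceGroupProperties{
                        VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i) {
        std::printf("Device group %u contains %u physical GPU(s)\n",
                    i, groups[i].physicalDeviceCount);
        // Chaining a VkDeviceGroupDeviceCreateInfo into VkDeviceCreateInfo here
        // would create one logical device spanning the whole group -- the engine,
        // not a per-game driver profile, decides how to distribute the rendering.
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}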

You can get a Vega 64 for well below $550 these days!

Nobody in their right mind would -ever- buy an air cooled reference edition Vega64, unless they're deaf and like dealing with lower performance due to thermal throttling.

Well that is the only version, beyond the Strix (Arez?), that EKWB makes a water block for.

$479.00 is a bad deal for a non reference board.  Interestingly, my local shop has the Powercolor version at $499.00 as opposed to $599.00 listed on Newegg.

That should say, "Isn't" a bad deal as opposed to "is".  For some reason I can't edit the reply.

You have to view the actual thread and find the post to edit it. For some reason, edit doesn't work from the dashboard or notifications listing.

But it is a bad deal compared to the competition. The GTX 1080 is faster (and cooler, and much more efficient) and is only $10 more (or $10 less with rebate), while the 1070 Ti, which is -slightly- slower (nothing that will make or break playability), is a massive $60 cheaper (or $90 with rebate). Lisa Su is refusing to position Radeon Vega where people will buy it, instead focusing on the enterprise market, and we as non enterprise users suffer high prices as a result, made worse because Vega's replacement is still two years away. This is also why the RTX series costs so much: nVidia has no reason to cut prices on previous generation hardware.

I regularly wonder if they are even making Vega any more, because it doesn't seem like there is much stock or variety out there. Perhaps that is just a reflection of the non-competitive nature of Vega. $449 for a reference Vega 64 is a step in the right direction, as it puts it in line with 1070 Ti and 1080 prices.

Likely they're not. Back during the cryptocurrency boom, the board partners were complaining they weren't receiving any Vega shipments from AMD, which was pouring them all into the ungodly expensive Frontier Edition. Consider the high price of HBM2, the fact that 7nm Vega is up and running in AMD's labs and getting ready for mass production, and the high VRAM capacity of 7nm Vega cards (up to 32GB), which means AMD is likely stockpiling as much as possible so it can use a SINGLE assembler and not face the same quality issues it faces with current Vega. I really doubt any more are being made, but there are likely more than enough of them collecting dust in the warehouses to keep them in stock through 2020.

the "one card" option that would guarantee 1080p60 full details is either $550 for Vega64, $500 for GTX 1080, or $700 for GTX 1080Ti

So, to revise that statement: $480 for Vega64 ($450 reference), $430 for a GTX 1080.

"and we as non enterprise users suffer high prices as a result"

This is actually the cheapest I can recall a Vega64 selling for.

"nVidia has no reason to cut prices on previous generation hardware."

And yet, the prices appear to be going down. They went down $70 for both Vega and the GTX 1080 since your original post was made.

I had a 3870 X2. I haven't ever run with two cards directly in Crossfire or SLI. The main reason, I guess, was that a single card always maxed out the refresh rate of my display at the resolution I had, meaning more FPS would just be lost anyway. By the time games came out that would drop my card below 60fps, another, better single card was always available.

" Not to mention the extra heat, noise, and cost..."  And also, in the case of Vega, power draw.  For a single Vega 64 air, 750W is recommended, so one can assume that with two and some overclocking you could actually be pushing 1kW in power draw.  Which is pretty crazy, considering that the 80% load of a standard 15amp home circuit is 1440W.  Those numbers are starting to approach the range where a dedicated circuit for a PC might be needed, or at least an upgrade to 20amps.
