It's all too typical for nVidia, in my experience. They've now walked it back, because of the tremendous bad publicity it earned them, but the damage is done. It's just too bad AMD doesn't have a big stockpile of graphics cards to sell, because a lot of people would have switched on the spot.
As for the general question with real-time ray-tracing, I think it's a long way off from usability. Less than 5% of gamers spend $500 or more on a graphics card, and those are the only cards that yield barely playable frame rates when RTRT is in use. I'm utterly unimpressed by DLSS, so that wouldn't mitigate the situation at all for me. Why increase the visual quality with RTRT only to degrade it with a messy and artifact-laden AI upscaling algorithm?
Many of the effects have a noticeable impact on visual quality, but not enough to justify the performance cost.
It's really kind of a chicken and egg problem. Games are going to get better looking without RTRT over time regardless, so rasterization performance has to increase. RTRT performance isn't going to go up without more silicon dedicated to specialized processing, which eats into the budget for increasing rasterization performance. And none of this is going to work at all except on very high end cards, which less than 5% of the market will purchase and use. The software needs to come along significantly to justify the hardware changes needed, but the software is only going to trickle in until there's more hardware available that's capable of running it, and at much lower price points.
I don't see the new consoles making much of a dent in this, because the only RTRT they'll be able to do is fairly light.
Maybe things will creep along slowly, gaining momentum, but it seems more likely to me that we need some kind of tipping point, with no obvious path to reach it.
Maybe that's why nVidia is trying to bribe and/or bully the tech press into pushing the RTRT dogma. It's not going to work.
I do think that Hardware Unboxed have often been very dismissive about RT Cores and AI when comparing Nvidia and AMD cards.
However, they are entitled to their opinion in a review.
The vast majority of games do not even support RTX or DLSS 2.0.
Productivity applications, compute, and Blender are not really their focus in reviews.
RE: Barely Playable Frame Rates.
That depends on the resolution you try to run at and the ray tracing quality level.
This video shows that 1080p at 60 FPS is possible in Battlefield V with DXR on an RTX 2060.
The initial implementation of DLSS did have problems.
DLSS 2.0 is much better now though.
I do not think AMD have any answer to it at all.
I have not seen how the RX 6800 XT's visual quality compares to an RTX 2080 running games with RTX support.
I will look for a review that does a detailed comparison.
I have not purchased an Nvidia 3000 series GPU.
I think it would be better to wait for Nvidia to port them to a better process node if people are interested in a new Nvidia card.
There are absolutely no "Big Navi" GPUs available to purchase where I have tried.
I have a 4K 60 Hz screen, and an RTX 2080 OC is enough to hold 60 FPS in the games I run at the settings I use.
I was surprised AMD did not do something to boost RT processing by using a dedicated Ryzen CPU core, maybe an "RT core".
Perhaps SAM on PCIe 4.0 is still too slow to try something like that.
There are often plenty of CPU cores on higher end Ryzen processors that are unused.
Maybe dropping a CPU core for a dedicated RT core will happen in the future.
CPU usage only hits ~73% on a Ryzen 2700X running at 3.7 GHz in BFV at 4K Ultra at ~60 FPS with an RX 5700 XT, for example.
Ryzen 3000 and 5000 series IPC increases mean that percentage drops for the same performance.
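The rough arithmetic behind that: for a fixed per-frame CPU workload, utilization scales inversely with per-core throughput (IPC times clock). A minimal sketch, where the ~73% baseline comes from the figure above but the generational IPC uplift percentages are assumptions based on approximate vendor claims, not measurements:

```python
def utilization(baseline_util, ipc_uplift, clock_uplift=0.0):
    """Estimate CPU utilization for the same fixed workload after
    per-core throughput improves by the given IPC and clock uplifts."""
    return baseline_util / ((1.0 + ipc_uplift) * (1.0 + clock_uplift))

base = 0.73  # ~73% on a Ryzen 2700X @ 3.7 GHz (figure from the post)

# Assumed uplifts at the same clock (illustrative only):
zen2 = utilization(base, 0.15)                 # Zen 2: ~15% IPC over Zen+
zen3 = utilization(base, 1.15 * 1.19 - 1.0)    # Zen 3: ~19% more, compounded

print(f"Zen 2 estimate: {zen2:.0%}")  # roughly 63%
print(f"Zen 3 estimate: {zen3:.0%}")  # roughly 53%
```

This ignores clock differences and assumes the game's CPU work is evenly parallelized, so it is only a ballpark for how much headroom the newer cores would free up.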
I have not seen what the ray tracing on the new AMD-based consoles looks like.
I would have thought consoles should drive more use of the technology if users are impressed by it.