Graphics Cards

Bacchin
Adept I

The future of the RX 6800XT graphics card

I'm building an all-AMD computer: a Ryzen 9 5900X processor, an X570 motherboard, G.Skill Trident Z Neo 3800 MHz CL14 memory, and an RX 6800 XT video card. I haven't bought everything yet, but after watching several review videos, benchmarks, and tests of this card against the RTX 3080, I'm in doubt about whether the 6800 XT is worth investing in.


In all the tests, this card runs at a higher frequency, its memory speed is lower, and its temperature is practically 10 °C higher than the competitor's. In pure rasterization, without ray tracing, it matches or beats the 3080 (although I think something must be wrong for the temperature difference to be this large). But when ray tracing is turned on, the 3080 shows its superiority, and if you turn on DLSS the contest is over: the performance difference is monstrous.


Today, the way the company corrects and improves what we have is through updates to the card's drivers.


My question is: if AMD develops a technology similar to DLSS (one that works), and improves its ray tracing enough to face the competitor, can that technology be brought to existing hardware via a driver update, or can it only ship with new hardware (in the next generation of video cards, for example)? I ask because game developers have to create games with these new technologies in mind, all the more so with the processing power of the Xbox Series X still to be exploited. Will whatever is created for that console automatically show up in PC games, running with the same optimizations?

Thanny
Miniboss

Since there are only a few games with RTRT and DLSS (which seriously degrades quality), and you appear to care a great deal about both, why are you going with the 6800 XT?

Whatever AMD does to "compete" with DLSS - beyond the obvious existing option of rendering at a lower resolution, just like DLSS does, and scaling the output up with some sharpening - is supposed to be broadly applicable, which means not ML-based. Which also means free of the artifacts that DLSS has. How well it will work remains to be seen.
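Here's roughly what that non-ML baseline looks like - a minimal Python/Pillow sketch of "render low, resample up, sharpen". It's purely illustrative (the file names are made up), and obviously not whatever AMD actually ends up shipping:

```python
# Minimal sketch of the non-ML baseline: render at a lower resolution,
# resample up, then sharpen. Purely illustrative -- not AMD's actual
# method (unannounced), and the file names are made up.
from PIL import Image, ImageFilter

def upscale_and_sharpen(path_in: str, path_out: str, scale: float = 1.5) -> None:
    img = Image.open(path_in)
    # Lanczos resampling to the target size (e.g. a 1440p frame -> 4K).
    up = img.resize((int(img.width * scale), int(img.height * scale)),
                    Image.LANCZOS)
    # Unsharp mask to restore some of the edge contrast lost in resampling.
    up.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=3)).save(path_out)

upscale_and_sharpen("frame_1440p.png", "frame_4k.png")
```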

As for improvements in RTRT, that depends on the game developers mostly.  No game has yet struck a reasonable balance between visual improvements and performance on any card, so it's hard to guess how it will play out with the 6800 XT specifically.

A few means three or so; the 51 games on the DLSS support list doesn't indicate "a few", and tons more are slated for the near future. Most tech sites agree that DLSS or comparable upcoming tech is the future, RT included. DLSS is letting a lot of mid-level card users play at resolutions they otherwise could not. The tech AMD is working on is believed to be in conjunction with what will be the DirectX standard, DirectML (machine learning) - though no, that doesn't mean the same thing as Nvidia's DLSS 1 or 2; it will be more of an on-the-fly approach. Nvidia is doing the same with their 3.0. It would be nice if they would all work on an open standard together.

Maybe you have missed the many reviews of Cyberpunk and other games showing that DLSS quality mode actually improves quality in some scenarios versus native; yes, in others it can very marginally decrease it, or mostly be the same with no real difference - which is the goal.

Your statement was true of DLSS 1.0 three years ago, when only a couple of games supported it and didn't do it well. That is hardly the case now.

If you are an enthusiast GPU buyer you probably want the very best native rasterization, but realistically, when the market moves beyond 4K 60, DLSS-type tech will be the only feasible path forward.


I've certainly seen the claims that DLSS 2.0 provides "better" quality than native, but those claims are all incorrect.  If you're deviating from the correct render (which you can see at native resolution), then it's not better, by definition.  It's different, and some people may decide they find the incorrect render more aesthetically pleasing, but that's not better, because it's wrong.

Furthermore, there are still plenty of instances where the results are not only incorrect, but aesthetically bad.  And then there are the temporal artifacts, causing flickering textures, which makes DLSS entirely unacceptable to me.  It seems to me that DLSS is like screen tearing.  Some people have glass eyes and can't see how terrible it looks, and continue to play without vsync or adaptive sync enabled.  For me, screen tearing makes the game unplayable.  And the way DLSS looks - even DLSS 2.0 - makes the game look likewise unplayable.

As for the game count, there are 17 with DLSS 2.0+ out, and only a handful of them are titles anyone is likely to have heard of.  That's a few games, not a lot of games.  DLSS 1.0 didn't exist 3 years ago, so not sure where you got that notion from.

Finally, I disagree completely about the need for DLSS or something like it, if that means ML-based scaling.  It will never create the correct render, and it will always have artifacts.  There are some non-ML methods that might produce good results when upscaling from a lower resolution, but we'll have to see what AMD does.  I don't think they're using DirectML, based on the most recent statements made.  They might have explored that option before, but I think they're going in a different direction.

 


You are looking at the past instead of the future. The new generation of consoles (which will consequently dictate the direction of games this generation) has as its main marketing point the promise of games at 4K 60 fps (or more) using ray tracing, and that will be the great leap of this generation. The gaming benchmark (at least for the Xbox Series X) will be 4K 60 with ray tracing - which the 6800 XT cannot do today.
Like it or not, NVIDIA is more advanced in this technology, and DLSS is a great differentiator. It is useless to say that there are few games; even at the launch of the new consoles there was not a single game developed specifically for them. None so far. The game that has made the most of this technology is Cyberpunk (and I think Watch Dogs would be second). But just follow the trailers of upcoming games: ray tracing will be used on a large scale, and hardly any game will skip the technology.
Even without DLSS, ray-traced games run better on the competitor. In Control with ray tracing, the 6800 XT performs about like a 3060 Ti; with DLSS turned on, the 3060 Ti pulls ahead of the 6800 XT, and the image is not badly damaged. If it were, review channels (both small and large) would already have talked about it. On the contrary, so far DLSS has made a real difference.
What weighs on me is that, in the same game, the 6800 XT runs at a higher clock than the 3080, has more VRAM, operates about 10 °C hotter, and still loses a great deal of performance when ray tracing is turned on. If, with driver adjustments over time, AMD manages to approach or match NVIDIA, it will be worth the investment. If not - if that upgrade is only possible by replacing the hardware (the RX 7000 series) - I don't see much advantage even though it costs less. Anyone who wants a high-end card wants the high-end features the generation can offer.


The notion that even a majority of games will do RTRT in the next couple years is pretty ridiculous.  Even more ridiculous when you consider the fact that not one game so far has produced a visual quality improvement that justifies the performance loss.

DLSS is not a viable solution, even ignoring the poor visual quality.  It must be baked into the game explicitly, which means no random developer can just add it to the game.  You have to go through nVidia.

I'm not going to comment further, since it's clear you're shilling for nVidia, for whatever motivation you might have.

I am a consumer, not a brand fanboy.

fyrel
Miniboss

Ray tracing won't be mainstream until graphics cards are fast enough to do the rendering without having to upscale from lower resolutions.

DLSS and whatever implementation AMD comes up with are just stopgaps for now, to try and get ray tracing off the ground.

I personally haven't seen any game where ray tracing improves the graphical look enough to warrant the loss of performance it brings. I'm sure it will get there, but for now it's like any new technology when it first comes out: mostly just for show.

It's probably fair to say that no card today can run RT at acceptable frame rates - none. That much we do know. As for the future of any Nvidia 3xxx or AMD 6xxx card, they may be long- or short-lived, but we just don't know. But RT should not be a top factor when deciding between the two cards today, regardless of what Nvidia's marketing is telling you. And you know Nvidia has been hyping RT for almost 3 years now, starting with the 2xxx series, which, frankly, needed the hype.

DLSS is a trick, and it works in the sense that many folks like the result, but then again many others do not. I'm personally not a fan of it. Again, I certainly would not make a buying decision based on this feature.

LeyvinBeta
Adept I

To the question of "Will the RX 6800 XT be worth the investment?"... as much as I'd like to say yes, the problem for me is the Price Point.

I'm sorry but at present:

RX 6800 - £600
RX 6800 XT - £680
RX 6900 XT - £1,100

Those are ridiculous prices.

I mean keep in mind we went from

R9 280X - £300 (2013)
R9 380X - £230 (2015)
RX 480 - £180 - 220 (2016)
RX 580 - £200 - 240 (2017)
RX 5700 XT - £380 (2019)
RX 6800 - £580 (2020)

Now I'm citing each of these because they are essentially the EXACT same class of card; specifically, the classic designation is "Mainstream High-End".

As a note, I'd perhaps be less critical here IF the RX 6800 were a 54CU variant at, say, £399.99, while the 60CU variant were the RX 6800 XT priced at £479.99.
Especially if this were also paired with RX 6700 (36CU - £279.99) and RX 6700 XT (40CU - £319.99) variants to replace the RX 5700 series.

But that isn't what we're seeing. 
From my perspective the RX 6800 XT is actually an RX 6900... it's just been re-branded to "appear" to be the competitor to the RTX 3080, in AMD's continued (and misguided) attempt to mimic the naming conventions of their competitors.
This greatly frustrates me... and in blunt terms, who does this ACTUALLY fool?

The names aren't close enough for those without technical knowledge to understand that the cards are even supposed to be competing with each other; and those who do understand care more about the benchmarks than the name printed on the side.
So ultimately what this does is make AMD (to the casual consumer) appear to be a cheap knock-off of a brand they're better versed in.

Pricing them competitively is also a mistake. NVIDIA has the branding, and _THAT_ is what commands the higher premium; being able to equal or even beat them in performance doesn't change that... people still buy iPhones over Galaxys despite the Galaxy often having FAR higher performance and specifications.

Now, the other mistake on display right now is that, as it stands, AMD already has a working variant of DLSS: it's part of the FidelityFX Contrast Adaptive Sharpening middleware... technically it's been added to a variety of games, but at present, to my knowledge, ONLY Death Stranding and Horizon Zero Dawn actually implement it; and even then it's a pre-release version that lacks most of the features and quality controls the released version has.

Is it as "Good" as DLSS? Eh.. that's subjective.
I'd argue that Native is better because it's simply a Clearer Image with more Detail; and this is especially true for "In-Motion"... but then I do see the appeal of DLSS being more Aesthetically Pleasing without the same Gaussian Blur you typically see from Upscaling. 

Well, that, and because DLSS is essentially built upon the TXAA concept, it makes for an excellent anti-aliasing approach, far better than any other. From NVIDIA's perspective it's worth the hardware: because it's running the game at a lower resolution, it gives the appearance of better performance; but does it really compare to native? No.
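For anyone curious what "built upon the TXAA concept" means in practice, here's a minimal, purely illustrative numpy sketch of temporal accumulation - blending each new jittered frame into a running history. Real TAA/DLSS additionally reprojects the history with motion vectors and clamps it, which is exactly where the ghosting and flicker complained about earlier come from:

```python
# Minimal sketch of temporal accumulation, the TAA-style idea described
# above: blend each new jittered frame into a running history. Purely
# illustrative -- real TAA/DLSS also reprojects the history with motion
# vectors and clamps it to limit ghosting on movement.
import numpy as np

def accumulate(history: np.ndarray, current: np.ndarray,
               alpha: float = 0.1) -> np.ndarray:
    # Exponential moving average: a small alpha keeps most of the history,
    # smoothing aliasing over time at the risk of ghosting on movement.
    return alpha * current + (1.0 - alpha) * history

# Toy usage: repeatedly sampling a static scene with jitter noise
# converges toward the clean image, just as TAA resolves edges over time.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 0.5)                    # the "correct" render
history = scene + rng.normal(0, 0.1, scene.shape)
for _ in range(60):
    frame = scene + rng.normal(0, 0.1, scene.shape)  # fresh jittered sample
    history = accumulate(history, frame)
print(np.abs(history - scene).mean())           # small residual error
```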



Now, in regards to ray acceleration... well, this is where things get more interesting.
As it stands, every ray-tracing-enabled title (even those that use DXR) is designed and developed around the NVIDIA solution.
AMD has done well to support and emulate that in a few games so far... but the reality is that the AMD and NVIDIA approaches to ray acceleration are VERY different.

AMD's approach isn't a "complete" solution... it isn't designed or even intended to be a replacement architecture - and believe me, what NVIDIA plans to do is slowly phase out CUDA in favour of RT Cores and have pure ray tracing hardware.
AMD's solution differs in that, as I said, it isn't a complete solution; it isn't designed to work as a replacement, but rather is literally designed to accelerate the processing of ray mathematics as part of a "hybrid" pipeline.

The other thing to keep in mind is that AMD's BVH traversal is hardware, whereas NVIDIA's uses a software-hardware approach... specifically, it uses Tensor and CUDA cores to accelerate the task, but ultimately it's still a compute shader.

And this is where AMD hardware can see exceptional performance... as BVH is an approach that reduces the number of rays that need to be cast for a given area (more specifically, it cuts the wasted rays down to only what is needed).
When games begin to be developed to specifically take advantage of this, the lower overall ray performance (remember, it runs inline with compute shaders rather than parallel to them) can still mean a much more efficient and faster approach to ray solutions - see the sketch below.
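To make the BVH point concrete, here's a minimal, purely illustrative Python sketch - a 1-D interval tree standing in for 3-D bounding boxes, with no relation to how RDNA 2 or RT Cores implement traversal in hardware. The point is that a single failed box test culls an entire subtree, which is how the hierarchy keeps the number of intersection tests roughly logarithmic:

```python
# Purely illustrative sketch of why BVH traversal saves work: one failed
# bounding-box test culls an entire subtree, so a ray tests only a
# handful of boxes instead of every primitive. A 1-D interval tree
# stands in for 3-D AABBs.
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class Node:
    lo: float                        # interval bounds (the "bounding box")
    hi: float
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    prim: Optional[str] = None       # leaf payload: a primitive name

def traverse(node: Optional[Node], x: float, hits: List[str]) -> int:
    """Collect leaf primitives containing x; return the number of box tests."""
    if node is None:
        return 0
    if not (node.lo <= x <= node.hi):
        return 1                     # one test culls the whole subtree
    if node.prim is not None:
        hits.append(node.prim)
        return 1
    return 1 + traverse(node.left, x, hits) + traverse(node.right, x, hits)

# Toy tree over [0, 10] with four primitives at the leaves.
root = Node(0, 10,
            Node(0, 5, Node(0, 2, prim="A"), Node(3, 5, prim="B")),
            Node(6, 10, Node(6, 8, prim="C"), Node(9, 10, prim="D")))
hits: List[str] = []
print(traverse(root, 7.0, hits), hits)  # prints: 5 ['C'] -- the left half
# of the tree was skipped with a single test; with large trees the
# culling keeps the test count roughly logarithmic in primitive count.
```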

This doesn't just mean visual things like shadows, reflections and global illumination... but also stuff such as audio physics, material physics or artificial intelligence, which can simply be augmented with rays rather than replaced outright by a ray solution.

Now, are these worth getting today? No.

I mean, you asked... and no, I don't believe for a second that either the RTX 30-Series or the RX 6000-Series is worth its MSRP, let alone the inflated prices stemming from low supply and exceptionally high demand.

Most of the features showcased by both are either very lightly supported, semi-supported, or just not going to be implemented widely for another 12-18 months (at least)... at which point we'll have more powerful replacements (hopefully with more supply and lower prices).

But then there's another aspect to this.

How this recent success has been changing AMD...
If you have the money to just burn (and are fine with waiting for whenever stock ACTUALLY returns), well, whatever... it's your money, do as you please with it.

Still, that's not going to make you feel better about being ripped off by either AMD or NVIDIA.
Heck, I'm not even going to say that the RX 6800 XT will actually get better down the road, because I guarantee that regardless of the technology in there... AMD has NEVER had any luck actually convincing developers to support its technology.

As it stands, the card is overpriced but competitive.
Whether that remains true in the long term, and whether the ray tracing performance will see a real benefit... who knows.

NVIDIA currently has DLSS, which provides clear performance benefits in existing titles; whereas AMD has SAM, which can see a similar benefit. Both sides have promised they are working on counters to the other... so in 12 months, no doubt, we'll see similar features on both.

NVIDIA's RT Cores do presently provide better performance in RTX-enabled games, but again, whether that remains true given that the Xbox Series and PlayStation 5 both use AMD ray acceleration... well, who knows. It certainly took almost 3 years to see any real benefit from the last-gen consoles being GCN-based, as PC ports are given to companies that typically go with NVIDIA GameWorks; and, well, there is an obvious performance advantage to one side there.

I'd say that unless AMD takes a big slice of humble pie and drops their prices to something more reasonable, then for performance junkies with no real credit limits... NVIDIA is likely to remain the better option.

But then that's also going to come down to what games you prefer to play... there is no "universal" best card.
And the performance gap between NVIDIA / AMD titles has never been bigger.

Lots to read there! But this is simply not correct in the context of what the cards can do: "And the performance gap between NVIDIA / AMD titles has never been bigger."

Just have a look at all the reviews - the countless reviews - comparing the performance of the latest from Nvidia and AMD in the same popular games. They are truly 'neck and neck' for the most part. Again, RT is not a realistic option for most gamers today, as the fps is way too low unless you're playing at 1080p or lower resolutions. And we do not know how the AMD cards will ultimately perform, as the drivers are immature.

If it seems like I'm defending AMD a bit against claims in this thread then perhaps yes because the truth is the Nvidia 3xxx series have had 3+ months head start for maturing, and the only areas where they current differentiate by a noticeable margin are in areas that, frankly, are not mainstream. I actually PREFER Nvidia h/w and have for some time. Same like I used to always PREFER Intel over AMD CPUs. But the facts are pretty clear at the moment that the two brands have very compelling and similar video card options. One certainly does not stand out strongly against another.



@Kubicide wrote:

Lots to read there! But this is simply not correct in the context of what the cards can do: "And the performance gap between NVIDIA / AMD titles has never been bigger."

Just have a look at all the reviews - the countless reviews - comparing the performance of the latest from Nvidia and AMD in the same popular games. They are truly 'neck and neck' for the most part. Again, RT is not a realistic option for most gamers today, as the fps is way too low unless you're playing at 1080p or lower resolutions. And we do not know how the AMD cards will ultimately perform, as the drivers are immature.


Maybe you need to pay closer attention to the review metrics; in fact, Hardware Unboxed is an excellent one to review, as they actually show the per-game delta for all the games they review... and they have done so for a while.

There are insane wins and losses... which, sure, average out to both cards looking like they trade blows, but they can vary in performance by an entire tier of card at times. Historically speaking, a card is typically either clearly faster or slower, and per-game variation moves it slightly closer or further while staying in a "competitive" range, with maybe 1-2 outliers in either direction; this generation, however, it REALLY depends on which game you want to play.

In some games that could end up being excellent value for the performance; in others it's shockingly bad.



Interesting. There must be a couple of Hardware Unboxed sites then, because the one I know from YouTube shows, for example, the 6800 outperforming the 3070 in 35 of 41 games, most by significant margins of 10-30%. The few games where it performs worse are either on Crytek-based engines - a known issue, likely a driver thing - or one admittedly old game. But it certainly doesn't support the claim that the performance gap has "...never been bigger". Quite the opposite: the two brands have never been closer.

Anyway, it's not worth continuing; it's better for the OP to answer the value question based on what they play and what they read in honest reviews, not on singular opinions in a forum.
