
Radeon VII Lands: SLOWER than the more efficient RTX 2080

No surprises here. It's hot, it's loud, it's not as fast as the RTX 2080, and, most importantly, it's not any cheaper than the RTX 2080.

What a letdown... Then again, it's another reason you should never fall for the hype on any electronic item...

https://www.tomshardware.com/reviews/amd-radeon-vii-vega-20-7nm,5977.html

13 Replies

There is ONE exception, and it might be a listing error, but the XFX version is listed at $600 instead of $700. It will still cost you more in the long run, but it is 14% less expensive than the RTX 2080 for only 7% less performance, still averaging over 60 FPS at 4K.
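For what it's worth, the arithmetic behind that claim holds up. A quick sanity check in Python, using the $600/$700 prices from this thread (not official MSRPs) and taking the 7% performance gap at face value:

```python
# Price/perf sanity check; prices and the 7% gap are taken from this thread.
radeon_price, rtx_price = 600, 700
price_ratio = radeon_price / rtx_price                  # ~0.857
print(f"price: {1 - price_ratio:.0%} cheaper")          # ~14% cheaper
perf_ratio = 0.93                                       # "7% less performance"
print(f"perf/dollar: {perf_ratio / price_ratio:.2f}x")  # ~1.08x the RTX 2080
```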

ajlueke
Grandmaster

Wow, they did pair it with a 1/4 FP64 rate.  That is only half of the 1/2 rate found in the Instinct version, but four times that of any consumer GPU since Hawaii in 2013.  It also supports the INT4 and INT8 instruction sets, like the RTX Titan.  That virtually guarantees it will be the prosumer card of choice.  16 GB falls short of the RTX Titan, but you get 2/3 of the gaming performance and memory, plus better FP64 throughput, for a quarter of the price.

I actually can't believe they left the FP64 rate that high.  No gamer will be able to find this card even if they want to buy one.


Especially since AMD’s Director of Product Marketing, Sasa Marinkovic, specifically stated it would run at 1:16.

https://techgage.com/news/radeon-vii-caps-fp64-performance/

Looks like that was Techgage's assumption.  I guess I can see why they made it, since the quote was "Radeon VII does not have double precision enabled".  Curious as to why AMD changed their minds and lifted the rate to 1:4.  It does help justify the card's price: it is slightly slower than the RTX 2080, with worse power efficiency and no support for deep-learning anti-aliasing methods or ray tracing, but it has twice the VRAM and four times the FP64 throughput.
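To put those ratios in concrete terms, here's a back-of-the-envelope calculation. It assumes the commonly quoted Radeon VII specs (3840 stream processors, ~1.8 GHz peak boost) and counts an FMA as two FLOPs per cycle; treat the results as rough theoretical peaks, not measured numbers:

```python
# Rough theoretical throughput at the FP64 rates discussed above.
# Assumed specs: 3840 stream processors, ~1.8 GHz peak boost, FMA = 2 FLOPs/cycle.
shaders, clock_ghz = 3840, 1.8
fp32_tflops = shaders * clock_ghz * 2 / 1000      # ~13.8 TFLOPS FP32

for label, rate in [("Instinct (1:2)", 1 / 2),
                    ("Radeon VII (1:4)", 1 / 4),
                    ("1:16, as first reported", 1 / 16)]:
    print(f"{label:24} ~{fp32_tflops * rate:.2f} TFLOPS FP64")
```

At 1:16 the card would have offered under 1 TFLOPS of FP64; the jump to 1:4 (~3.5 TFLOPS) is what makes the prosumer case.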


My guess is they knew how it would perform compared to the RTX 2080, so it was a last-minute decision made after Sasa Marinkovic's answer, and after seeing how well Nvidia's VESA Adaptive-Sync (FreeSync) support works. Without the cryptocurrency market or FP64, given its higher power draw and lower performance in games, Vega II makes absolutely zero sense; with FP64, it does fill a gap in the market.

But ExtremeTech is right: now is not the time to buy any graphics card unless you absolutely have to. Nvidia is, as is typical, gouging; AMD is flailing until Navi is released (and even then nobody knows its performance); and prices are still elevated due to the (artificially created) RAM shortage. You could also say they're inflated because the secondary market doesn't really exist again yet, with the number of modified miner cards (rightfully) scaring people away from buying used cards.

But the biggest reason of all might be that the most popular computer games right now, Fortnite and World of Warcraft, don't require much graphics grunt (relatively speaking), and newer games like Far Cry 5 and Apex Legends follow the same trend, with an R9 280X and an R9 290, respectively, listed as the recommended cards for maximum details. So there is no reason to plop down money on a new card.

Nvidia adding adaptive sync support is a good point.  This was the first release where AMD didn't have that plum to fall back on, so, as you say, they had to add something to help the GPU stand out from the RTX competition.

The point about the current gaming environment is also well made.  There has been an explosion of battle-royale-style games as developers try to cash in on the e-sports craze, and those fast-twitch games tend not to be visually demanding.  Meanwhile, games like Far Cry 5 are actually more likely to become CPU-bound than GPU-bound; in some of these titles you can see a bigger improvement in FPS from a new CPU than from a GPU upgrade.


AMD, Nvidia Have Launched the Least-Appealing GPU Upgrades in History - ExtremeTech


I think the problem is user expectations.  Users expect subsequent generations to deliver the traditional performance gains over the previous generation and are disappointed when they don't get them.  But at the same time, the RTX series isn't giving you "less" for your money.  On average, the RTX 2070 and RTX 2080 perform faster than the GTX 1080 and GTX 1080 Ti, respectively, while costing the same.  The RTX series also adds things like RT cores for ray tracing and tensor cores for deep learning, and it is the first Nvidia architecture to support rapid packed math.  By adding these features, you give developers a reason to build them into their engines and improve graphics performance in the long run.

Isn't that what we as enthusiasts want?  To see hardware makers and developers push the technology further?  If we don't support vendors when they try things like this, aren't we doomed to stagnation?  
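On the rapid packed math point above: the idea is that one 32-bit register lane holds two 16-bit values, so FP16 math can run at up to twice the FP32 rate. A toy numpy sketch of the storage half of that idea (the execution half happens in hardware, which this obviously doesn't model):

```python
import numpy as np

# Two FP16 values occupy the space of one FP32 value: same footprint,
# twice the elements. This packing is what "rapid packed math" exploits.
a32 = np.arange(1024, dtype=np.float32)
a16 = np.arange(2048, dtype=np.float16)
assert a16.nbytes == a32.nbytes == 4096

# The GPU issues one instruction per packed pair, doubling FP16 throughput.
```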

semitope
Adept II

A mix of results depending on the game. A lot of games are of the not-so-good-performing variety, though. Some benches have it destroying the 2080 in Hitman 2, Battlefield, and similar games.


Um, no it doesn't; a few FPS is within the margin of error.

[chart: Battlefield V - FPS - 2560x1440, DX12 Ultra]
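To illustrate why a few FPS reads as noise (hypothetical numbers for illustration, not the values from the chart above):

```python
# Hypothetical example only -- not the chart's actual results.
fps_a, fps_b = 104.0, 100.0        # two cards a "few FPS" apart
gap = (fps_a - fps_b) / fps_b      # 4% relative difference
print(f"gap: {gap:.0%}")           # comparable to typical run-to-run variance
```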


It bests all NVidia GPUs if you turn CMAA on in Dirt 4.

[attached screenshot: Snap1.jpg]
