
Performance Drop of 40-50 FPS Navi 5700

Question asked by jeffa88 on Nov 29, 2019
Latest reply on Dec 4, 2019 by spardasieg

First, let me state I had no issues with the R 200/300/400/500 series cards.  I used them all the time for budget builds for friends looking to get into PC gaming for the first time.

 

For years, back when it was still ATI, I used the HD series - again, no issues.

 

When nVidia came out with the 900 series GTX, I made the switch for my own personal computer.  It was more on a whim/being tired of using the same brand (I am one of those weird consumers).  Never had any issues with the GTX 960.  Two years ago, I bought a used GTX 1060 3GB for $100 - it still worked great right up to the day I pulled it 3 days ago.  It just couldn't pull high fps in single-player games at 1080p high/ultra, or in some BRs even with low settings (usually around 90 fps, but it's a 144 Hz monitor).

 

So, I bit on an RX 5700 (not XT) on sale for $279 from VisionTek (made by AMD).  It looked good, reviews made it sound good, and some friends I have online said it was worth the price.

 

This card has been almost a disaster.

 

I cannot run 144 Hz.   How does a card ship that cannot run 144 Hz?  Nothing but crazy screen flicker if I try to use 144 Hz.  I can use 120 Hz just fine, though.  And it doesn't matter that there's no difference to the eye - a GTX 950, 960, and 1060 all ran this same 144 Hz monitor without issue, and with 2 other monitors plugged in as well.

 

Next problem - I can't even run MSI Afterburner to monitor the card because of screen flicker... What?  A monitoring program (an extremely popular one at that) should not be causing a GPU to flicker.  Whatever, swap to WattMan + GPU-Z, I guess, to monitor the card.

 

Then I noticed a problem when I started to play games: the wattage.  What in the ever loving F were you thinking, AMD?  A powerhouse mid/high-range card whose power draw fluctuates at will during gameplay?  Why?  I've never had that issue on ANY nVidia card.  A buddy wanted me to play some PUBG with him.  I had the settings on low, but with ultra draw distance and such, to maintain a fluid 120 fps (since I have to play at 120 Hz).

 

Everything seemed fine, then I started getting severe 40-50 fps drops for no reason.  I began to watch WattMan during this.  The card was pulling 60-70 W - on a 180 W card - and it dipped to 30 W during these FPS drops.  What?  First, why is a 180 W card only pulling 70 W during gaming?  Sitting at the desktop doing basic Windows stuff, I could understand that.

 

So, I crank PUBG to Ultra.  I get 100-120 fps depending on the scene, and I notice the card now pulling 120 W, with it dropping down to 70 W, which also drops the fps to 50.  Why?  Why does it use less power for seemingly no reason?  Keep in mind, this entire time my temps are under 60 °C because of my cooling settings - it's not throttling due to temps.
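
If anyone wants to see the same thing on their own card, GPU-Z can log its sensors to a text file while you game, and a few lines of Python can flag every sample where the board power dips.  This is just a rough sketch - the file name and column headers below are assumptions, so check them against whatever your GPU-Z version actually writes out:

```python
# Rough sketch: scan a GPU-Z "Log to file" sensor log for power dips.
# Assumptions: the log is comma-separated with a header row, and has
# columns roughly named "Board Power Draw [W]" and "GPU Clock [MHz]".
# Check your own log's header and adjust the names if they differ.
import csv

LOG_FILE = "GPU-Z Sensor Log.txt"   # default file name is an assumption
POWER_COL = "Board Power Draw [W]"  # adjust to match your log's header
CLOCK_COL = "GPU Clock [MHz]"       # adjust to match your log's header
DIP_THRESHOLD_W = 80.0              # anything under this mid-game is suspect

with open(LOG_FILE, newline="", encoding="utf-8", errors="ignore") as f:
    reader = csv.DictReader(f, skipinitialspace=True)
    for row in reader:
        try:
            power = float(row[POWER_COL])
            clock = float(row[CLOCK_COL])
        except (KeyError, TypeError, ValueError):
            continue  # skip malformed rows or missing columns
        if power < DIP_THRESHOLD_W:
            print(f"{row.get('Date', '?'):>20}  power={power:6.1f} W  clock={clock:7.1f} MHz")
```

The point is just to see whether the power dips line up in time with the fps drops rather than with temperature.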

 

Other things of note, compared directly to the GTX 1060, a far inferior card according to benchmarks:

- On the GTX 1060, I can mine crypto and game at the same time without seeing any FPS impact (done by accident a few times); I can't on the RX 5700.

- Some hardware information programs keep saying my card has 18 compute units, but AMD clearly states it is supposed to have twice that - what?
