Current PC: PowerColor Red Devil 6800 XT, Ryzen 5 3600, 16 GB 3466 MHz, ASUS TUF B450-PLUS GAMING, 750 W power supply.
I received a PowerColor Red Devil 6800 XT two weeks ago, and something is going wrong. When I'm playing COD Warzone, I sometimes have FPS drops. Monitoring with MSI Afterburner, my GPU tops out at 65°C, so the problem doesn't come from temperatures. I saw that the FPS drops come from the GPU usage and the GPU clock, which also drop. For example, when the clock is about 2300 MHz, the GPU usage is above 95% and I can get 160 FPS with normal settings in COD. Then all of a sudden the clock can drop to any number, GPU usage drops to around 50%, and my FPS drops to 100-110. A friend of mine bought the exact same card, but with a Ryzen 5 5600X and 16 GB 3600 MHz, and he has the exact same problem as me. We tried a lot of things: DDU, uninstalling and reinstalling drivers, reinstalling games.
I ran the Unigine Heaven benchmark, and the same thing happens (clock drops and FPS drops).
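MSI Afterburner can log its monitoring data to a file, and spotting the moments when the clock and usage dip together is easier with a small script than by eyeballing graphs. Here is a minimal sketch, assuming the log has been exported as CSV with `clock_mhz`, `gpu_usage_pct`, and `fps` columns (the column names and sample data are made up for illustration; the real Afterburner log format differs):

```python
import csv
import io

# Synthetic monitoring samples standing in for an exported Afterburner log.
SAMPLE_LOG = """clock_mhz,gpu_usage_pct,fps
2300,97,160
2290,96,158
1750,52,105
2280,95,159
1600,48,98
"""

def find_drops(csv_text, clock_floor=2000, usage_floor=90):
    """Return rows where the core clock and GPU usage dip together,
    which matches the pattern described above."""
    drops = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        clock = float(row["clock_mhz"])
        usage = float(row["gpu_usage_pct"])
        if clock < clock_floor and usage < usage_floor:
            drops.append((clock, usage, float(row["fps"])))
    return drops

if __name__ == "__main__":
    for clock, usage, fps in find_drops(SAMPLE_LOG):
        print(f"drop: {clock:.0f} MHz, {usage:.0f}% usage, {fps:.0f} fps")
```

With the sample data above, the two rows where the clock falls under 2000 MHz alongside sub-90% usage are flagged, which is exactly the pattern worth correlating with in-game stutters.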
This is a problem because performance in games is not consistent, and for me consistency is the key to enjoying a game.
Warzone is the problem, not the card. Everyone is complaining about FPS drops and poor performance in general with that and other recent COD titles. The devs need to work on that issue. Even the Nvidia 3000 series cards are not so hot with that game.
Yeah, but I ran the Heaven benchmark to verify whether it happened only in COD Warzone, and the same thing happens there too. So I know Warzone has some problems, but I don't think that's the case here...
Random suggestion, but are you running separate power cables to the card? If you're using a single 8-pin cable with the 8+8-pin splitter that comes with most power supplies, this can cause issues. It's best to use two separate 8-pin power cables from the PSU. Just a thought off the top of my head...
Have you tried turning Radeon Chill on/off? I believe you can do it in game too, with the default hotkey being F11. Try that out and see.
Well, Unigine Heaven is more of a stability tester than an FPS indicator, for one. Secondly, you're using a tool written in C++, the same language as Warzone, and it's very CPU intensive. You're running a 3600, not the hottest CPU, with a B450 board that's also not optimal for AAA gaming. Running a serious game or test means all the hardware needs to be factored in as a whole when looking at results. It would be like putting that card into an AM3+ FX-8150 system and expecting 200 FPS in any game. Not happening, because the card exceeds the power/speed of the CPU, bus, and RAM. You're also missing the benefit of PCIe 4.0, which that card takes advantage of.
Try running 3DMark Time Spy, Superposition, and other tests. Try other games. Are you running the test with background apps or unnecessary Windows services on? Is "enhanced sync" on? Have you at least tried moving the minimum GPU clock to within 100 MHz of the max GPU clock and enabling "fast" VRAM timing? What monitor specs are we talking about? DP or HDMI connection, refresh rate, and GtG response all play a part in the smoothness or perceived FPS drops. Remember, these tests weigh heavily on the CPU as well as the GPU, especially single-threaded workloads, which AMD 3000-series CPUs fall behind in.
I've read that the PowerColor cards, especially the Red Devils, aren't living up to expectations either, so in the end it could be the brand. Very hard to say given the hardware attached. The hardware of the build is just "OK", not top of the line like an X570 and other matching components, where a performance issue could really be pinpointed to the card or something else.
I've also read posts dating back over a year with Nvidia cards having FPS drops in that test, so definitely try others. Bottom line: if you drop in a high-performance GPU expecting miracles using parts that don't totally match, you're in for disappointment. There's a science to building a gaming PC, and just slapping high-priced parts in doesn't mean the rig will fly. I can virtually guarantee there's nothing wrong with the card itself in this build. My feeling is that many of these complaints have the same thing in common: using lower-end CPUs, boards, and even RAM, and expecting 300 FPS because the card costs about $1k.
My build is an X570 board with a 3600X (PBO enabled, 10X scalar, motherboard voltage), 16 GB DDR4-3733 custom tuned to 3600 with 1800 IF, a Gigabyte WindForce RX 6800 Gaming OC, a Corsair RM750x, and a Corsair H110i. So far no issues with FPS, and test scores are very nice. I say this to demonstrate matching parts to achieve a desired result, not to brag.
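One quick way to read the monitoring numbers in a case like this: if FPS drops while GPU usage is well below ~90%, the GPU is being starved by something else (CPU, engine, RAM); if usage stays pinned near 100% during the drop, the card itself is the limit. A rough sketch of that rule of thumb (the thresholds are arbitrary judgment calls, not from any vendor documentation):

```python
def classify_sample(gpu_usage_pct, fps, target_fps=144):
    """Rough rule of thumb for a single monitoring sample:
    low FPS combined with low GPU usage points away from the GPU itself."""
    if fps >= target_fps:
        return "ok"
    if gpu_usage_pct < 90:
        return "likely CPU/engine-bound"   # GPU is starved of work
    return "likely GPU-bound"              # GPU is the limit

# The situation described in this thread: an FPS drop with usage at ~50%.
print(classify_sample(50, 105))
```

Run against the numbers reported here (50% usage at 100-110 FPS), this points at something upstream of the card, which is consistent with my take above.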
No problem, I understand, and thank you for helping me. Indeed, the Heaven benchmark is a good indicator for stability; that's why I used it, since the problem is a stability problem. I sometimes have 150 FPS and, a second later, 110.
I ran 3DMark Time Spy so you can see my results (but I can't upload them, lol). My total score is 14,698, my graphics score is 18,020, and my CPU score is 7,189.
The problem could indeed be a CPU bottleneck, but my friend who bought the same card as me has the same problem, and he has a Ryzen 5 5600X and 32 GB 3600 MHz.
And today I also played Star Wars Jedi: Fallen Order, and the same thing happens. Curiously, during cinematics my 6800 XT sits at 95% to 100% usage, and then when it's my turn to play, it drops to 70%-80%.
I have a few applications running in the background, but nothing insane (just Discord/Steam/Epic Games).
And I forgot to say it, but it's really important: I play at 1440p, so the CPU should be less of a limit at this resolution.
I did a lot of research to figure out which GPU I should buy for 1440p, and a lot of reviews showed that my R5 3600 could definitely handle a 6800 XT.
And I also set a minimum core clock, but the GPU usage doesn't change and the clock still drops.