
Graphics Cards


Problems with my GPU performance


I could use some advice regarding my 6700 XT (Sapphire Nitro+).

I seem to be having issues with some games where I'm not getting as high an FPS as I'd expect. In newer/more demanding games it works fine, but in older/less demanding titles the GPU just refuses to put in the effort. For example, I recently tried playing Yakuza 0 and expected it to run at 144 FPS no problem, and it does, except when I get to more demanding areas the FPS drops to 120-130 while the frequency doesn't really go up and the GPU usage sits at 50-60%. The power usage also stays low (50-80 W). I had almost the same framerate with my RX 480 before. If I increase the load by turning on SSAA in the game settings then it starts putting in the effort, but then it can't maintain 144 FPS because of the higher demand.

A similar thing happened earlier when I played Vampire: The Masquerade - Bloodlines. I modded it with visual mods like RTGI and my GPU ran it fine in low-effort mode until I got to areas where boosting up would be required, and it just refuses to do so and simply lets the FPS dip, even though I know it should easily keep it high.

I tried reinstalling the drivers with DDU and even reinstalling Windows, but nothing helped.

My specs:
Windows 10
RX 6700 XT (Sapphire Nitro+)
1440p monitor, 144 Hz (main); 1080p, 60 Hz second monitor
Ryzen 7 3700X
16 GB RAM (3200)
Motherboard: ASRock B550 Phantom Gaming 4 AC (latest BIOS)
Radeon software version 21.10.2

13 Replies

I haven't played those games, but it looks like they may be leaning on the CPU rather than the GPU.

Yakuza 0, for example, is apparently badly coded and makes the CPU run at a very high rate, up to 100%.

I had a quick search online and some people say they found a fix for it.

Have a search for 'Yakuza 0 high CPU usage', you may find something.

AMD Ryzen 7 3700x, Asus Tuf Gaming RX 6700 XT, Asus TUF Gaming x570 Plus, 32gb G.Skill TZ neo 3600mhz, Samsung 980pro 1tb NVME, Samsung 970 EVO Plus 1tb NVME, Lian Li Galahad AIO 240mm, Antec Titanium 1kw.

Thanks for your reply!

I am aware of the Yakuza 0 CPU issues and I had them until I upgraded to the 3700X; right now my CPU usage is at around 20% in game. Before upgrading my GPU I was using an RX 480 (8 GB) and getting 100-120 FPS while the GPU was working its butt off; now with a 6700 XT it just refuses to put in the effort and maintain 144 FPS, and simply lets it drop down to like 120-130 in a couple of specific parts of the city that are more intensive.

I tested it out some more today and the exact same thing is happening in Yakuza Kiwami and Dishonored. In Dishonored the game runs at a stable 130 FPS while the GPU is clocking at around 700 MHz, but when I get to a much more intensive area the FPS drops to around 110 and the GPU isn't even trying to keep the framerate up.

Also, in Song of Horror the game runs at a stable 60 FPS with the GPU barely doing anything, until I get to Episode 4, which is very poorly optimized, and the framerate drops to around 40. The GPU should definitely be able to keep it running at 60, but it just refuses to.

It's annoying that I can't figure out whether it's an issue with the drivers in general, whether the 6000 series just doesn't like older games, or whether it's something specific to me, since all the benchmark and performance videos test newer games, and those run as expected for me. Perhaps it's a thing for games using older DX versions or something, but the common factor seems to be low power usage (around 50 W) and low frequency. Games like Resident Evil Village, Death Stranding, Doom (2016) and Control use proper amounts of power and the clock speeds are correct too.
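To make the pattern concrete, here's a rough rule of thumb as a small Python sketch. The thresholds and the board-power figure are just illustrative guesses of mine, not values from AMD or any monitoring tool:

```python
# Rough rule of thumb for reading sensor snapshots like the ones above.
# All thresholds are illustrative guesses, not official values.

def classify(fps, target_fps, gpu_util_pct, power_w, board_power_w=230):
    """Classify a frame-rate dip from a single snapshot of readings."""
    if fps >= target_fps:
        return "on target"
    if gpu_util_pct >= 95:
        return "GPU-bound (card is genuinely maxed out)"
    if gpu_util_pct <= 70 and power_w < 0.5 * board_power_w:
        # FPS below target, yet the card sits at low usage and low power:
        # the "refuses to boost" behaviour described in this thread.
        return "GPU refusing to boost (low clocks/power despite dropped FPS)"
    return "likely CPU- or engine-limited"

# The Yakuza 0 numbers from the post: 120-130 FPS vs a 144 FPS target,
# 50-60% usage, 50-80 W.
print(classify(fps=125, target_fps=144, gpu_util_pct=55, power_w=65))
```

The point of the middle branch is that a genuinely overloaded card shows high usage and power when FPS drops; low usage, low power and dropped FPS together point at the downclocking behaviour rather than a weak GPU.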



I've had a couple of GPUs in use over the last few months. All the AMD cards, a 6700 and a 6800, had the same issue in many games, especially at 1080p.

Low usage down to 50 W and low clock speed down to 500 MHz. The only thing that helped was to raise the resolution or scale the resolution up in game. Warzone and Battlefield V are CPU-demanding, but that doesn't justify the RX 6000 series going into a sleep-like mode...

I have not found a fix for this problem yet, and setting a minimum clock rate in the driver menu is not an option.

I did some more digging and it seems like the 6000 series might have a problem with games using DirectX versions older than 12. Did you by any chance test the GPUs in Battlefield V in both DX11 and DX12 modes, eLni? I wonder if the DX12 mode might not have this issue. Unfortunately I don't play that game, and I don't know which of my games have both DX11 and DX12 versions, so I don't know how to test it myself.

I'd really love to get to the bottom of this. I have of course reported the bug to AMD, but I doubt anything will come of it.

UPDATE: Rise of the Tomb Raider has both modes and I ran benchmarks, but they didn't really tell me anything. I did get a couple of extra frames in DX12 mode, but even in DX11 mode the GPU behaved as expected and worked well...




There are a number of things to consider even when games use the same backend engine. There are a few settings you can adjust for, like tessellation, "hair works", draw distances, anti-aliasing and post-processing, not to mention rendered resolution, display size and colour depth. And at what point do we throw in effects like wind in the grass, bushes and trees? Are there static or dynamic events that will occur, such as possible encounters with AI/NPCs, and how are they set to react to situations? If you're in a first-person shooter, did you consider bullet drop, the effects of wind, loss of velocity and how these affect the damage when a round strikes?

Many people can only attempt to comprehend what they can see. For example, go back to Battlefield 3, where a laptop user in a helicopter can only perceive what they see on their small display, while someone with a larger display could see across the field, perceive the aspect changes of a helicopter and adjust their fire by leading to where it might be. The target might then think that the one firing on them is using an aimbot, when in that example they are simply limited by display size and draw capability at their resolution. Another example is turning off shadows to be able to spot a person under a tree, or disabling terrain effects so the grass isn't visible, thus spotting a person lying prone. Each of these settings affects the overall visual experience, but it also affects how much of a load is on the GPU. And while all of this is going on, there's everything else happening in the game that isn't perceived.

Over time hardware has evolved, and so have game engines. One of the things that has changed, which even I have to consider, is that these processors have been programmed to operate within certain parameters, i.e. firmware and drivers. If a person has, over time, used the Heaven 4.0 benchmark to record their out-of-the-box scores and temperatures, they will notice how these differ with each generation of GPU: a newer card will more than likely use less of its own resources and run at lower temperatures than the previous-generation GPU. This applies to CPUs as well. Let us throw ArmA 3 under the bus by going back about nine years to see how many complained about how "badly optimized" it was, then return to the present day and look at it again with today's hardware. So I close with this thought, and while I do not claim to work for a game studio or to write code, I consider this...

"these processors have been programmed to operate with in certain parameters. i.e. firmware, drivers"

X570 Taichi, Ryzen 3600XT, 32G of F4-3400C16D-16GSXW, XFX RX5700, NH-D15, Fractal Design R5, SilentWings 3 case fans, Intel 660P M.2 1T, WDS256G1X0C M.2 256G

I have a reference RX 6900 XT and I have similar performance issues. I bought it last May and at first it worked very well without hiccups. I use it for gaming and mining.

A month and a half ago something happened and it started to have similar issues. The power draw is between 80 and 120 W, and both games and mining suffer from it. The power limit setting is wonky: it may help or it may make things worse. Benchmarking is fine and I get good results, like 23,600 graphics points in Time Spy with MPT and water cooling.

Measures I tried to solve this:

- Reinstalling Windows

- A new PSU

- Disabling all DS settings in MPT

- Reinstalling the drivers multiple times with DDU (sometimes works for a day or two)

- Registry editing, trying to turn off all sleep-state settings

- Trying to reinstall DirectX 12 and checking that it's on

- Making sure Windows Game Mode is off and the computer is in Performance mode

- Countless hours just testing and searching for information on the net

Now I have been trying older drivers, and today I'm back on 21.2.2. This feels good, but let's see if it really helps. Next I have been planning to try a new motherboard. This problem is very tilting... Please AMD, look into this problem so that maybe one day a new driver release will fix it. It is sad that those new drivers are useless because of this issue: nice new features, and I can't use them because of this power draw/core clock problem.


Ryzen 7 3700X

Asus Strix B550-I ITX

Micron Ballistix Elite 3800 MHz

Corsair SF750

WD SN750 1 TB and Samsung 980 1 TB

Custom water loop


I'd love to see an update on whether the older driver version helped. This behaviour seems almost intentional, and I'm wondering if it's a driver thing meant to gimp mining, or something like a power-saving mode meant to avoid straining the GPU by not letting it run at like 500 FPS in old games.


Well, it feels like these 21.2.2 drivers are the best performing. There is still weird power draw behaviour, which I notice when mining. I try to keep the power draw at 140 W, which gives the best hash rate. It can still drop by 10 W, but there are no more sudden drops to 80 W where the only fix is to reinstall the drivers. And now if it drops by 10 W, I can adjust the power limiter, like plus 10%, and it's back to normal. By adjusting the power limit slider - or + I can find the power draw I need again. But when this weird power draw behaviour is present, the power limit slider doesn't work logically; I just have to test whether - or + percentages correct the power draw.
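The sudden drops described above are easy to miss by eye. Here's a minimal sketch of a logger that flags them; `read_power_watts` is a stand-in for whatever sensor source you actually have (HWiNFO logs, MPT readouts, etc.), and the 40 W threshold is an arbitrary example, not a recommended value:

```python
import time

# Minimal logger to catch sudden power-draw drops between readings.
# read_power_watts is a placeholder callable; here it just replays samples.

def watch_power(read_power_watts, samples, drop_w=40, interval_s=0.0):
    """Return indices of readings where power fell by more than drop_w."""
    drops = []
    prev = None
    for i in range(samples):
        w = read_power_watts()
        if prev is not None and prev - w > drop_w:
            drops.append(i)  # e.g. the ~140 W -> 80 W dips while mining
        prev = w
        if interval_s:
            time.sleep(interval_s)
    return drops

# Replay a fake trace: steady ~140 W, then the sudden drop to 80 W.
trace = iter([140, 141, 139, 80, 82, 81])
print(watch_power(lambda: next(trace), samples=6))  # -> [3]
```

With a real sensor hook and a nonzero `interval_s`, a run of this over a mining session would timestamp exactly when the card falls out of its normal power band.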

I tested games, and Subnautica: Below Zero performance at 4K and high settings was good; the power draw was 260 W. In Age of Empires II: DE the power draw was 60-80 W, but the game itself is not demanding. AoE2 was still playable, with some very minor stuttering. With newer drivers Age of Empires 2 was sometimes unplayable, and the only fix was to reinstall the drivers; then it worked for maybe a day or two.

I need to test MS Flight Simulator with these old drivers. With newer drivers it was also sometimes unplayable because of low power draw/clocks.

I think there is some kind of driver issue, because the drivers can affect the seriousness of this problem. Maybe bad drivers and DirectX together make a bad combination...


Here are some early warning signs of video card failure. Stuttering: when a graphics card starts going bad, you might see visual stuttering/freezing on the screen. Strange artifacts: similar to screen glitches, a bad graphics card can result in strange artifacts all over your screen.


Not quite sure what that has to do with my post... None of the things you listed are happening.


I also have this problem with a 6900 XT GPU.

Maybe this link can help you.

My problem is not solved yet.

Seems like it really might be DX11-related (or DX versions older than 12, I guess). Kind of a shame, as I do love me some old games. But they do run well; it's just that this lower-than-expected performance got me worried that my GPU might not be working properly, when I guess that's just how it is with the 6000 series. I do hope AMD might fix this in later drivers, but I doubt it...



I play Warzone, and although it is DX12, the game uses 70% of my GPU; but in Metro: Last Light, which came out years ago, 99% of my GPU is used.

It seems that when we are in detailed scenes, the graphics card becomes lazy.