I'm having the same issue: a Radeon 6000 series GPU paired with a Ryzen processor performing vastly worse in gaming than it should. The specific symptoms are low GPU scores in TimeSpy (yet normal CPU scores) with erratic mid-test GPU load, low FPS in games, and stuttering.
I honestly think it's a random hardware compatibility problem specifically between the CPU and GPU rather than a CPU bottleneck or RAM issue. The chip I'm having this problem with is a 5900X running perfectly stable at 1:1 with 4 sticks of single-rank 3800MHz G.Skill 14-16-16-36 RAM with XMP. The motherboard VRM is overkill for anything but heavy overclocking, the cooler is a Noctua NH-D15 that keeps the chip below 40C at idle, and my PSU is a Seasonic Prime 850W. I've done some pretty extensive hardware, temperature, driver, and BIOS troubleshooting to rule out anything else.
With my 5600X the RX 6800XT will score in the low 18k range (GPU only, 15,500 combined) in TimeSpy. Destiny 2 will run smoothly at 1440p at highest settings between 120-144 FPS all day without anything ever breaking 70C. That's basically what I'd expect without really tweaking things.
Yet when I swapped in a 5900X fresh out of the box from AMD RMA, with an identical system and settings, my TimeSpy GPU score dropped to the low 13k range, and Destiny 2 can't even hold a steady 70 FPS at medium settings: worse than the i7-4770K/GTX 1070 Ti setup I retired just a few months ago. Every other aspect of the system is stable, and the TimeSpy CPU score is perfectly reasonable for a non-overclocked 5900X (around 13,200).
I tried 3 different releases of the Adrenalin drivers, two different BIOS versions, and tested with a different RAM kit, motherboard, and PSU with both chips. None of it matters: the problem seems independent of everything else, specific to my 5900X + 6800XT running together. Wish I had a spare 6000 series GPU to try, but don't we all...
I bounced this off the GPU manufacturer's tech support and they were completely stumped; they couldn't think of anything to try that I hadn't. Waiting now on AMD's reply to all this. I'll keep y'all updated on what I hear.
DO NOT set PBO to Disabled unless you don't have the option to set it to Advanced. If you can set PBO to Advanced, then set the EDC limit to 'Motherboard'; that should solve the issues for your CPU. That fix solved every issue I had with my 5800X. As for my 6900 XT in Skyrim, its GPU usage bounces between 30-60% with constant frame dips into the 30s. It's playable, but very irritating. That is still an issue for me.
Ryzen 7 5800X (280mm AIO, OC'd to 4.775GHz all-core)
G.Skill 4x16GB (64GB total) @ 3600MHz
MSI B550 Gaming Carbon WiFi
XPG 1TB NVMe SSD (Gen 3)
Inland Premium 2TB NVMe SSD (Gen 3)
AMD Radeon RX 6900 XT
Just be careful: Warzone is a pain in the ass when it comes to AMD GPUs. Don't make rash decisions that won't solve anything; it's more likely the game rather than your PC. My son has a 5700 XT and I have a 6900 XT. It runs awful on his PC with a 10700K, while I'm currently using a 3700X and it's fine at 200 FPS on high settings with the 6900 XT.
Falling in line with the last two comments here, I'm fairly mystified why anyone's initial advice to folks having massive performance issues with perfectly capable processors and brand new graphics cards would be to overclock them, both voiding warranties and adding complications to the end goal of just getting their darned games running smoothly. No CPU listed by anyone having issues here should be considered a "bottleneck" to playable FPS in any modern title at 1080p. Again, my 8-year-old 4-core 4770K was more than capable of supporting a 1070 Ti at better FPS than I was seeing with my jacked 5900X/6800XT combo, and the OP reported better FPS with their last GPU as well.
Ditto with dropping XMP and messing around with RAM settings. I'm quite sure the people giving that advice know exactly what they're talking about and have seen solid benchmark gains from tightening their timings, but if someone's system can't run a basic XMP profile at 3200 and/or FCLK doesn't automatically sync up at that speed, that's an issue to troubleshoot all on its own.
I love squeezing out that last few FPS to top out a benchmark score as much as anyone else, but I'm seeing a whole lotta pointing to the middle of the tweaking list here, not to the top of the troubleshooting tree where you might want to start, especially while still under warranty with RMA as an option if you discover something hardware related.
As to my own issue... gremlins. It was apparently goddamn gremlins. I was in the midst of building up my 5600X as a spare system to game on until AMD gets back to me, and remembered I still had my old SSD with Steam/TimeSpy/Destiny 2 installed. Tossed that into the 5900X rig and, surprise surprise, my TimeSpy GPU score was back above 18k. Figured at that point it might be as simple as reinstalling Steam et al. on my main NVMe, but once that drive was back in, it worked perfectly as well. I could say it was reseating the GPU to get to the NVMe, but I'd already done that about 4 times. Currently running a very reasonable 17.5k overall (https://www.3dmark.com/spy/19628330), with GPU usage a nice flat line at 99% like it should be. That CPU isn't running PBO, btw; PBO limits are disabled, with a -15 all-core undervolt curve and the max allowed frequency bumped up 200MHz.
Wish I could say I'd found the smoking gun for everyone...
For those wondering whether the low GPU usage in AC Origins or Odyssey is due to a card fault or a component bottleneck: don't worry, it is not. These two games only run well at 4K, because at that resolution the card is forced to be fully loaded. At lower resolutions, due to a driver fault, any AMD card running these games will show poor performance.
I'm attaching images below showing the GPU usage comparison between AC Odyssey and AC Valhalla; in the first, the average usage was 76%, against 98% in the second.
AC Odyssey GPU Low Usage (Average 76%)
AC Valhalla GPU Correct Usage (Average 98%)
Based on this data, it is clear that the issue is a driver fault; AMD knows it and does nothing (as always). So don't worry about your systems, they're probably fine.
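If you want to put a number on your own card the same way rather than eyeballing an overlay, most monitoring tools (GPU-Z, HWiNFO, etc.) can log sensors to CSV, and averaging the load column takes a few lines of Python. This is just a sketch: the column name `GPU Load [%]` and the sample log are assumptions, since the exact format depends on which tool you log with.

```python
import csv
import io
import statistics

# Fabricated excerpt of a sensor log; a real GPU-Z/HWiNFO log is much
# longer and the load column may be named differently.
SAMPLE_LOG = """\
Date,GPU Load [%]
2021-05-01 12:00:00,76
2021-05-01 12:00:01,71
2021-05-01 12:00:02,80
2021-05-01 12:00:03,77
"""

def average_load(csv_text, column="GPU Load [%]"):
    """Return (average, minimum, maximum) GPU load from a CSV sensor log."""
    loads = [float(row[column]) for row in csv.DictReader(io.StringIO(csv_text))]
    return statistics.mean(loads), min(loads), max(loads)

avg, lo, hi = average_load(SAMPLE_LOG)
print(f"avg {avg:.0f}%  min {lo:.0f}%  max {hi:.0f}%")
```

An average that sits well below the high 90s under a GPU-bound load, like the 76% Odyssey capture above, is the telltale number here.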
I've had the same problems since I built my new PC.
Some games run well (CS:GO, Star Wars Fallen Order, CoD, Horizon Zero Dawn), but some important games like Destiny 2 or the new Mass Effect Legendary Edition run badly. Resolution (1080p / 1440p) doesn't matter; changing it does nothing. The utilization spread is incredible: within 10 seconds it swings from 5% to 80% to 40%, etc. FPS drops all the time; not playable. FPS ranges from 10 to 150 in Destiny and from 10 to 240 in Mass Effect.
I don't know why the utilization can't hold above 90%.
All benchmarks are great. I've tried all the common fixes. More hours spent looking for a solution instead of enjoying gaming...
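A swing like that is easier to show support (or a forum) as numbers than as a video. On Linux, the amdgpu driver exposes the instantaneous GPU load as an integer at `/sys/class/drm/card0/device/gpu_busy_percent`; on Windows you'd take the same sensor from a HWiNFO or GPU-Z log instead. A minimal sketch, with the demo list of samples made up to mirror the kind of swing described above:

```python
import statistics

def read_gpu_busy(path="/sys/class/drm/card0/device/gpu_busy_percent"):
    """Read the instantaneous GPU load (0-100) from the amdgpu sysfs
    node. Linux with the amdgpu driver only; card index may differ."""
    with open(path) as f:
        return int(f.read())

def summarize(samples):
    """Characterize the utilization spread over a capture window."""
    return {
        "mean": statistics.mean(samples),
        "min": min(samples),
        "max": max(samples),
        "stdev": statistics.pstdev(samples),
    }

# Made-up demo samples mirroring the reported swing (5% -> 80% -> 40%);
# in practice you'd append read_gpu_busy() once per second instead.
demo = [5, 80, 40, 90, 10]
print(summarize(demo))
```

A healthy GPU-bound capture should show a mean in the high 90s with a small stdev; a big stdev with a mid-range mean is the stutter pattern described in this thread.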