OK. Lately most of my playtime has been Cyberpunk. But the game is finished and it's time to play something else, and I've noticed really poor performance and GPU/CPU utilization while gaming.
In NFS 2016, which is old by today's standards, I'm barely averaging 100 fps (80-120) at max settings and 1440p. GPU utilization sits at 80-110 W and 40-70%, which is very low compared to Battlefront or Cyberpunk (130-160 W and a constant 95-99%). Heck, at 4K it only achieves 65-90 fps at max settings, and I'd rather play at 4K with my TV's automotion (which interpolates frames up to 120) than at 1440p with 80-90 fps.
I tried Crysis 3, which had sync problems and wouldn't go above 64 fps (a well-known bug), with barely 30-40% utilization.
Then I tried GTA 5, and AGAIN 1440p was barely playable at 60-90 fps.
WHAT IS GOING ON WITH THE RX 6000 SERIES, AMD?! With my previous GPU I had better results in those games!
Cyberpunk, Witcher and Battlefront are performing roughly as expected compared to YouTube benchmarks. There is clearly something going on under the hood with the drivers. Windows reinstalled, DDU, updated to the newest drivers, countless hours spent trying to fix this rig. Hands down... I'm using higher resolutions intentionally to avoid a CPU bottleneck; still, this wasn't a problem with team green GPUs.
I've also noticed that whenever I get poor results, it comes with low CPU and GPU utilization: CPU ~30%, GPU 60-80%.
Errm, higher resolution doesn't bypass a CPU bottleneck and can even make it worse. The CPU has to prepare the same scene either way; the extra pixels just pile onto the GPU. So, not to blow up your theory, but it fails there. I assume you know that anything running the "Cry" engine, including the Far Cry series, has the same CPU bottleneck. NFS 2016 is CPU intensive.
Watch some comparison tests on YouTube with that card and your 3600 vs. the 3700X, 5600X and 5900X. I happened to watch one last night by "SpyClub" or something (they use a "Joker" face as an avatar) that showed the FPS at 1080p and 1440p for those CPUs, all using an RX 6800 XT, and I think you'll see results that show this bottleneck. I have a non-XT RX 6800 and a 3600X, so I know exactly what you mean.
I use 1080p, and Far Cry 5/New Dawn are as bad as it gets: running on Ultra with textures at "2", FPS is ~90-130. Still better than my former RX 5600 XT, which did ~60-120 FPS and had some micro stutter. Racing games seem real good, with NFS Shift clocking ~200-265 FPS on Ultra, Assetto Corsa is off the chart, World Of Cars a smooth ~160-200+, BF4 ~175-200. So the card works, and well. It's the games, and the fact that these newer cards starting with the 5000 series don't compute like the older ones did. Even the 3000 series Nvidias aren't so hot with these games unless paired with an Intel that has better single core performance. If you had a 10 or 20 series Nvidia, you would be correct that those games had higher FPS. The "fix" for you (I'm happy with the way mine runs for now at 1080p) is to go up to a 5600X or higher. For gaming it might be a waste to go above the 5800X, and that's for a possible 5-7 FPS gain over the 5600X, or ~20-40 for the 5900X.
It took me 7 months to get this 3600X to finally behave, so I'm just not keen on moving to the 5600X, which would be my next choice. Bang for the buck it's the best for gaming, provided it works with my RAM without hours of re-tuning everything and reinstalling the OS, games, etc. Very tempting at the $400 it's going for, but it only nets me ~10-30 FPS in the titles I play now. So I have to ask myself: is it worth the hassle/money?
Drivers aren't going to fix the card's design. They made it so you'll very much want to upgrade to a 5000 series CPU, at the slight risk of losing business to Intel/Nvidia, a gamble they'll win for the most part since most of us can't afford to change the entire platform. Take this RX 6800 and pair it with a 10900K and it will be similarly disappointing, a little less so, but still not as intended: no SAM function, that sort of compatibility thing.
Do I regret shelling out $929 with tax and shipping? Nope. Prices aren't going down for a long time, and my performance is quite good overall. At worst I'll fall for the 5600X, or maybe the 5900X if I go crazy. Another OP on here went from poor FPS with a 3900X and an RX 6800 XT to a 5900X, and it cured his entire issue in every game. Neat to know my FPS was better than his with my combo. So the 3900X isn't the best gaming CPU either. The 3600X pretty much rules the 3000 series for games; the "XT" is somewhat better.
A 3x worse GPU, a newer game, the same FPS...
A 2x worse GPU, same settings, 20-40 FPS higher.
An ancient GPU, only 20-30 FPS lower.
A 2x worse GPU, same brand, same results.
Comparable specs: higher FPS, higher utilization.
I'll repeat myself, since an OP marked me as a "solution". That OP had a 3800X with the same problem of lower than expected FPS. He bought a 5800X and it fixed 90% of all the issues. You're comparing older Nvidia cards to an AMD card made specifically to work with a 5000 series CPU.
Perhaps AMD should add that to the spec sheet so people stop having these high expectations of slapping a $1k GPU in with an R5 3600 and expecting super FPS rates. You're missing a few things: memory bandwidth (I believe you have less than on a B550 chipset?), and a 5000 series CPU for best results, since the card works with the L3 cache and Smart Access Memory (in some games). You could try running in DX11 mode, or turn some settings down like bloom, reflections, rear view mirror, motion blur, and car detail.
Although I have an R5 3600X, I'm not having these issues to the point of complaining with my RX 6800. There's more to it: RAM speed/timings, IF settings, PBO on/off, chipset, brand of GPU.
Posting endlessly in here about Nvidia comparison tests is not going to fix the hardware compatibility issues. AMD designed the card, at least the reference board, and it's made to work properly with a B550 or higher chipset and a Ryzen 5000 series CPU. The R5 5600X is probably the best bang for the buck to pair up with a 6000 series card for purely gaming.
If you're that unhappy, sell it and switch. Don't know what else to tell you. It's a hardware design issue, not a driver issue, so it's not getting fixed unless they change how they build cards.
Dude, it's not the "poor" 3600. Drop the drama; you're expecting way too much. I'm sure I'm leaving a lot on the table by running the 3600X, but at 1080p, with the games I play, it's good enough. I do think people way overplayed going with the 3600 non-X on the idea that it can match the X; it can't. Just like mine can't match the XT version, which is very close to the 5600X, so close I considered it. But they cost the same right now, and IF I upgraded to a 5000 series, the 5600X makes the most sense at ~$369-379, maybe slightly less.
I researched getting the 5900X or 5800X instead and the 5900X is the "best" for the most FPS out of the 3 but the 5800X is only ~$100 less and performs ~5 FPS better than the 5600X at stock or OC speeds. The 5600X is a whopping ~$180 less than the 5800X too.
The point is I don't have tons of money to toss at CPUs, GPUs, etc. It was a fluke I was able to squeeze out the $972 I paid for this RX 6800. It was ~$150 over MSRP before tax and shipping. The seller paid $760 or so in December and never opened it; in November the card was $679. So if I decide to go to a 5000 series, I'm sticking to the 5600X at the sub-$400 range for sure.
Would it be nice if AMD built GPUs that didn't really need a matching CPU or chipset to work right? Absolutely. They could sell more and compete with Nvidia much better. I agree it's ridiculous to need to spend an additional ~$360-400 on a CPU to make a ~$970 GPU work. It's like buying a new PC.
You don't get that in some countries, like mine, those CPUs cost as much as the entire monthly salary of an average person (and no, it's not what Google shows; it's not 610 Euro but 380, dunno where they got those numbers). It's not 1/4 or so, like in other countries such as Germany, the UK, or other wealthier ones. Here in Poland, a 5600X costs more than the minimum wage, not to mention the higher tiers. I can call myself a well-earning person here with what a street cleaner would earn in Germany, yet it's still not easy to squeeze the budget just for gaming. I checked a lot of videos before buying it, and most of the claims were clear: a 3600 is more than enough to run it comfortably. We can thank the people making those claims. And sorry for this chaotic post; I'm writing it around 1 AM after a 12-hour shift at work.
For perspective, I'm on disability. In the US that amounts to roughly $1k/month. Out of that I need to spend $650 on rent, then electric, internet, cell phone, oil, food, feeding my dog, etc. Even adding my mother's smaller income, I fully understand "low income". The Euro is equal to $1.21 USD today, just for reference.
In your position, you would be better served unloading that RX 6800 and switching teams to get better performance. I say this only because you're cash strapped in Poland, where getting any PC part can be tough or really pricey. Honestly, in your case a used 1080 would work better. Anything AMD is just going to hold you back while you're stuck with a 3600.
It's not your fault that people misled you into thinking the card alone was going to save the day. In some games it should. Realistically it should do about as well as an RX 5700 XT, but with more VRAM.
Thank WW2 for Poland being held back compared to other European countries. Germany is just starting to recover, and they had, and still have, a lot of butt kissing to do. The UK is another story; that's like the USA on the other side of the world, economically. Also, the average American household doesn't make $50k/year. There's no "middle class" anymore, just the "super rich", the "haves" and the "have nots". Covid made it all worse, because now we have all those who "had" standing in the food lines, never having known poverty. Google is wrong about a lot of things.
Looking at that, we both have a lot to be thankful for. Who has an RX 6800 anything in Poland making what you claim? You, that's about it. Be proud.
You are not completely correct about the CPU bottleneck increasing at higher resolutions; this is a wrong idea too many people are spreading on the Internet, including game reviewers. @BrainsCollector actually had the right idea if he was trying to balance frame times more onto the GPU side.
I laughed the other day when a quite popular reviewer, I think his name is "Timmy Joe" or something like that, was confused when he overclocked an FX 8350 with a Vega 7 in Far Cry 5 and it didn't get lower minimums at 4K than at 1080p, and he was like "wow, I can maybe prefer this over some older i5's".
The reason is that in most cases the CPU sends the vertex coordinates of polygons and tells the GPU how to fill in the pixels between them, so in essence the CPU isn't too concerned with pixel count, because it offloads that fill-in work (rasterization) to the GPU.
Cases where the CPU can bottleneck are texture decompression, issuing draw calls to the GPU, and, related to that, the number of polygons it has to send to the GPU, so I am quite certain that high tessellation settings will affect the CPU as well. Rage has a "texture detail" option that does texture sharpening on the CPU, which is quite silly; it can push quite strong CPUs such as the FX 8350 or i7 4770 to 100% usage and be impacted by higher resolutions, but testing at 1080p and 4K without this texture detail results in the same CPU performance.
Furthermore, field of view, and therefore also aspect ratio, will I believe have an effect on performance. 1080p and 4K are both 16:9, so the same number of polygons will be sent to the GPU to fill in the pixels between, though culling might be a factor depending on whether it is software or hardware culling.
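To make the point above concrete, here's a toy frame-time model, not how any real engine works, and every constant in it is made up for illustration: the CPU side scales with draw calls (scene complexity), the GPU side scales with pixel count (resolution), and whichever side is slower limits the frame rate. Raising the resolution inflates only the GPU term, which is why a CPU-bound game shows the same minimums at 4K as at 1080p.

```python
# Toy model: CPU cost scales with draw calls, GPU cost scales with pixels.
# The per-call and per-pixel costs are invented numbers, purely illustrative.

def frame_time_ms(draw_calls, pixels, cpu_us_per_call=2.0, gpu_ns_per_pixel=4.0):
    """Return (frame_time_ms, cpu_ms, gpu_ms) for one frame."""
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0    # resolution-independent
    gpu_ms = pixels * gpu_ns_per_pixel / 1_000_000.0  # grows with resolution
    # CPU and GPU work in parallel; the slower side sets the frame time.
    return max(cpu_ms, gpu_ms), cpu_ms, gpu_ms

if __name__ == "__main__":
    for name, pixels in [("1080p", 1920 * 1080),
                         ("1440p", 2560 * 1440),
                         ("4K", 3840 * 2160)]:
        total, cpu_ms, gpu_ms = frame_time_ms(draw_calls=5000, pixels=pixels)
        bound = "CPU-bound" if cpu_ms >= gpu_ms else "GPU-bound"
        print(f"{name}: {1000 / total:.0f} fps ({bound})")
```

With these made-up numbers, 1080p comes out CPU-bound (the 10 ms of draw-call work exceeds the ~8.3 ms of pixel work), while 1440p and 4K flip to GPU-bound: exactly the pattern the posts above describe, where going up in resolution lowers FPS without the CPU doing any more work per frame.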