I’ve upgraded from a 5700 XT to a 6900 XT and I’m not seeing any performance gains in Warzone. I have roughly the same fps, between 120-160 depending on what part of the map I’m on. Is anyone else experiencing this issue?
ryzen 9 3900x cpu
rog crosshair viii formula motherboard
32gb ram 3600mhz
sapphire 6900xt gpu
radeon driver version 20.12.2
I have the same CPU & GPU and I’m having the same problem here, it’s not really getting over 160 fps no matter how low the settings are... I’m just wondering if you might have to overclock the CPU and GPU to get to the 200 fps mark?
Did you find out anything?
Why is 200fps so important? Games are made to be playable at 30fps...and you can't get over 160? You want more...lower your settings from Ultra to High.
Because it’s just more enjoyable to have a smooth-running game with a high frame rate and a high resolution (which I’m not even using). I’m playing at 1080p and turned off everything that would use additional VRAM; I’m only using about 5/16 GB of VRAM.
Something is obviously not working how it should, and if I wanted to play at 30-60 fps I could’ve bought a PS4 for $300 or less and wouldn’t have had to spend so much money on a graphics card that doesn’t perform as promised.
I replaced my R5 3600 with an R5 5600X alongside my RX 6800 and reinstalled Windows to support SAM (it needs UEFI, and sadly I was previously on legacy boot). Now, instead of a pathetic 90-130 fps (around 110 on average), I’m getting a steady 120 with vsync and 130-180 fps with no limits.
So sad how AMD bottlenecked its own products.
Today I switched from a Ryzen 7 3700X to a 5600X CPU. I wasn’t sure whether I should rather have upgraded to a 5800X, but after watching some benchmarks between those two I decided to buy the 5600X. I must admit the difference is outstanding. In CoD multiplayer I had an average of 170-210 fps with high settings; with the 5600X I now get up to 300 fps, playing at high settings in 1440p. In Warzone it was a struggle with my 3700X (average of 90-120 fps) and I started to doubt why my 6900XT could not get more fps, but with the new CPU I now constantly average 160-190 fps. I cannot believe this "upgrade" would make such a difference.
With a 6900 XT you should be able to hit 200 fps at high settings. Even turning it down to low doesn’t give more fps. It’s Warzone itself; the game isn’t optimized properly.
Yes, you might be right there. I was just confused because when it first came out, the content creators I’ve been watching would easily get over 200 fps and it didn’t drop under 180 fps, but I just watched their recent gameplays and they are around the same fps with their Nvidia cards at the same settings, so it has to be the game and not the card itself.
I just bought Cyberpunk, I might try an fps run there and see what the card really can do.
Thanks though for the help.
Nah, I haven’t found out anything. I think it’s Warzone itself. With every season it seems like the fps gets lower and lower. I see people using the 3090 and they don’t hit over 160 fps. Maybe if you’re around the dam or boneyard it gets up to like 180 for me, but that’s about the highest it goes. I also play at 1080p.
I have the same problem with my RX 6800.
The funniest part is that at 1080p and 1440p it gets THE SAME FRAMES, literally no change at all. In the menu it shows 120-140 fps, the same as on my previous RTX 2060 Super, which is at least 30-40% slower than the RX.
My friend’s RTX 2080 Ti should perform quite similarly to the RX or even worse, so there should be around 200 fps.
The worst part of all is that in Warzone, during a match, I have WORSE performance than with my previous card! On the RTX I was getting a stable 105-110 fps, and on the RX I see drops to 90 with an average of 100-105 fps. This is ridiculous!
The GPU is working correctly: in Cyberpunk I get 80-100 fps at 1080p and 60-70 fps at 1440p, which is very predictable; same for The Witcher, 140-150 fps and 90-110 fps respectively.
Yeah, I noticed that as well. The frames do not drop if I go to full settings; the only thing I noticed is that my GPU gets slightly warmer, but not to really critical temperatures (around 60-70°C).
My card is providing between 120-160 fps, but it’s pretty inconsistent and the random drops are quite annoying.
Yeah, I have the 5800X CPU and 6800 XT and try to target 144 fps with a 144Hz monitor. But sometimes I get drops to 91, sometimes even for 80% of the game. Really annoying.
The 5800X is a "no man's land" CPU; personally I think AMD should have skipped it for gaming purposes. The 5600X and 5900X (if you need a workstation as well) are the best choices for gaming. The 5950X will do well, but the 5900X will beat it in games most of the time. Your 5800X can be tuned in BIOS; also, manually match your IF to half your RAM speed, not to exceed 1800 MHz. AMD claimed 1900-2000 MHz IF, but most chips aren't going past 1800. Warzone is plagued with complaints anyway, so it might be the game.
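A quick illustration of the "IF at half your RAM speed" rule above (just a sketch of the arithmetic, nothing chip-specific): DDR4's advertised number is the effective double-data-rate speed, so the actual memory clock, and the FCLK you'd set for 1:1 mode, is half of it.

```python
# DDR4 speed ratings are double data rate: "DDR4-3600" runs an actual
# memory clock (MCLK) of 1800 MHz. For 1:1 mode, FCLK should match MCLK.
def fclk_for_1to1(ddr_effective_mhz: int) -> int:
    """Infinity Fabric clock matching the real memory clock."""
    return ddr_effective_mhz // 2

print(fclk_for_1to1(3600))  # 1800 -- right at the ~1800 MHz ceiling most chips manage
print(fclk_for_1to1(3200))  # 1600
```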
You can try tuning the card manually so your clocks are within 100 MHz of each other (min/max GPU) and sliding the power limit to max so the card can use more power as needed. Turn off any zero-RPM fan function and set a curve; this limits polling of the card for fan speed/temp. Undervolting for thermals is a good idea: find that lower number by switching to "Automatic" mode in Performance/Tuning and choosing "Undervolt GPU", write that number down, then switch to "Manual" mode, enable all the disabled items, and input the voltage you wrote down. Make sure the power limit is at max after doing that step. The VRAM can be switched to "fast"; why it's not set like that out of the box, I don't know. These are my settings for my RX 6800, so it's only an example of your screen and what you should turn on/adjust per your card. GPU clocks on a 6900 I'd leave stock, but move the min to within 100 MHz of max; it doesn't affect idle clocks or temps outside of games. My games ran smoother after doing that step, then I did the rest.
"The 5800X is a "no man's land" CPU, personally AMD should have skipped it for gaming purposes."
Not a unit of measure. Pointless text. No value added 0.8/10 post delete system32
Not pointless text. As an AMD insider, and after reading many reviews and comparisons, the 5800X falls into "no man's land". Its cost is only slightly less than the 5900X's, with less than 3% performance gain in games over the 5600X, which is ~$150 less. Why would one buy this CPU that offers little if any gains, sitting between the ~$300 5600X and the ~$450 5800X price points, when the impressive 5900X is ~$540 for huge gains over both? Because availability is poor. Or they may need 2 more cores for light workloads. It makes zero sense otherwise to buy the 5800X.
Driver reinstall worked for you? Great. Doesn't do it for the hundreds of others posting the same question. Anyway, that's enough of me answering a troll.
Yeah, exactly. I just changed the Call of Duty tuning settings to Automatic and Default.
With that, it now works perfectly fine without any drops.
You were right, it's about the tuning settings! Thank you so much for your detailed answer!
So I was having this issue with Warzone. Although I have the newer Ryzen 5 5600X, I was getting low fps and crashes in the game.
I found a YouTube video by a lady who talks about Warzone issues with Ryzen CPUs: you need to go into a specific Warzone file and change a setting related to the video memory scale and "RenderWorkerCount".
Here is the YouTube video:
She explains the stuff around the 4-minute mark.
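For anyone hunting for that file: the one usually cited in these guides is `adv_options.ini` in the game's Documents folder. The path, the exact key spelling (often written `RendererWorkerCount`), and the right values all vary by system and game version, so treat this as a rough sketch rather than recommended settings, and back the file up before editing:

```ini
; Example only -- path and values are system-dependent:
; Documents\Call of Duty Modern Warfare\players\adv_options.ini
VideoMemoryScale = 0.85      ; fraction of VRAM the game is allowed to use
RendererWorkerCount = 6      ; commonly set near your physical core count (e.g. 6 for a 5600X)
```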
Same here. It's not a CPU-related issue. It's purely driver-side and the game itself, from what I have researched so far. And I'm sure there is something blocking the game from rendering properly, as it's impossible that the game gets the same fps at 1080p and 1440p, while both perform worse than lower-tier GPUs.
It's definitely the game engine being designed with CPU-intensive code. The devs need to step up and fix it. Older Nvidia cards tend to do better because their calculations per second are higher, like in mining for Bitcoin.
I think these newer cards are purely meant for graphics and a different type of game engine altogether, since even the newest Nvidia cards are not much better, if at all. Cyberpunk is another CPU-intensive game, which is why it seems to generally do poorly with fps and rendering.
Other games that suffer are Rust, Red Dead Redemption 2, Apex Legends, The Witcher 2, and anything written in C# or C++.
If you have an Intel-based motherboard, try to update your PCIe x16 drivers; mine came directly from Windows Update for some reason. Also, make sure you disable FreeSync everywhere, and in global settings set Tessellation Mode to "let the application decide".
Here’s an update for you guys. I went out and bought a Ryzen 9 5950X to replace my Ryzen 9 3900X, and now I’m getting over 200 fps in Warzone, and that’s at 1440p; 1080p would be higher, I would assume. So with that being said, it’s definitely a CPU bottleneck issue. You have to upgrade your CPU to a 5000 series to take full advantage of the 6000 series GPUs.
The 3900X is a horrible gaming CPU in general, and so was the 3950X. According to reviews, in the 3000 series the best gaming value is really at the 3600X; the gains going to the 3700X are minimal to none. Now, the 5600X is rated the best for pure gaming, while the 5900X is the best gaming/work combo CPU. The 5950X is good, but the 5900X beat it in many gaming tests; however, if one needs more cores for work, then the 5950X is the best out there. In any case, any of the 5000 series CPUs beat the bottleneck in games.
No BS. With the 3900X & 6900 XT I was getting around 110-160 fps; with the 5950X I’m getting around 140-230 fps depending on what part of the map I’m on. Settings are on competitive settings, except for texture resolution, which I have on high.
I have the same problem too.
I'm running on
32gb ram 3600Mhz
and new 6900xt
on a Samsung 24-inch 2K 144Hz monitor
The game stutters non-stop, over and over, and does not run smoothly at all!
Please, I urgently need help; how do I fix this?
I will note that I tried all sorts of settings, on low, on high, on medium, and either way there is no smooth gaming experience.
And I will also mention that I upgraded to this card from a 2080 Ti, whose performance was very good: on low settings it would run perfectly smoothly, and even if not at high fps the gameplay felt much smoother, so for sure there is something wrong here.