Any thoughts or guidance on this would be helpful.
On my old i5-2500K/GTX 970 system my frames were consistently high and fluid, often running above 60 in the open world on near-max settings at 1440p/144Hz. I expected my new AMD Ryzen 5 1600X/Vega 56 system to perform much better than it does, but it's behaving strangely.
I did move up to a 4K/60 monitor, so I know that's more taxing. But I'm hitting maybe 17-20 fps in Verdant Brink, and no matter what I do the frames don't really change. I can supersample, or I can run the game at 1440p or 1080p, but my frames stay the same. That doesn't seem right, yeah? Anyone experiencing anything similar? The Silverwastes are also bad, and Divinity's Reach is maybe 20-30.
My R5 is OCed to a locked 4.0 GHz, and my Vega 56 is also overclocked (+2.5% shader freq, +50% power limit, 975 MHz mem freq, 1050/1075 mV), though running at stock clocks made no difference. When I play, GPU activity is at 99% according to WattMan.
I'm on a custom water loop, so all temps are good (CPU not over 70°C, GPU not over 60°C).
The card runs as expected in single-player DX9 games, but GW2 just doesn't seem to be getting the performance I'd expect. And I'm talking about regular world traversal, not big events with lots of players, which will always drop frames.
Current in-game settings have reflections off, shadows on High, and character model limit/quality set to Low.
Anyone experiencing something similar, or willing to run a comparison? This just doesn't seem right to me. My frames should at least change when I change resolution, right?