Hi everyone from the community,
I am experiencing terrible performance with Minecraft on my gaming PC. I used to have an RX 580 and now have an RX 5700 XT (both give me around the same performance, FYI), and I am getting almost the same performance as I did with integrated Intel graphics on a laptop. I hover around 15-40 FPS (lows of 10-15 FPS, peaks of 40-50 FPS), while with my setup I should easily get a stable 150+ FPS, if not more. My CPU is a Ryzen 5 2600 and I have 16 GB of DDR4 RAM. For comparison: Rocket League on max settings averages around 220-240 FPS, ARK (which is still poorly optimized) even manages a stable 60 FPS on medium settings sometimes, and Golf With Your Friends averages 130-150 FPS (and that game tends to cause issues for some).
Now, I've heard that OpenGL and AMD GPUs generally don't get along that well, but this is ridiculous. Does anyone know how I can increase my performance? There must be something, because you don't buy a 500+ euro GPU to not even be able to run Minecraft properly...
Things I tried:
- up- and downgrading drivers
- dedicating more (or less) RAM to Minecraft
- tinkering with the in-game settings (although I should be able to run on high/max settings, still)
- checking temperatures and usage (which are low, as expected)
- using OptiFine (which helped of course, but the performance I mentioned is already with OptiFine)
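To be clear about the RAM point: I changed the allocation through the JVM arguments field in the launcher's installation profile. Something like this (these values are just what I experimented with, not a recommendation):

```
# JVM arguments field in the Minecraft launcher profile
# -Xms = initial Java heap size, -Xmx = maximum Java heap size
-Xms2G -Xmx4G
```

I tried going both higher and lower than this; it made no real difference to the framerate.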
If anyone can either get this to the right person or help me out, I would greatly appreciate it! Because this is just dirt-poor performance, and it should be ashamed of itself.
I get that, but my CPU (and GPU) usage is very low, and I don't even have the settings on max (I had to lower the render distance or else it was unplayable). But how can this setup perform almost as badly (sometimes just as badly) as an old laptop with a Pentium, integrated graphics and 4 GB of total RAM? I've read all over the place that a lot of people with AMD GPUs seem to have this problem, but I've never found an answer.
And let's be real: if a world is loading, or there's a big structure or whatever, it's bound to hiccup, I get that. But constantly running at 30-45 FPS (with occasional dips to 15-ish FPS) is just not okay. For comparison, my friend has a laptop (a few years old, at least 4) with an Nvidia GPU, and that thing runs Minecraft way better. Worse CPU, worse GPU, but it's an Nvidia GPU.
I stand by my point that this system should be able to run Minecraft at a way better framerate on high/max settings; the CPU, the GPU (and the RAM as well) are all up to it. There's just something between OpenGL and AMD (which is widely known), but there must be some solution or workaround. Even my old laptop (2012) with an AMD card performed better, so maybe newer GPUs are affected more by this issue, idk...