*It's an HD 8200 (not 8300).
I found some info:
1. Minecraft doesn't use the GPU for most of its calculations, relying almost entirely on the CPU instead. This results in pretty poor performance. But since these calculations are hard-coded into the game, there is no practical way to change them or shift that work onto a different component; doing so would require rewriting the entire game.
Minecraft is a very CPU-dependent game, but you need a good GPU too.
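If you want to check for yourself whether a game is CPU-bound, one rough sign is a single core sitting near 100% while the game runs. Below is a minimal sketch of that check, assuming Python 3 with the third-party psutil package installed; it is just an illustration of the idea, not anything the posters here used.

# Rough check for a CPU-bound game: sample per-core CPU usage while the game runs.
# Requires the third-party psutil package (pip install psutil).
import time
import psutil

def sample_per_core(duration_s: int = 30, interval_s: float = 1.0) -> None:
    """Print per-core CPU usage; one core pinned near 100% hints at a CPU-bound game."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        per_core = psutil.cpu_percent(interval=interval_s, percpu=True)
        samples.append(per_core)
        print(" ".join(f"{c:5.1f}" for c in per_core))

    # Report the busiest single-core reading seen in any sample.
    busiest = max(max(s) for s in samples)
    print(f"Busiest single-core reading: {busiest:.1f}%")

if __name__ == "__main__":
    sample_per_core()

Run it while the game is loaded into a world; a GPU-bound game would instead show moderate CPU usage with the GPU (see the HWiNFO64 tip further down) near its limit.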
Thanks Redfury. I am used to bad performance in Minecraft, but it's the new game I am helping to work on that created problems. I am an AMD user, but the game's programmers created it on Intel machines. At a certain point in the release chain the software of 'Vintage Story' went haywire, with a massive load on the CPU, which is what prompted me to dig deeper. Still no solution, but it has been established that other AMD/Radeon machines are affected too, so it's not just me (phew!). Vintage Story is very similar to Minecraft, but capable of doing two or three times more with the same resources. I won't bother you further with questions about the Radeon GPU, because I would probably not understand the answers, and those I work with probably don't know which questions to ask yet. But thanks for the feedback.
Your power plan must be set to Balanced or High performance.
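If you prefer to check this from a script rather than the Control Panel, here is a minimal sketch; it assumes Windows with the built-in powercfg tool and only reads the active scheme, it does not change anything.

# Read the active Windows power plan via the built-in powercfg tool (Windows only).
# Prints something like "Power Scheme GUID: ...  (High performance)".
import subprocess

def active_power_plan() -> str:
    """Return powercfg's description of the currently active power scheme."""
    result = subprocess.run(
        ["powercfg", "/getactivescheme"],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    print(active_power_plan())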
1. I think it is some sort of mistake. Modern desktop (not browser) games cannot use software rendering.
2. Use HWiNFO64 > Sensors to check your GPU activity during a game.
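HWiNFO64 can also log its sensor readings to a CSV file from the Sensors window, which makes it easier to review GPU load after a play session. Below is a minimal sketch for summarizing such a log; the file path and the column name "GPU Core Load [%]" are assumptions, since HWiNFO labels sensors differently depending on the GPU and driver, so check the header row of your own log.

# Summarize GPU load from an HWiNFO64 sensor CSV log.
# LOG_FILE and GPU_LOAD_COLUMN are assumptions; adjust them to match your own export.
import csv
import statistics

LOG_FILE = "hwinfo_log.csv"            # hypothetical path to the exported log
GPU_LOAD_COLUMN = "GPU Core Load [%]"  # assumed column name; check your log's header row

def gpu_load_summary(path: str, column: str) -> None:
    loads = []
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        for row in csv.DictReader(f):
            value = (row.get(column) or "").strip()
            try:
                loads.append(float(value))
            except ValueError:
                continue  # skip blank cells and any non-numeric footer lines

    if not loads:
        print(f"No readings found for column {column!r}")
        return
    print(f"samples: {len(loads)}  avg: {statistics.mean(loads):.1f}%  max: {max(loads):.1f}%")

if __name__ == "__main__":
    gpu_load_summary(LOG_FILE, GPU_LOAD_COLUMN)

If the GPU load stays low while one CPU core is maxed out, that points back to the CPU-bound behaviour discussed above rather than a Radeon problem.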