So I just stumbled upon an issue, and I'm not sure whether it's a bug or a feature of the AMD driver. I recently decided to replay Wolfenstein: The New Order, which runs on id Tech 5 and consequently uses OpenGL for rendering rather than DirectX. It turns out the AMD in-game overlay doesn't appear in this game. I've tried a few other games that use OpenGL, and none of them show the overlay either.
After extensive searching I could not find any clear information about AMD Radeon support for OpenGL, specifically whether the overlay and ReLive are supposed to work or not. Can someone confirm that this is expected behavior on AMD's side, or is it something wrong with my particular setup?
PS: Tried with driver versions 21.4.1, 21.5.1, and 21.5.2. Windows 10 Pro, AMD RX 580.
Hi @uladz
AMD Radeon does not support its overlay through OpenGL; from an end-user perspective, the driver does just enough to get an OpenGL game rendering close to properly. I've had issues in the past with RAGE (also id Tech 5) not rendering shadows properly, but the team did a great job fixing it a few months after I reported it.
I hope for your part you have a reasonably new CPU (at least a 3rd-gen Intel i5/i7) for these games, because they do not play well for me when an AMD CPU is paired with an AMD GPU, though a Radeon does work quite well with, say, an i5 4670K in these games.
I have reported performance issues in this game a lot, but the overlay not even working in OpenGL says a lot. I recommend enabling "Wait for Vertical Refresh" and "OpenGL Triple Buffering" in Radeon Software for this game's profile. Chill, Enhanced Sync, and FRTC also do not work in OpenGL, but I wrote a post on how to get 4K working with OpenGL on Radeon.
I remember reading that ReLive works in the background with OpenGL, probably via the hotkey, but the notification won't pop up.
Also note that 4K bottlenecks this game hard on the CPU, because id Tech 5 does image sharpening on the CPU rather than the GPU, controlled by a setting only tweakable in RAGE called "Texture Detail", which seems to be on by default in the Wolfenstein games.
Kind regards
Thank you for the detailed explanation and tips. Yes, I'm running it with triple buffering enabled and V-Sync forced from the AMD control panel. AMD's OpenGL implementation is as bad as it gets: the game can't even hit 60 FPS while the GPU is underutilized at only around 60% load. I've tried different "hacks" to make it work. Interestingly, disabling CatalystAI manually by editing the registry still makes a difference; I thought it wasn't used any more in the latest Adrenalin drivers, but disabling it adds 5-10 FPS in my case. I found a few other bugs in AMD software as well, which I'll be reporting shortly. NVIDIA seems to have none of these issues with OpenGL; my older NVIDIA card ran this game at a constant 60 FPS, no sweat. I hope AMD will get their act together and fix this, otherwise my next card will be NVIDIA.
Hi @uladz
Wow, thank you for marking my comment as the solution, although I would rather the solution came from AMD.
I have been struggling with this performance issue for quite a few years, and I have read about disabling CatalystAI, though I have never attempted it because I'd guess it can cause stability issues with auto-detection and optimization of other things in Radeon Software.
What I can suggest you try instead is ID5 Tweaker. It gave me quite a performance boost and doesn't disable achievements as far as I know, since it only forces some of the id Tech 5 engine commands and CVARs to change. Here is the link: https://community.pcgamingwiki.com/files/file/849-id5-tweaker/
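For reference, the CVAR the tweaker forces can usually also be set by hand. A minimal sketch, assuming the game accepts the usual id Tech launch options (the exact config file name and whether the console is exposed vary by install, so treat this as an illustration rather than a verified recipe):

```ini
; Hypothetical launch options for Wolfenstein: The New Order.
; vt_maxPPF is the CVAR discussed in this thread; com_allowConsole
; is the conventional id Tech switch for enabling the developer
; console -- not verified for this specific game.
+com_allowConsole 1
+vt_maxPPF 4
```

The same `vt_maxPPF` line can typically be placed in the game's config file instead of the launch options, which is what the tweaker effectively automates.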
Basically, a bottleneck in the id Tech 5 engine has to do with its MegaTexture decompression and streaming, largely governed by a setting/variable called "vt_maxPPF", which tells the engine how many texture pages it is allowed to decode/process per rendered frame. Furthermore, without an NVIDIA GPU the decoding is done on the CPU, alongside the Texture Detail sharpening, and the Wolfenstein games seem less efficient at this than RAGE. Reducing maxPPF to 4, as the tweaker does, reduces CPU load with a small chance of textures popping in late; since you are running at 60 FPS, that means 60 × 4 = 240 texture pages are processed per second, which seems to be enough in most cases.
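That throughput arithmetic can be made explicit. A small illustration (the function name is mine, not part of any tool):

```python
def texture_pages_per_second(fps: int, max_ppf: int) -> int:
    """Texture-page decode budget per second: frames rendered per
    second times pages the engine may process per frame."""
    return fps * max_ppf

# The thread's example: 60 FPS with vt_maxPPF reduced to 4.
print(texture_pages_per_second(60, 4))   # 240
# The apparent default of 16 implies a much larger CPU-side decode budget.
print(texture_pages_per_second(60, 16))  # 960
```

This is why lowering maxPPF trades CPU load against the risk of late texture pop-in: the per-second budget shrinks proportionally.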
In addition, it's important not to have the MSI Afterburner OSD running with these particular OpenGL games, since it often results in V-Sync being locked to 30 FPS. You can still have it running in the background and have it send utilization percentages to your phone with the MSI Afterburner app.
I have also found NVIDIA GPUs performing better in the past, but this is largely because the game implements adaptive V-Sync, which NVIDIA supports through OpenGL extensions while AMD Radeon does not; the constant is called "Swap Tear" or something like that in the engine. The momentary tearing also distracts you from the animation stutter, which is apparent if you have FreeSync or G-Sync enabled, since this game needs a constant 60 FPS for smooth in-game animations as they are linked to the frame rate.
Might I ask what CPU you are using and which NVIDIA card you had? I think any Intel CPU newer than an i5 3570K should perform quite well with a Radeon in this game. I get equally poor performance on an FX 8350 and an R5 1600 in this game, but testing with an i5 8400 resolved the issue for me as far as I can remember, although I had to disable the integrated graphics in the BIOS.
Kind regards
Thank you for the tips. That was a very long reply with a lot of information, so it took me a good few days to digest it and gather my thoughts. So let's go point by point :).
1) ID5 Tweaker. Yes, I have tried it already and opted out of using it, because it gave me no advantage and everything I needed to check in the configuration I could do through the command line and the game's config file. The only things that would let me push the limits are lowering PPF to 4 and raising max FPS above 60, and I didn't need either. I noticed no difference in performance with PPF at 4, but I did start to see textures loading in at higher resolution (not quite a pop-in effect, though). Framerates did not improve at 4 or 8, so I kept it at 16. And I didn't need more than 60 FPS because the engine can barely keep up with 60 anyway.
2) MegaTextures. Oh, don't get me started: an interesting idea with an epically bad implementation. If texture decoding is done on the CPU when using an AMD GPU, then it's not the issue in my case. My CPU load is relatively low, around 30% max I think, with no spikes, so I don't think it's the bottleneck here. I even tried completely disabling all post-processing (sharpening, upscaling, etc.) and it made absolutely zero difference FPS-wise. Bottom line: the CPU is not to blame.
3) V-Sync. What I do is force V-Sync to always on, enable triple buffering in the AMD driver, and disable V-Sync in the game itself. This gives me almost the same framerate as disabling V-Sync altogether. So yes, disabling V-Sync might give a few extra FPS, but it's not worth the screen tearing. I have to note that even with V-Sync disabled, the AMD GPU load hovers around 70%; it's criminally underutilized. I would love to use adaptive V-Sync if it worked in OpenGL with an AMD GPU, but it doesn't. So we are back to the main point: AMD's OpenGL implementation is the problem. Nothing works there, and it's slow and unoptimized as hell.
4) CPU/GPU. I'm running a rather old system with an i7-3770K and an RX 580 BE 8GB. I used to have a GTX 680, and my first Wolfenstein playthrough on it was a blast: a constant 60 FPS all the time. That's why I was shocked that the RX 580 can't handle it.
Thank you again, I'm really enjoying this conversation!
Hi @uladz
Excuse me for only replying now; I didn't realize there was a new comment.
I agree with you that it is mostly not a CPU bottleneck, and I am experiencing exactly the same issue. Sorry for writing such a long post.
That i7 3770K should have no problem with any of the Wolfenstein games, even at stock speeds, thanks to its good IPC. At least I can tell you that your performance in Wolfenstein II: The New Colossus will be great, since it uses Vulkan properly, just like DOOM (2016) does.
I get 100+ FPS all the time at Uber settings in Wolfenstein II with my RX 480 and FX 8350, but I agree that doesn't excuse Radeon's OpenGL performance at all. The same goes for their DX11 performance, which still doesn't seem to support DX11 multithreaded rendering properly.
At least your i7 3770K should brute-force through most DX11 Radeon issues, and I would have expected it to do so in the older Wolfensteins as well. I can't remember whether you have an SSD, but another thing that might help is keeping Wolfenstein on an SSD and creating the texture cache folder for the game in the local AppData folder.
What I wish I knew is why Battlefield 4 and Alien vs Predator run insanely well on my system in DX11 while other games such as Far Cry 3-5 don't.
Sorry, I originally had to type "3-7-7-0" because this site kept changing "3770" to "Spam", no joke. Absolutely ridiculous. Adding a 'k' after the number gets around it: i7 3770K.
Kind regards