When playing Wolfenstein: The Old Blood, I experience heavy tearing when looking around, whether FreeSync is enabled or disabled in CCC, and even though the game runs at 60 fps most of the time.
When I enable vsync in-game, the tearing disappears, but I thought FreeSync was meant to avoid tearing without the disadvantages of "traditional" vsync. Any ideas what I could do about this?
Each FreeSync display has its own effective range of refresh rates. If your frame rate goes above or below that range, the display reverts to normal behavior, which means tearing; with vsync enabled, it reverts to normal vsync behavior instead.
For example, let's assume your display has an effective FreeSync range of 40Hz to 60Hz. This means if your frame rate goes above 60fps, you're going to get tearing unless you turn on vsync. With double-buffered vsync, the GPU sits idle once it has rendered the next frame, until the monitor refreshes and the buffer swap gives it a new frame buffer to render into. With triple-buffered vsync, it has two buffers to fill before going idle.
If your frame rate goes below 40fps, you'll once again get tearing without vsync, because some of your screen refreshes are going to be split between one frame and the next. With vsync and double buffering, the frame rate will drop until the required frame time fits into a multiple of the screen's refresh interval (16.67ms). For a 60Hz display, the sequence is 60, 30, 20, 15, 12, 10 and so on into even more unplayable territory. That means with double buffering and vsync enabled, you'll drop straight from 40fps to 30fps. With triple buffering, the GPU doesn't have to start over from scratch at each screen draw, so the frame rate can be any value from 1 to 40. It will simply repeat the previous frame for each refresh cycle that comes up while the GPU hasn't delivered a new completed frame. The result can be a fair bit of judder, but no tearing, as each frame drawn to the screen is complete.
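The arithmetic behind that frame-rate ladder can be sketched in a few lines. This is just my own illustration of the rule described above (the function name and interface are made up, not anything from a real driver API): under double-buffered vsync, a frame stays on screen for a whole number of refresh intervals, so the effective rate is the refresh rate divided by that count.

```python
import math

def vsync_rate(refresh_hz: float, frame_time_ms: float) -> float:
    """Effective frame rate under double-buffered vsync (illustrative sketch)."""
    interval_ms = 1000.0 / refresh_hz  # one refresh interval: 16.67ms at 60Hz
    # Number of whole refresh intervals each frame occupies on screen
    n = max(1, math.ceil(frame_time_ms / interval_ms))
    return refresh_hz / n

# A 25ms frame (40fps uncapped) at 60Hz drops straight to 30fps:
print(vsync_rate(60, 25))  # → 30.0

# The ladder of possible rates at 60Hz: 60, 30, 20, 15, ...
print([60 / n for n in range(1, 5)])  # → [60.0, 30.0, 20.0, 15.0]
```

This is why double buffering can't give you, say, 40fps on a 60Hz display: 25ms doesn't fit into one 16.67ms interval, so every frame ends up occupying two.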
If your problem is going above 60fps, not below, then just enabling vsync without triple buffering is what you want. Another alternative is to set a frame rate limit (via either FRTC or a third party tool like RadeonPro) so that you never go above 60fps. I think the results with vsync would probably be better.
If your problem is going below 40fps (or whatever the lower FreeSync limit is), then vsync is your only viable solution. Without triple buffering, you'd drop straight to 30fps, and assuming your frame time stayed below two refresh intervals (33.33ms), it'd be a steady 30fps until the scene allowed rendering faster than 40fps, putting you back into FreeSync territory. With triple buffering, you'd get an arbitrary rate from 40fps down to slideshow, but the uneven spacing of new frames would likely produce judder while panning the virtual camera. Which trade-off you'd rather live with is down to preference. The only other option would be to lower your detail settings so you never drop below 40fps at all.
Some of that advice would change a bit if your actual display isn't effective from 40Hz to 60Hz. For example, a 144Hz display that's effective from 40Hz to 144Hz would experience much less judder with triple buffered vsync below 40fps, due to the smaller refresh interval allowing new frames to be more regularly spaced.
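To summarize the behavior described above in one place, here's a minimal sketch (names and interface are my own, purely illustrative) of what you see for a given frame rate, FreeSync range, and vsync setting:

```python
def display_behavior(fps: float, low: float, high: float, vsync: bool) -> str:
    """What the screen does for a given frame rate and FreeSync range (low..high Hz)."""
    if low <= fps <= high:
        return "freesync"  # refresh tracks the frame rate: no tearing, no vsync penalty
    # Outside the effective range, the display falls back to normal behavior
    return "vsync" if vsync else "tearing"

# Assuming the 40-60Hz example range from above:
print(display_behavior(50, 40, 60, vsync=False))  # → freesync
print(display_behavior(70, 40, 60, vsync=False))  # → tearing
print(display_behavior(30, 40, 60, vsync=True))   # → vsync
```

The point is that FreeSync only helps inside its range; everything outside it behaves exactly as it would on a non-FreeSync display.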
Thank you for this very detailed reply. How do I find out the FreeSync range of my display? It's an LG 35UM67-P. The product website states a vertical refresh rate of 56-61 Hz, but I hope that is not the FreeSync range. On Amazon, someone mentioned values from 48 to 75 Hz (which isn't much of a range, either).
If Steam's FPS overlay is correct, the game is running at 60 fps most of the time, but the tearing occurs nonetheless.
As near as I can tell, 48Hz to 75Hz is the correct range for that monitor. Which is moderately puzzling, since it's presented as a 60Hz display. Assuming your refresh rate is set to 75Hz, what that means is that when your frame rate drops below 48, the monitor will refresh at 75Hz and you'll get tearing. Enabling double buffered vsync would drop you to 37fps and then 25fps. If you're running at 60Hz (i.e. 60Hz max refresh rate, making your effective range 48Hz to 60Hz), then just replace 40 with 48 in my previous post.
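As a quick check of the arithmetic behind those figures (just illustrating the same refresh-divided-by-n ladder for a 75Hz refresh rate):

```python
# Double-buffered vsync rates at 75Hz are 75 / n; the 37fps and 25fps figures
# above are 37.5 rounded down, then 25.
print([round(75 / n, 1) for n in range(1, 4)])  # → [75.0, 37.5, 25.0]
```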
What I'd recommend is setting your display to 75Hz and enabling vsync with triple buffering. If the game doesn't support the option itself, use RadeonPro or D3DOverrider to force the setting. The vsync option in CCC has done nothing for years now, though it's conceivable that it actually does now with FreeSync, so it doesn't hurt to try it.
Thanks again for your answer. Do you have an idea why I have tearing even though Steam's fps counter shows the game is running at 60 fps?
I tried to limit the fps to something below 60 just to see if this has any effect, but the game ignored the fps limit set in the CCC. Seems that frame rate control is for DirectX games only. Not sure if Wolfenstein uses OpenGL instead.
That title uses the id Tech 5 engine, which is OpenGL and hard-coded to a 60fps limit (without the developer explicitly allowing more with a build option). That's not 60fps with vsync, mind you, so there will still be tearing absent vsync or FreeSync.
And it's possible that FreeSync simply doesn't work with OpenGL. The only information I could find on that point was a mention that the addition of CrossFire support to FreeSync in the latest driver only covers DX10-and-up titles. If you're using CF, then you can be reasonably certain that FreeSync is not working with that game. If not, then you can at least consider it plausible that GL support has not been implemented. I'd suggest contacting technical support for your graphics card and monitor, and pressing them for a definitive answer.
In the meantime, just use vsync for that game.