Normally Radeon Chill doesn't add any input lag while keeping your FPS between the set values. I want to get the max refresh out of my monitor without adding input lag. FRTC adds input lag and doesn't work well with FreeSync, and VSync/Enhanced Sync add microstutters and input lag. If I set my Chill min FPS to 72 and my Chill max FPS to 72, it SHOULD allow me to cap my FPS without any additional input lag or stutters. If I understand how Chill and FRTC work, FRTC adds input lag because it limits the FPS the GPU is producing, whereas Chill limits FPS by changing how many frames the CPU requests from the GPU (or something like that; I don't know exactly how it works, but this is how it was explained to me). If this is true, I should be able to set Chill min/max to 72 FPS and it will cap my FPS and keep it within my FreeSync range without any added input lag, correct?
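For what it's worth, the "CPU requests fewer frames" idea can be sketched as a simple pacing loop: instead of throttling frames after the GPU has rendered them, the CPU waits before submitting the next one, so nothing extra piles up in the GPU's queue. This is only a rough illustration of that concept, not AMD's actual implementation; the function names and structure are made up for the example.

```python
import time

def frame_limited_loop(target_fps, n_frames, render=lambda: None):
    """Illustrative CPU-side frame limiter (NOT AMD's real algorithm).

    The CPU paces its own frame submissions: it sleeps until the next
    frame deadline before issuing draw calls, rather than letting the
    GPU render ahead and capping the output afterwards.
    """
    frame_time = 1.0 / target_fps
    next_deadline = time.perf_counter()
    for _ in range(n_frames):
        render()  # stand-in for issuing one frame's draw calls
        next_deadline += frame_time
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # wait instead of queueing another frame

start = time.perf_counter()
frame_limited_loop(target_fps=72, n_frames=36)  # 36 frames at 72 FPS ≈ 0.5 s
elapsed = time.perf_counter() - start
print(f"36 paced frames took {elapsed:.2f} s")
```

Since the wait happens before the frame is even submitted, the frame that does get rendered is as fresh as possible, which is the claimed reason this style of cap adds less latency than a GPU-output cap like FRTC.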
You may find this information from the support page helpful. How to Tune GPU Performance Using Radeon Wattman and Radeon Chill
It definitely seems that all cards and all monitors don't work the same when it comes to Chill, FRTC, and FreeSync.
I know for me it's best to just disable Chill, as I get the stutters too. No matter what combination I use, as long as Chill is on it just doesn't play nice. So I just use FRTC and FreeSync. On my setup they are working together; I just have to limit the frames to one below my monitor's max. Mine is 75, so I set FRTC at 74. It works okay and there's definitely less lag than with VSync enabled.
After doing some looking around and reading this article by BlurBusters, I found out that the way to run FreeSync/G-Sync with no tearing and the lowest input lag is to cap FPS at 3 below the refresh rate, disable VSync in the game, and enable it in the drivers. Use the in-game FPS cap if possible, or RTSS if the game doesn't have one. Here is the article: https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/
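The rule of thumb from that article is simple arithmetic, but it's easy to forget per monitor, so here's a tiny sketch of it applied to a few common refresh rates (the numbers below are just the refresh rates, not anything from the article beyond its "about 3 FPS below" guideline):

```python
def recommended_cap(refresh_hz):
    # BlurBusters' rule of thumb: cap roughly 3 FPS below the refresh
    # rate so the frame rate stays inside the variable-refresh range.
    return refresh_hz - 3

for hz in (60, 75, 144):
    print(f"{hz} Hz monitor -> cap at {recommended_cap(hz)} FPS")
```

So for the 75 Hz monitor mentioned above, this guideline would suggest a 72 FPS cap rather than 74.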
I run Chill but cap it from 50–60 and leave it at that, and cap my frame limiter to 60+, no issues. I haven't experienced lag or input lag with Chill enabled, and it's enabled for all games in Global settings. I'd rather have it active; it saves an accidentally baked video card. I've found that with Chill set up right you shouldn't notice input lag, if any at all, but I am running on an ancient monitor. I've noticed a lot of Chill issues come from it being set far too high or far too low. I'd rather enable it lower and give myself a 20-frame buffer on the higher limit, which in theory gives the card active cooling under load the whole time while gaming. Temps barely reach 60°C under heavy load with my RX 560 OC 4G playing The Witcher 3, and even Warhammer: Vermintide 2 runs silky smooth, surprisingly, on 18.2.1 drivers. To each their own on whether to enable it, but if your monitor isn't affected by Chill being activated, I recommend running it; it's handy to have. One more thing: when running Chill, find a game you play a lot and adjust and tweak to suit your gaming needs based on what you notice. I run it and recommend it as a safeguard, especially in demanding games where a bit more control over cooling helps a lot.
You really piqued my interest in playing with my settings again. I hadn't given Chill a chance in the last two releases. I kicked it back on, set my min and max a couple of frames in from the limits, and it was working beautifully. Substantially less lag in Battlefield 1.
Maybe, as long as you know you won't dip below the low 50s, raise the min up to 50. See if that pushes the average up too. Just a thought. I have only done this with two Battlefield games, but my monitor is a 60 Hz with a FreeSync range of, I think, 48 to 78, so anything around 60 is really optimal for mine.
That's all I do: cap frames at a higher steady rate. My normal frames are capped at 60 FPS as it is. The 20-frame buffer from high to low works well, with no input lag even on an ancient monitor. I sit on a steady 60 all the time in all games; the high/low settings make Chill kick in a little faster, holding temps really steady under heavy load, since it's implemented so you don't get thermal throttling. Works like a charm. I even set the boy's R7 360 OC 2G the same way and he's pulling ultra settings in GTA 5 with ease, no lag. I use it, I recommend it, I swear by it, it's worth using, but Chill is more for keeping thermals at optimal temps. I use those settings because I don't want to push my card any harder and I like my steady 60 FPS. The Witcher 3 on medium-high runs like butter, and so does Vermintide 2 with Chill set from 50 low to 70 high and the main frame cap at 60, no issues. I'll screenshot my settings to give an example of how I use it and how well it works; I play a variety of games, among other things. I did a lot of reading, research, and fine-tuning to get my settings where they are. I run Killing Floor 2 on ultra with all game settings manually set in Global settings and then tuned the same in game. Don't knock it till you try it; it just comes down to finding where it holds steady and you're happy. And I'm still on 18.2.1 drivers, to boot.
I have the same experience. One particular title that really heats the GPU in my GL702ZC is Vermintide 2. The FreeSync range of the built-in monitor is 40–60 Hz. If I set Chill to exactly this range (with FRTC), I would expect the game to move between 40 and 60 FPS. However, it hovers around 50–52 FPS. Enabling/disabling VSync in game makes no difference. I thought this was just what the GPU was capable of. However, if I change the Chill range to 40–90, it immediately surfs at a stable 60 FPS. Why is that?
Please change the Chill algorithm so it actually hits the high end of the range when possible. Rendering at a stable 60 FPS quickly throttles the system in action-packed scenes, and the frame rate drops to ~50 FPS in those cases (not ideal, but understandable). I as a user can decide whether system noise bugs me enough to pull the high end of the Chill range below my max FreeSync range. I should not have to set 90 on a 60 Hz monitor to get optimal performance; at the very least, it's counterintuitive.
I have since stopped using Chill and switched to VSync. I am not noticing any input lag in any of my games, and AMD themselves have said that this is the proper way to use it. I was unconvinced until I saw a graph showing that VSync off and VSync on have the same input lag with FreeSync at the same FPS (at 60 FPS they have the same input lag; the only reason VSync off gets better input lag is that it can run above 60 FPS). This is consistent with what AMD Robert said: VSync caps input latency and frame rate at the max refresh, but doesn't add any additional latency the way FRTC does. I have tried it out and it seems to be performing as advertised. See the link below for the chart I found.