    30fps looks bad on 75Hz monitor

    faqu

      I know this "issue" isn't related to my AMD graphics card or to my monitor, but why does 30 FPS tend to look worse on 75Hz monitors than on 60Hz monitors? I've done some testing myself, and 30 FPS gaming looks way smoother at 60Hz than at 75Hz (same monitor). I couldn't find any info about this on the internet, so I decided to ask the question here.

        • Re: 30fps looks bad on 75Hz monitor
          kingfish

          What exact monitor do you have? Are you referring to the "Max Reported Refresh Rate" of 75Hz?

          • Re: 30fps looks bad on 75Hz monitor
            goodplay

            Maybe you're looking for info on how extra (or fewer) frames have to be shown to make the frame rate divide evenly into the refresh rate, and what effect that has?

            Some keywords to get you started: motion interpolation / soap opera effect / 3:2 pulldown.

             

            Edit: pulldown (not dropdown).

            • Re: 30fps looks bad on 75Hz monitor
              leyvin

              It looks worse for a simple reason... digital displays (TFT, LCD, OLED, etc.) don't draw the image scanline by scanline; instead, all of the pixels are updated simultaneously on every refresh cycle.

              Now, to keep this relatively simple, I won't dive into Adaptive Sync, which is certainly an interesting solution to the fixed refresh nature of digital displays... for the moment I'll focus on your specific issue.

               

              Namely, that for some reason 30 FPS "feels" slower at 75Hz than at 60Hz... well, it feels that way because it actually is.

              As I said initially, digital displays aren't rendering in scanline order; rather, they display the entire screen every time they refresh. This means a 1080p display at 75Hz is updating 75 times per second; and no, that isn't just gaming terminology, that's LITERALLY what's happening.

              If the game you're playing is running at 30 frames per second (I assume due to either a 30 FPS limiter, V-Sync, or just on average), then the game is refreshing the display buffer once every 2.5 display updates (75 / 30 = 2.5).
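              A quick sanity check of that 2.5 figure (a minimal Python sketch; the only inputs are the 75Hz / 60Hz / 30 FPS numbers from this thread):

                  refresh_hz = 75   # display updates per second
                  game_fps = 30     # frames the game produces per second

                  print(refresh_hz / game_fps)   # 2.5 -> frame boundaries fall between refreshes
                  print(60 / game_fps)           # 2.0 -> at 60Hz every frame spans exactly 2 refreshes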

               

              Now, without V-Sync this would actually cause a fairly consistent mid-screen tear... but with V-Sync enabled it gets a little more complicated, because the display WON'T present at partial refresh intervals... so what you actually get is a 3:2 frame interleave.

              As in, the first frame is displayed for 3 refreshes, the second frame for 2. On the surface, sure, 30 frames are still being displayed each second, but the alternating inconsistency will make it FEEL as if you're only getting 25 FPS (i.e. slower).

              Switching down to 60Hz puts 30 FPS at a clean 2:1 ratio, so it feels more fluid and natural, as it were... and thus faster, with less apparent latency.
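              If you want to see that cadence directly, here's a minimal Python simulation of the hold pattern described above (illustrative only; it assumes V-Sync simply holds each frame until the first refresh at or after the frame's due time):

                  import math

                  def vsync_holds(game_fps, refresh_hz, frames=6):
                      # Refreshes each game frame stays on screen: frame i is held
                      # until the first refresh at or after i / game_fps seconds.
                      holds, shown = [], 0
                      for i in range(1, frames + 1):
                          next_shown = math.ceil(i * refresh_hz / game_fps)
                          holds.append(next_shown - shown)
                          shown = next_shown
                      return holds

                  for hz in (75, 60):
                      holds = vsync_holds(30, hz)
                      on_screen_ms = [round(h * 1000 / hz, 1) for h in holds]
                      print(hz, holds, on_screen_ms)

                  # 75 [3, 2, 3, 2, 3, 2] [40.0, 26.7, 40.0, 26.7, 40.0, 26.7] -> uneven: judder
                  # 60 [2, 2, 2, 2, 2, 2] [33.3, 33.3, 33.3, 33.3, 33.3, 33.3] -> even: smooth

              The 3-and-2 alternation is the same mechanism as the 3:2 pulldown goodplay mentioned, just applied to a 30 FPS game on a 75Hz panel instead of 24 FPS film on a 60Hz one.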

               

              I'd assume the display still supports 60Hz, so just run it at that when you're gaming.
