
    Why modern HDTVs still aren’t as good as monitors for gaming

    kingfish

      "" Modern televisions are vastly more capable than the CRT-based sets of 20 years ago. Like PCs, they use HDMI, support the same resolutions (720p, 1080p, and 4K), and are often advertised as supporting refresh rates of 120-240Hz. While there are a handful of >60Hz monitors available on the market, these often command substantial premiums compared to regular old 60Hz displays. In theory, a 4K TV could make a great gaming display. The reality is considerably more complicated.

      First, there’s the issue of input lag. Input lag is the delay between pushing a button on a controller, mouse, or keyboard and seeing the result of that action on-screen. While it’s often discussed as an issue that impacts console gaming, anyone considering an HDTV for PC gaming will have to contend with it as well. One of the problems with HDTV gaming is that input lag on even the best TVs is still higher than on top monitors. The fastest monitors add 9-10ms of input latency, while the best HDTVs are around 17-18ms.

      To be clear, 17-18ms isn’t bad at all (it rates as “Excellent” on DisplayLag.com’s official ranking system), and if you aren’t playing high-speed FPS or RTS titles, you might not notice the higher input lag at all. Civilization doesn’t exactly rely on fast-twitch gaming, after all. Plenty of TVs, however, don’t even clear the 40ms bar that DisplayLag qualifies as “Great.” Input lag can sometimes be improved by adjusting settings within the TV’s various submenus, but this varies by model and manufacturer. The vast majority of manufacturers don’t list or provide input lag information; it’s something you typically have to check at third-party sites like DisplayLag.
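      To put those numbers in context, here’s a quick back-of-the-envelope sketch that converts input lag into frames of delay at 60Hz. The lag values are the ballpark figures quoted above, not measurements of any particular model:

      ```python
      # Back-of-the-envelope: how display input lag compares to the length
      # of one frame at 60Hz. Figures are the ballpark numbers quoted above.

      FRAME_TIME_60HZ_MS = 1000 / 60  # ~16.7ms per frame at 60Hz

      displays = {
          "fastest monitors": 10,           # ms
          "best HDTVs": 18,                 # ms
          "DisplayLag 'Great' cutoff": 40,  # ms
      }

      for name, lag_ms in displays.items():
          frames = lag_ms / FRAME_TIME_60HZ_MS
          print(f"{name}: {lag_ms}ms = about {frames:.1f} frames of delay at 60Hz")
      ```

      Viewed that way, the best HDTVs sit at roughly one full frame of extra delay, while a 40ms set is closer to two and a half.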

      Next, there’s the issue of overscan. Overscan refers to the practice of not displaying all of an available image on the actual screen. It’s a holdover from the pre-LCD era, when there was no way to guarantee that every television set would display content in precisely the same fashion. The solution to this problem was to zoom the final output slightly, creating a small border around the edges of the screen. Modern LCDs don’t have much use for overscan, but it’s still enabled by default on many displays. Whether or not you can disable it depends on what kind of TV you have; some older LCDs may not offer the option to disable overscan at all. Graphics cards from AMD and Nvidia can compensate for overscan in software, but this may result in less-than-ideal text and font rendering.
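      As a rough illustration of why that compensation softens the image, here’s a minimal sketch. The 5% overscan amount is an assumption for the example; the actual figure varies by set:

      ```python
      # Minimal sketch of why GPU-side overscan compensation blurs fine detail.
      # The 5% overscan amount is an assumption for illustration; real sets vary.

      NATIVE_W, NATIVE_H = 1920, 1080
      OVERSCAN = 0.05  # fraction of the image the TV crops at the edges

      # To keep the whole desktop visible, the GPU shrinks its output so it
      # fits inside the region the TV actually displays.
      visible_w = round(NATIVE_W * (1 - OVERSCAN))
      visible_h = round(NATIVE_H * (1 - OVERSCAN))

      print(f"{NATIVE_W}x{NATIVE_H} desktop squeezed into {visible_w}x{visible_h}")
      # -> 1824x1026: desktop pixels no longer map 1:1 onto physical pixels,
      #    so text and other fine detail get resampled and look soft.
      ```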

      Finally, while some televisions can genuinely refresh at 120Hz, advertised figures often refer to motion-interpolation features rather than the panel’s native rate, and this varies by manufacturer. This article from CNET explains the rules of thumb for a number of companies and how to determine what the actual refresh rate is.

      No support for Adaptive Sync / FreeSync / G-Sync

      This last point is aspirational, but if you’ve spent any time with a monitor that supports Adaptive Sync (the official VESA name for what AMD calls FreeSync) or G-Sync, you’re aware of how awesome the feature is for gaming, even when you’re playing at 60 FPS. The lower the frame rate, the more FreeSync / Adaptive Sync helps, since smooth frame delivery matters more as the gap between frames grows. For example, 30 FPS titles deliver one frame every 33.3ms, while 60 FPS titles deliver one frame every 16.7ms, assuming constant frame times.
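      The frame-time arithmetic is simple enough to check directly:

      ```python
      # Frame delivery interval at common frame rates, assuming perfectly
      # constant frame times (as in the example above).

      for fps in (30, 60, 120):
          print(f"{fps} FPS -> one frame every {1000 / fps:.1f}ms")
      # 30 FPS -> 33.3ms, 60 FPS -> 16.7ms, 120 FPS -> 8.3ms
      ```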


      And much more > ARTICLE

        • Re: Why modern HDTVs still aren’t as good as monitors for gaming
          black_zion

          Another thing to consider is the pixel pitch of a computer monitor vs. an HDTV. Put 1920x1080 (or better yet, 1920x1200, 16:10 baby yea) across a 22" monitor and it’s going to look sharp and crisp (assuming the monitor isn’t some bargain-basement D-grade display from some unknown Chinese firm). Take that same resolution and stretch it across a 42" or 46" HDTV, and image quality is going to suffer when it comes to the fine details of a video game. How many threads asking "why is my image (or text) blurry on my HDTV?" have we all seen across every forum?
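          A quick pixel-density calculation bears that out (same resolution, different diagonals):

          ```python
          # Pixels per inch (PPI) for 1920x1080 at the panel sizes mentioned above.
          import math

          def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
              """Pixel density of a panel with the given resolution and diagonal."""
              return math.hypot(width_px, height_px) / diagonal_in

          for diag in (22, 42, 46):
              print(f'1920x1080 on a {diag}" panel: {ppi(1920, 1080, diag):.0f} PPI')
          # ~100 PPI at 22", ~52 PPI at 42", ~48 PPI at 46" -- the same pixels
          # spread over roughly four times the area, which is why fine detail blurs.
          ```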


          UHD may help solve it, but there are also differences in the panels. Televisions are made to be viewed from 12 feet or more away, whereas a computer display is intended to be viewed from 5-6 feet. Televisions are made to display pictures and large-font text, whereas a computer display is built to display everything from pictures to fine small text. Televisions are built to display an "OK" range of colors where deep blacks are the most important factor, whereas computer displays are built to blow sRGB out of the water and take a bite out of AdobeRGB. There’s a reason a top-of-the-range 49" UHD smart TV, like the LG Electronics 49UF7600 49-inch 4K Ultra HD Smart LED TV on Amazon, is basically the same price as a top-of-the-range 27" UHD display, like the LG 27UD88-W 27" UHD LED monitor.

            • Re: Why modern HDTVs still aren’t as good as monitors for gaming
              persistentone

              Does the trend in UHD TVs toward HDR support alleviate this at all? Is there a way for the PC’s graphics adapter to communicate HDR to the TV?


              What about the UHD "deep color" issue? Is there any way for a DisplayPort 1.2a card on the PC to control deep color on a high-end UHD TV? A lot of those TVs claim to have very detailed color control, but it’s not clear whether any of that is being done in ways compatible with PC standards.