
Steam.......HDR for PC games is a hot mess (but it's nice when it works)

Tomorrow's display tech isn't quite ready for prime-time

A few of the things I have had to do in order to get a workable version of HDR (also known as high dynamic range), the new-ish display technology that significantly ramps up brightness, darkness and vibrancy, on my PC (not including the acquisition of a fancy monitor):

– Try four different display cables
– Adjust as many as seven different brightness/contrast/colour etc sliders per game. (I have spent long, unhappy hours doing this to date)
– Manually turn on HDR on the monitor, manually turn HDR on in Windows then manually turn on HDR in the game settings. Or sometimes HDR off in Windows but on in the game then alt-tab back to Windows and turn HDR on, and off, and on, and off. Or sometimes alt-tab and alt-tab and alt-tab and alt-tab and alt-tab until HDR suddenly, randomly kicks in. When I exit the game, I have to manually turn it all back off again or Windows is unusable.
– Install an unfinished preview build of Windows 10 whose HDR isn’t totally broken on Nvidia cards.
– Almost completely lose my sense of whether anything is actually different after all of this.

The egg yolks in Final Fantasy XV were a bit shinier, though.

Important note: any HDR images in this piece are photographs of a game running on an HDR monitor, purely there to very slightly demonstrate the contrast between light and dark, as an actual HDR screenshot can’t be viewed correctly on a non-HDR screen. These images cannot replicate the sense of brightness or darkness.

HDR is, for my money, a far more exciting display tech for games than is 4K, whose mammoth pixel counts often don’t truly come into their own unless you’re using an absurdly large TV. I know others – those who demand the sharpest possible image – will disagree, but I’ve found 4K simply pleasant rather than revelatory.

HDR though – if all the stars align – boosts the sense of depth and presence in games (and films) in a way that a mere pixel count cannot. It’s a tricky thing to describe, but alas even trickier to demonstrate unless you already have an HDR screen to hand – and if you do, you’re probably the one boring to tears everyone who doesn’t know what ‘peak brightness’ means.

What HDR does, in a nutshell for the uninitiated, is expand the range of brightness, contrast and colour displayed, bringing it more in line with what the human eye can perceive in meatspace. Brighter whites, darker blacks, more shades of colour and, all told, you get a more true-to-life picture. In theory.
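To hang some rough numbers on that ‘in theory’ bit: HDR10 signals are encoded with the SMPTE ST 2084 ‘PQ’ transfer function, which can describe luminance all the way up to 10,000 nits, where SDR content is mastered at around 100. The little Python sketch below (my own illustration, not something from the article) simply evaluates that curve at a few code values.

```python
# A minimal sketch of the SMPTE ST 2084 "PQ" curve used by HDR10, purely to show
# how much brightness range the signal can describe compared with ~100-nit SDR.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    """Convert a normalised PQ code value (0..1) to absolute luminance in nits."""
    p = code ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

for code in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ code {code:.2f} -> {pq_to_nits(code):8.1f} nits")
```

Roughly half of the signal range sits below about 100 nits – broadly where SDR tops out – and everything above that is headroom for highlights.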

Hitman HDR (photograph of screen)

In practice, there are four ways it can work.

  1. The above, basically, and you fall in love with it and become absolutely convinced that watching SDR, or standard dynamic range, content, is the most cruel and unusual punishment you’ve ever known, even though you were perfectly happy with it for decades before you ever heard of HDR. People who’ve gone out and dropped five grand on an enormous OLED TV often feel this way, though for me in my tiny terraced house it was the AMOLED screen on my Samsung S8 phone. (Though it probably says much that I ended up sticking with my three-year-old, non-AMOLED phone after the S8 suddenly died a few months back).
  2. You spend your time squinting at the screen in search of almost imperceptible differences in how bright the neon signs in a bar scene in Jessica Jones look, and being too distracted by this to actually enjoy the programme as a result.
  3. The neon signs in a bar scene in Jessica Jones are so blindingly bright that it’s like watching TV with a bunch of people shining laser pointers into your eyes.
  4. Everything apart from the neon signs in a bar scene in Jessica Jones looks washed-out and gloomy.

Scenarios 2-4 invariably lead to the sort of obsessive and painstaking tinkering usually reserved for audiophiles, Linux users or people who wear fascinators to posh racing events. It can, like those comparisons, end up being a gateway drug to a permanent state of gnawing dissatisfaction with the thing you’re supposed to be watching/playing/have squatting on your head like a technicolour spider.

However, over in console-land, the PS4 Pro and Xbox One S and X do at least make their end of the HDR bargain work out of the box, meaning your endless voyage through The Menus Of Misery is at least restricted to the TV’s settings.

This is presuming you’ve bought yourself a TV with the right sort of HDR, and enough peak brightness to actually make it worthwhile, and oh boy is that its own minefield – which I am going to sidestep somewhat here as we generally play our PC games on monitors rather than tellies. In short, don’t buy a telly unless it definitely, definitely supports something called HDR10, and do your research to check it’s not fibbing before you buy. If at all possible, get one with an Ultra HD Premium sticker on it, because that’s a proper standard and is currently the best thing we’ve got to separate the HDR wheat from the HDR chaff.

In terms of PC, though, I mostly need to scream this: “@#$&&^%*?”

Windows 10 HDR settings (screenshot)

Now, Windows 10, which I am often something of an apologist for, makes many things that used to be tricky about using a PC an awful lot easier. But that does not change the fact that Microsoft’s ship is still slow to turn when it comes to new technologies. HDR has been officially supported by Windows 10 on an operating system-wide level since last year, but it’s been restricted to merely On or Off and, frankly, looks absolutely hideous for everything except HDR games or videos even if you do get the latter stuff working well.

We’re talking insipidly grey whites and a general sense of looking at your screen through a piece of tracing paper. Loading up an HDR video from YouTube or Netflix (big caveats, which I’ll come back to) means a burst of joyous colour in the sea of gloom, and it’s hard to say if that’s entirely because of the HDR or just because it’s the only thing on the screen that looks like stuff normally looks.

HDR, then, is simply not fit for purpose on the Windows desktop yet. There are real technical reasons for that, to do with the different ranges of colour used by HDR and SDR applications that I won’t go into here because I’d probably embarrass myself, but on the other hand Android, iOS and the PS4 and XBone operating systems manage to not make everything that isn’t HDR look like crap on an HDR screen.

Resident Evil 7 HDR (photograph of screen)

Unfortunately, even some HDR stuff looks bad too. Netflix, for instance, can only run in HDR if you have the 4K package subscription, use the Edge browser and have a seventh-generation or later Intel Core processor and/or an Nvidia GTX 10-series graphics card. Throw an HDR monitor into the mix and the total package to watch 4K HDR video costs significantly more than a big 4K smart TV – and, in my experience, with far less satisfying results. There are some issues between Windows 10 and Nvidia drivers when it comes to HDR, with the latter laying the blame squarely at the feet of the former.

In the most recent public update to Windows 10, you can expect this to mean colour and brightness issues with HDR all over the shop, including a washed-out Netflix and deeply unpredictable games. YouTube HDR was OK, as were downloaded HDR videos played in VLC, but Netflix was lousy and most of the games I tried looked as pale as a student on results day. I also had to cycle through four different, increasingly expensive HDMI and DisplayPort cables before I found one that worked – no pound shop treasures will do the trick here.

My dream of just flicking a button and being presented with a glorious wash of new colour died on the spot.

There were two semi-solutions to this, neither of which was quick or easy, and neither of which was ultimately that rewarding. The first was to opt into Windows 10’s Insider builds (you’ll find the option in Windows Update), which, after a lot of waiting and worrying, installed a preview of an upcoming version of the OS, replete with improved HDR support which plays a little better with Nvidia cards. Clearly, being on a glorified beta of an operating system isn’t ideal, but, for what it’s worth, I’ve had no stability problems after the better part of a week with it.

The new build of the OS offers a new slider that enables you to boost the brightness of SDR desktop content while HDR is turned on, thus reducing that pallid grey effect. It still looks bad, but it is at least usable where the live build is not. There are a lot of issues affecting whether or not this situation can ever be made perfect, which relate to the different colour spaces of HDR and SDR and a type of, sort of, colour compression required over the current version of HDMI in order to retain the bandwidth for a 4K/60fps signal, but I live in hope that something better could come along later. HDR is a lot newer in monitor-land than it is in TV-land, with few supporting screens available at the moment, but that’s beginning to change fast, and hopefully Microsoft will roll with it as it does.
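For a rough feel for that bandwidth squeeze (my own back-of-envelope arithmetic, not anything from Microsoft or the article): HDMI 2.0 tops out at 18 Gbps, of which roughly 14.4 Gbps is usable after encoding overhead, and a 4K/60 signal at 10 bits per channel with full chroma wants more than that – hence the subsampling.

```python
# Rough back-of-envelope arithmetic for why 4K/60 HDR over current HDMI needs
# chroma subsampling ("colour compression"); exact signal packing is ignored.
PIXEL_RATE_4K60 = 4400 * 2250 * 60        # CTA-861 4K60 timing incl. blanking ~= 594 MHz
HDMI20_USABLE   = 18e9 * 8 / 10           # 18 Gbps raw link, minus 8b/10b encoding overhead

def needed_gbps(bits_per_pixel):
    return PIXEL_RATE_4K60 * bits_per_pixel / 1e9

for label, bpp in [("8-bit RGB (SDR)",                      24),
                   ("10-bit RGB (HDR, full chroma)",        30),
                   ("10-bit YCbCr 4:2:2 (HDR, subsampled)", 20)]:
    need = needed_gbps(bpp)
    verdict = "fits" if need <= HDMI20_USABLE / 1e9 else "doesn't fit"
    print(f"{label:40s} ~{need:4.1f} Gbps  {verdict}")
```

Which is why, on current cables, HDR so often turns into an either/or against other things you might want, like higher refresh rates.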

Netflix’s Jessica Jones

The second thing I had to do was dive deep into Nvidia’s control panel and manually fiddle, across the space of hours, with the brightness, contrast, gamma and vibrancy sliders until I found a not entirely happy medium between totally washed-out and dark or light areas being blown-out, masking all detail. You would not believe how long I spent staring at a paused image of Jessica Jones’ black hair, trying to find the teeny, tiny point at which it stopped looking dark grey but before all the follicles disappeared into a sea of obsidian. This was not, I assure you, a sexy time.

Even with this sweet spot found, the likes of the aforementioned neon bar sign certainly popped in a way it didn’t in SDR, but everything that wasn’t a light continued to look less vibrant than watching the whole thing in SDR did. I felt a bit like an 80s dad who’d spent a marathon weekend tuning his hi-fi system just so, but who then had to pretend to everyone including himself that it actually sounded any better.

The same is true of many HDR games. I know I’ve been round the houses in getting to this point, but I wanted to establish a context for how miserable setting HDR up on PC is even before you get to the point of firing up a game that supports it. There aren’t many HDR games yet – but they are snowballing and I think, by the end of the year, we’ll be close to taking it for granted that any big new release will have at least some version of HDR. As it stands, I divide HDR PC games into three categories:

1) Those with their own internal brightness settings to help you get ’em looking suitably eye-searing without everything else looking grey or blown-out.
2) Those that have little more than an HDR on/off button and leave the brightness/colour fiddling to Windows and/or graphics card drivers.
3) Those that are HDR in name only.

In fairness, there aren’t too many of the latter – hellooo Chess Ultra – but I certainly have a background worry that, until HDR really beds in, we’re going to see a lot of cheeky stuff in which lamps and candles are a teeny bit brighter but the overall scene doesn’t really have any of the added depth or vibrancy that good HDR can bring. I’ve seen plenty of 1) and 2), however, and both have led, in their own ways, to slider-based misery.

Resident Evil 7 HDR (photograph of screen)

Resident Evil 7 HDR (photograph of screen)

Those in the first category, I generally get much better results from once my maudlin task is complete, not least because their settings menus offer reference images with instructions about what part of the picture should be visible or invisible and the like, so I’m not trapped in a frenzy of alt-tabbing. Assassin’s Creed Origins and Resident Evil 7, for instance, have sliders both for peak HDR illumination – i.e. how bright the brightest sources of light can get, which is less to do with personal taste and more to do with how bright your screen can go before the source becomes a wash of white (although ACO’s slider, at least, is clearly geared toward TVs with brightness levels extending way beyond 1000cd/m2, which most HDR monitors simply can’t do) – and for overall scene brightness. The latter behaves similarly to the gamma settings familiar to us from most PC games but does more with how white whites are, and is absolutely crucial to avoiding the washed-out effect that has blighted me so.
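As a toy illustration of what those two kinds of slider are adjusting (my own sketch, emphatically not any game’s real tone-mapping code): the scene brightness slider effectively scales everything relative to SDR reference white, while the peak illumination slider caps how bright highlights can get before they collapse into that wash of white.

```python
# A toy model (not any game's real tone mapper) of the two HDR sliders described above:
# 'paper_white' scales ordinary scene content, 'peak' caps highlight brightness.
def hdr_output_nits(scene_nits, paper_white=200.0, peak=1000.0):
    scaled = scene_nits * (paper_white / 100.0)  # treat 100 nits as SDR reference white
    return min(scaled, peak)                     # crude clamp; real games roll off more gently

# On a 350-nit panel, highlights of very different intensity all end up pinned at the
# panel's peak, which is the washed-together effect described above.
for highlight in (100, 300, 600, 1200):
    out = hdr_output_nits(highlight, paper_white=200, peak=350)
    print(f"{highlight:5d} nits in scene -> {out:6.1f} nits asked of the panel")
```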

In ACO and RE7, I can find a sweeter sweet spot than I can with the Nvidia settings, and additionally they seem to override whatever Windows is doing. The games that do this are doubly good because they’re not dependent on flicking on Win10’s HDR first, but instead automatically enter and leave HDR mode. This has not, however, saved me from having to find per-game settings for seven different sliders – in-game HDR peak illumination, in-game HDR brightness, in-game general brightness, Nvidia brightness, contrast, gamma and colour vibrancy – in order to find something I’m more or less happy with. The effect is nice – the punch of ACO’s big sky, with a sun almost too bright to stare at, and the almost searing light of a lamp or fire amidst RE7’s perma-gloom – but I hesitate to say that it was truly worth the hours I put into it.

Category 2 games I’ve been less happy with the results of, as they require Nvidia’s settings to get rid of the grey, and the only advice here on how the scene should look comes from your own eyes. I’ve found these settings to be a bit of a blunt instrument, tipping the scene too easily from insipid to a sea of black. Hitman and Final Fantasy XV have been my main test beds here, and though I can end up with some lovely lighting effects in the smoke flares and hanging lights of the former’s Marrakesh level, everything else still looks paler than it does in SDR.

Final Fantasy XV HDR (photograph of screen)

FFXV, meanwhile, I can fiddle into offering me strong blue skies and dazzling headlights, but something’s wrong somewhere, as alt-tabbing or turning off the monitor often turns HDR off entirely and leaves the colours unpleasantly desaturated whether you’ve got HDR on or off. Multiple alt-tabs will randomly fix it, but it’s very much a crap shoot rather than a science.

I’d gone into this whole HDR thing in the first place specifically because I wanted to see FFXV’s beautiful, glistening food look as beautiful and glistening as it possibly could, but I’m not sure I can say the quest was entirely worth it. Especially not when compared to the friend who simply hooked his PS4 Pro up to his 4K TV and was immediately rewarded by super-food.

Something else I should mention is that, for any of this to work at all, I need to manually switch the monitor into HDR mode with its own controls (and we all know how horrible monitor buttons are). Otherwise all is pale and grey. Clearly, this will differ from screen to screen, and certainly in telly-land there’s more going on in terms of automatic mode detection and switching, but all told running each new HDR game is currently a gigantic faff. What I’ve tended to use more is the monitor’s ‘emulated HDR’ mode, in which it makes its own best-guess on light, dark and some colours in SDR games and amps ’em up in a way that is less convincing than true HDR but is just a one-button fix. Some HDR games, like FFXV and Hitman, actually look better to me run in SDR with emulated HDR than they do after my hours of tinkering for full HDR.

Needless to say, your mileage is going to vary enormously depending on both your screen and your graphics card. Radeons, I hear, play a fair bit nicer with Windows 10 right now, although you’ll be sacrificing HDR (and 4K) Netflix there unless you have an Intel Kaby Lake or later CPU.

Meanwhile, the screen I’ve been loaned to do this test is a BenQ SW320 30″ 4K IPS monitor. It’s a very lovely screen indeed, designed for photography rather than games and with a price to reflect that, but the colours and viewing angles in SDR games are certainly some of the best I’ve ever seen, and its huge size makes 4K far more worthwhile than it usually is on the desktop. I shall be extremely sad to say goodbye to it, and it has made my general resolve that 4K isn’t really worth it begin to falter. But it is a late-2016 panel, and as such its HDR pre-dates the tech’s move into the TV mainstream mid-way through 2017, even though it does boast the superior (and vital) HDR10 standard.

Assassin’s Creed Origins HDR (photograph of screen)

I have had some nice results in HDR – the blinding sun in ACO and the forceful glare of RE7’s infrequent light sources particularly stand out as giving me a startling sense of depth and contrast – but the problem is its relatively commonplace brightness of 350 nits. That is more than you need for any SDR use unless you’re basically playing outdoors, but, by contrast, truly decent HDR tellies start at 1000 nits, go as high as 2000, and 500 is considered the bare minimum. There are good reasons for that, in that sitting ten feet away from a very bright telly is very different to sitting 30 centimetres from an equally bright monitor. Even at merely 350, sitting with my face stuffed into the SW320 when it’s properly HDRing or its SDR brightness is cranked up makes my eyeballs throb unpleasantly. Still, more consistent settings, in the OS and the games, will help to sort this stuff out in time, as will newer monitors with more fulsome HDR settings of their own (it’s simply on or off on this one).
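Those nit figures map onto the HDR signal in an interesting way. Using the PQ constants from earlier (again my own arithmetic, not the article’s), you can work out how far up the HDR10 signal range a panel’s peak brightness actually reaches:

```python
# How far up the HDR10 (PQ) signal range a given peak brightness reaches.
# Constants are from SMPTE ST 2084; the nit figures are the ones mentioned above.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def nits_to_pq(nits):
    """Inverse of the PQ curve: absolute luminance -> normalised code value (0..1)."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for peak in (100, 350, 1000, 2000):
    print(f"{peak:5d}-nit panel tops out at PQ code {nits_to_pq(peak):.2f} of 1.00")
```

A 350-nit panel, in other words, can only ever show roughly the bottom two-thirds of what the signal is allowed to describe, which goes a long way towards explaining the clipped or washed-together highlights.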

Unfortunately, monitor-land is currently denied the best of the best, which is to say the OLED panels used in higher-end TVs, on which HDR games and films offer an almost painted-on effect that has PS4 Pro and Xbone users in raptures. There are more HDR monitors coming out month by month – we’ve tried and so far failed to get others in for review, apart from the BenQ EL2870U, which Katharine reviewed and which is currently the only 4K candidate on our best gaming monitor list for that very reason – but I will be revisiting this subject once I can. Right now, though, my feeling is that all our powder should be kept dry, because the situation’s still evolving and you could very easily regret spending £700+ now when you could have had something better in a year’s time.

This is even before we get into stuff like how choosing between HDR and framerates higher than 60fps can be forced on you by cable bandwidth, or how different and upcoming HDMI and DisplayPort versions change things again, and… Look, we’ll get there, OK? HDR is nice, but on PC, right now, it just isn’t showtime yet. If you’re not unhappy with how games look on your monitor right now, hold fast until later in the year and let’s see where we are then.

HDR for PC games in 2018: Is it worth it? | Rock Paper Shotgun


The problem with the current "HDR" standard is that it's not -really- a standard. In the words of digit: https://dgit.com/4k-hdr-guide-45905/

It's not much of a standard if the manufacturers themselves determine whether they're within the spec, and then it's only technically part of a spec. Heck, there's a lot of people out there using garbage 6+2 bit TN panels (because it has 240hz refresh rate blah blah blah it looks like garbage but those hertz yadda yadda yadda you suck at 60hz blather blather blather), and those, under the current HDR spec, could be labeled as HDR if they had a strong enough backlight. HDR10 solves that problem by requiring 10 bit color and 1000 nits (something I'm still a little ticked off at LG about when it comes to my monitor but meh), but really sit someone down in front of a good quality IPS 8+2 or 10 bit display and compare it to a 6+2 or 8 bit TN or VA panel, and there's no contest.
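For a rough sense of the gap those bit depths represent (my arithmetic, not the poster's): each extra bit doubles the shades per channel, and the total colour count goes up eightfold.

```python
# Shades per channel and total addressable colours for the panel bit depths mentioned
# above (native depth only, before any FRC dithering like "6+2" or "8+2").
for bits, label in [(6, "6-bit TN"),
                    (8, "8-bit"),
                    (10, "10-bit (what HDR10 expects)")]:
    shades = 2 ** bits
    colours = shades ** 3          # three channels: R, G, B
    print(f"{label:28s} {shades:5d} shades/channel  {colours:>13,} colours")
```

Going from a native 6-bit panel to a true 10-bit one is a factor of about 4,000 in addressable colours, before dithering tricks enter the picture.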

And I can say this as someone who went from an 8 bit IPS display to an 8+2 bit IPS display, and was floored by the difference even in the wallpaper.

Also, I forgot to add that with VA, and ESPECIALLY with TN, there is a color shift if you're even slightly off angle. With TN panels it's insane, and with Samsung it's a bloody nightmare (I speak from experience); it's just the limitations of the cheap panels, whereas with IPS displays, even if you view them at severe angles, there's no color shift. You might lose a slight bit of brightness, but especially with larger monitors (24" and above) and televisions, the benefits of IPS are spectacular. You never know how much detail you are missing until you go IPS. OLED trumps them all, of course, but it's 4x the price of IPS right now and still has a question mark over its lifespan, not to mention it won't be coming to computer displays for a long time, as desktop content is too static and OLED needs dynamic images or it dies.


This is also why IPS tends to mitigate the brightness flickering a lot of users complain about with FreeSync.  TN and VA panels often can't maintain the same gamma levels across the entire refresh range.  That gamma drift is then brought to the fore when the LFC limit is crossed.

I opt to run my monitor at 8-bit 144 Hz vs 10-bit 60 Hz.  I can't say I noticed much of a difference between 8 and 10 bit at those settings.  I did calibrate my display at 8-bit 144 Hz using displaycal, and I have to say the results are pretty solid.

ajlueke
Grandmaster

I think the post highlights the issues we will continue to see with getting technology like HDR mainstream in gaming, especially PC gaming.  Thanks to G-Sync and FreeSync, monitors supporting a wide range of refresh rates have become the norm, and rightfully so.  Image tearing is pretty awful, as was the input lag associated with enabling VSync.

The problem now is that most displays have difficulty keeping a consistent image quality across all the refresh rates they support, which can be mitigated to a large extent by calibration – something most end users simply aren't going to do.  Now asking a display to support even more color data and then still reproduce that accurately across different refresh rates is beyond what most display technologies can deliver.  Not only that, but the bandwidth available in many of the link cables simply isn't up to the task.

You want HDR10 certified quality and a 40-144 Hz FreeSync range?  Probably not going to get that.  The user has to decide what is more important: gaming at much higher frame rates, or more modest frame rate caps with full HDR10 reproduction.  HDR in movies and consoles is much easier to execute.  Movies are always shot at 24fps (sometimes 48 with digital cameras), so the display only has to reproduce the enhanced color at a fixed, and relatively low, frame rate.  Console games as well can be set up by the developer to maintain a 60fps standard frame rate.  This again allows for a fixed and relatively low frame rate.  But for the PC gamer, HDR10 needs to work whether the user is gaming at 60Hz or 144Hz on their display, variable refresh or not, 1080p or 4K.  That is probably just too much to ask right now.
