This is only a partial solution, though an absolutely required step. You'll likely still notice that nVidia's handling of color, in general, is inferior (or, perhaps, just different). This owes to ATi's strong multimedia heritage, which AMD inherited, and its focus on accurate color representation.
It is nice to see that this is no longer a registry hack, though. Even with this set, I can tell the difference in video playback and, especially, older games.
NVIDIA uses a lot of compression techniques. If you look at their newer cards, they reserve half a gig, leaving you 3.5 GB. That space is reserved so they can store compressed textures and colors. After all that compression, the end result is a washed-out, blurry look.
From my understanding, it's not as noticeable on lower-end displays. I'm still baffled that this isn't more widely known in the industry. These days it's all about the frames.
I noticed this way back in '02, believe it or not. For some reason, my nVidia card at the time (I had ATi, 3dfx, and nVidia cards all back then) displayed markedly less color definition, and there's one color pairing in particular I still remember: the nVidia GPU displayed an orange and red that were pretty much indistinguishable, whereas both the 3dfx (V3/V5.5k) and ATi GPUs (can't recall which...;)) at the time displayed a clear difference between orange and red, running the exact same monitors, settings, computer, software, and games. No internal fiddling with the nVidia GPU (TNT/TNT2, IIRC) corrected it. I ran a lot of experiments like that back in those days, as I've always been far more interested in image quality than in frame rates (as long as I can play smoothly and without stutter, I almost don't care about frame rate). So I dropped nVidia at that time and never bought another, sticking with 3dfx and ATi; shortly thereafter, when 3dfx was absorbed by nVidia, I went exclusively to ATi and reluctantly dropped 3dfx out of necessity. nVidia did *nothing* with 3dfx except kill the brand altogether, which I thought was a real shame. Not much imagination from JHH, I always thought.
You make an excellent point about nVidia and its use of compression, but nVidia also "optimizes" everything, even today, for frame-rate performance above every other consideration. I've seen the same "washed out" effect on other systems, so I know what you mean. nVidia is a benchmark company, you might say, and hinges its entire marketing effort on frame-rate benchmarks in 3D games. It's been my experience with nVidia since '02 that the company cares far less about image quality than it does about frame rates, and I think that is what lies behind this phenomenon. For example, the company fought against FSAA publicly for a couple of *years* post-R300 from ATi (the ArtX design ATi bought that changed everything about 3D gaming, imo). Anyway, somewhere around '04-'05 nVidia marketed its first competitor to the R300 and its ATi descendants. By this time nVidia had magically decided that FSAA was in fact *the* way to go, as they finally had a product capable of doing FSAA at > 3 fps...;) Anyway, long story short: I've been with ATi/AMD since '02 & the R300, and at no time since has nVidia given me reason enough to pick up another nVidia card. But that's just me...;)
I don't know whether nVidia has fixed all that stuff relating to IQ by *now*, midway through 2018, but I would suspect the company has fixed most of it. Reviewers tend to say these days that they cannot discern the difference between nVidia and AMD GPUs in terms of image quality--but I also remember that they said much the same thing way back then when I could see a stark difference...;) So I remain skeptical about nVidia. I guess there is a type of GPU customer who is far less discerning about IQ--color fidelity and display quality, etc.--than he is about frame-rate performance. But, as I say, that is not me, and never will be...
Great thread, Zenith, btw. And I've enjoyed everyone's responses...;)
That is NOT the full story. There are two locations in Nvidia Control Panel to set Output Dynamic Range:
1) what you showed, and
2) Video > Adjust video color settings, using "With the NVIDIA settings" > Advanced tab
Usually, people leave "Adjust video color settings" set to let the video player handle it, but the NVIDIA settings there default to limited.
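For anyone unsure what "limited" actually does to the signal: limited range squeezes the nominal 0-255 full-range values into the 16-235 video range, which is where the washed-out look comes from when the display expects full range. A minimal Python sketch of the mapping, purely illustrative (this is not how the driver implements it):

```python
import numpy as np

def full_to_limited(pixels: np.ndarray) -> np.ndarray:
    """Map full-range (0-255) 8-bit values into limited/video range (16-235)."""
    return np.round(16 + pixels.astype(np.float64) * 219.0 / 255.0).astype(np.uint8)

def limited_to_full(pixels: np.ndarray) -> np.ndarray:
    """Expand limited-range (16-235) 8-bit values back to full range (0-255)."""
    expanded = (pixels.astype(np.float64) - 16.0) * 255.0 / 219.0
    return np.round(np.clip(expanded, 0, 255)).astype(np.uint8)

# If the GPU outputs limited range but the display interprets it as full range,
# black is lifted to 16 and white is crushed to 235: the classic washed-out look.
gradient = np.arange(256, dtype=np.uint8)
print(full_to_limited(gradient)[[0, 128, 255]])  # -> [ 16 126 235]
```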
Getting Nvidia's output to do what it is supposed to has been a constant headache since I picked up calibration as a hobbyist/enthusiast. I have an i1Pro 2 and an i1Display Pro (i1D3). Trying to get the output correct has literally been a PITA. You also have MadVR's option to force full or limited dynamic range.
I have primarily been using MadVR and ArgyllCMS-based tools (DisplayCAL and HCFR). MadVR has been invaluable in helping to create an HDR10 3D LUT for my Vizio P50-C1, which I use as my main display (the AOC AIO Android was mainly to have a cheap touchscreen on a display arm near my dominant hand).
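For context on what a 3D LUT actually does, here's a rough sketch, not MadVR's or DisplayCAL's actual code (the function and variable names are mine): the LUT is just an N x N x N cube of corrective RGB values, and each source color is looked up (and, in real tools, interpolated) through it.

```python
import numpy as np

def apply_3d_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply an N x N x N x 3 RGB lookup table to a float image in [0, 1].

    Nearest-neighbor lookup for brevity; real tools use trilinear or
    tetrahedral interpolation between LUT nodes for smooth results.
    """
    n = lut.shape[0]
    idx = np.clip(np.round(image * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Toy example: a 17-point identity LUT (applies no correction).
n = 17
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity_lut = np.stack([r, g, b], axis=-1)

image = np.random.rand(4, 4, 3)
out = apply_3d_lut(image, identity_lut)
print(np.allclose(out, np.round(image * (n - 1)) / (n - 1)))  # True: snapped to LUT grid
```

A calibrated LUT would simply replace the identity cube with measured corrections for the display.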
But I'm considering an AMD card for my next upgrade if the color is more accurate. Plus, my 980 Ti, although decent, is starting to get long in the tooth. But for $315 around 2016, it was worth the money.
My experience is the opposite. I had an HD 4650 before and the colors were pretty okay. Then I got my GTX 580 and the colors were too vibrant. Then I switched back to an HD 6970 for some reason, and the colors weren't as saturated as the GTX 580's. I don't know how people are experiencing the opposite of what I've experienced. But the colors don't feel vibrant or saturated anymore. Maybe it's just because I haven't used AMD in a long time.
My HD 6950 was fine, but not as good as previous ATI/AMD cards.
I noticed the difference every time, even more so after working at a graphics company.
My R9 390 and RX 580 had excellent color.
My GTX 460 had average color, but the 8800 GTX was very good.
Now I have a GTX 1070 and it seems fine (or maybe I don't care anymore),
but my preference will always be with AMD cards, as the color is always better.
The R9 390 was certainly the better choice over the GTX 970;
the problem was that many R9 390s were unreliable (black screen after 3+ years).
Oh, really? Did you have better colors on previous models? I've read that AMD has better color quality because of their media experience or something to do with displays, though I don't remember the details. I had much better colors on my GTX 580 than on my HD 6970. Now I have a GTX 1080, and its colors don't even impress me in the exact same game from back in 2011. The colors were so shiny back then. I wonder what makes the difference. If newer models had better colors, that would be fine, but newer models having worse colors is a problem for me. And not just color; textures also looked a bit clearer and more realistic.
Black screen after 3+ years? Damn. Same as the 5700 XT today.
I'll go with your observation of the 1080. As I mentioned, I've given up caring, as the 1070 was the right price and approx. 25% faster than the 580.
The upcoming Cyberpunk 2077 on a limited budget overrules color preference.
5700 black screen, ouch; my 390 black-screened after three and a bit years.
Yes, I'm checking the card teardowns on YouTube before buying now..
The Sapphire R9 390 wasn't good...