I used an HD 6950 for 4 years, default color settings, hooked to a monitor (Samsung BX2450) and a TV (Sharp Quattron LC40LE810UN), both calibrated to perfection (from a personal perspective). After upgrading to a GTX 980, I was quite disappointed to find out how dull the colors were, even on Full RGB. Even after calibrating the colors through the control panel and the monitor/TV themselves, I can't seem to get them looking quite as good as they did on my old AMD card. On top of this, there's apparent banding and blotchiness on backgrounds. It almost felt like a $600 downgrade.
I thought it was just me, that my eyes were deceiving me or I got a faulty GPU, but after a bit of research it seems to be a common complaint with Nvidia cards. The difference might be very subtle to some and escape others entirely, but if you place examples from AMD and Nvidia side by side it becomes inescapably apparent. Here's a site I found with a few side-by-side in-game comparisons (290X and Titan): http://pokde.net/pokde-blog/nvidia-gtx-titan-vs-amd-r9-290x-image-quality-comparison/#prettyPhoto
So what exactly makes AMD cards better in this department? Do they use some form of dynamic contrast, higher saturation, improved blacks, etc. by default to improve the color output?
I don't know the reason, but this has been my experience as well. As you say, some may not even notice it. In photography terms, it's like comparing Kodachrome (AMD) to Fujifilm (Nvidia).
I purchased 290Xs, ended up wanting a little more power, so I then purchased 970s. Right away my wife noticed a picture difference on her machine. After a while the washed-out look bothered us, so we returned the 970s and picked up 390Xs.
I researched why extensively. Almost everyone will say it's because AMD turns on all the goodies on the card for you and Nvidia does not: Full vs. Limited RGB, turning up the color, sharpness, etc. This is not true. The only setting AMD turns on/up for you is a higher color bit depth.
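To be clear on what Full vs. Limited RGB actually means, here's a rough sketch of the math (my own illustration, nothing taken from either driver). Limited ("video") range squeezes the 0-255 signal into 16-235, so if your monitor expects full range, black arrives as dark gray and white arrives slightly dim:

    # Rough illustration of Full vs Limited RGB range (not driver code).
    def full_to_limited(x):
        """Squeeze a full-range value (0-255) into limited range (16-235)."""
        return round(16 + x * 219 / 255)

    print(full_to_limited(0))    # 16  -> pure black is sent as dark gray
    print(full_to_limited(255))  # 235 -> pure white is sent slightly dim
    print(full_to_limited(128))  # 126 -> mid-gray barely moves

That mismatch alone accounts for a lot of the "washed out" look people describe in this thread.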
Nvidia uses a lot of compression techniques. If you notice, on their newer cards they reserve half a gig, leaving you 3.5 GB. This is supposedly reserved so they can store compressed textures and colors. After all that compression, the end result is a washed-out/blurry look.
From my understanding it's not as noticeable on lower-end displays. I'm still baffled that this is not more widely known in the industry. These days it's all about the frames.
I have noticed something similar when I watch a game on a friend's computer: Nvidia has a softer image (AMD has a harder, sharper image).
And it seems in your case the way AMD's display engine handles colors is superior to Nvidia's.
Fun fact: AMD used to make chipsets for TVs and set-top boxes, and its UVD is a direct descendant of the Avivo tech in those chipsets.
I had the same experience as well. After I sold my HD 6970 and got a GTX 680, my image colors got washed out and blurry, but after I sold the GTX 680 and got an R9 290 the colors got better and the image was not so blurry. The AMD image quality is much better than Nvidia's.
Yup. I upgraded from a 5770 to a 560 Ti, but found the color quality to be completely unacceptable, ESPECIALLY while playing videos. I finally figured out how to enable the full color range (registry hack), but still could not get my monitor color to be correct.
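For anyone on older drivers without a range toggle in the control panel, the hack was a registry edit. Here's a rough sketch of the idea in Python; the key path and value name below are placeholders (the real ones live under your display adapter's key and vary by driver version), so check a current guide before touching anything:

    # Sketch only (Windows). KEY_PATH and the value name are PLACEHOLDERS;
    # the real location depends on your GPU and driver version.
    import winreg

    KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Video\{YOUR-GPU-GUID}\0000"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
        # Hypothetical value name; 1 = full-range RGB in the guides I followed.
        winreg.SetValueEx(key, "FullRangeDefault", 0, winreg.REG_DWORD, 1)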
I enjoyed the smooth gaming experience and really enjoyed the rocket demo, but I just could not get accustomed to the colors or the way it handled dual monitors. At the same time, in game, I started realizing that I couldn't see things at a distance anywhere near as well as I could on my 5770. I swapped to a 7870 XT very quickly after that (it's now in my HTPC) and later upgraded to an R9 290.
Having seen the output on a 780 and a 980 Ti, I can very much tell you the situation has not changed at all. Some online forums have discovered that you can get close to the AMD level of quality by maxing out the Nvidia control panel quality settings, but you take a good 10% performance hit in doing so.
Nvidia cards are good, but the image quality is low... and it will be like this forever, because they couldn't care less about it. :)
The half gig is not really for that. They just messed up their hardware design and lied about it to sell more cards. They wouldn't have to separate that VRAM from the main pool for that kind of purpose.
Sorry for reviving an old post; I just googled my problem and this thread was first in the results. I'm messing around right now with a new GTX 1070, trying to set up my desktop colors, and cannot achieve pleasing results. With my previous AMD cards I didn't need to touch a single color setting out of the box; everything looked perfect. The difference in color reproduction is most significant on IPS/VA panels. Even though I'm using a TN panel right now (BenQ X2430T), it is one of the "better" TN panels and I clearly see the difference. The image on AMD cards is just more pleasing to the eyes. I'm considering replacing the 1070 with an RX 480 right now because of this.
I have found the solution, and since I got here searching for this issue, I thought it's better to leave it here as well.
You need to set the output dynamic range to "Full", not "Limited". When I changed from an R9 270X to a GTX 1060, I was extremely disappointed by the difference... the colors were paler, the color palette was limited, and there was no contrast at all. I tried increasing the contrast from the Nvidia Control Panel, but it was just a forced contrast.
So, here is the solution:
Go to Nvidia Control Panel - Display - Change resolution, select "Use NVIDIA color settings", and change "Output dynamic range" to "Full".
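If you're wondering why Limited range also causes the banding mentioned at the top of the thread: squeezing 256 shades into 220 means some neighboring shades collapse into the same output value, and when the display stretches them back out you get visible steps in gradients. A quick sanity check of the arithmetic (again my own illustration, nothing driver-specific):

    # Round-tripping full -> limited -> full loses shades, hence banding.
    def full_to_limited(x):
        return round(16 + x * 219 / 255)

    def limited_to_full(y):
        return round((y - 16) * 255 / 219)

    survivors = {limited_to_full(full_to_limited(x)) for x in range(256)}
    print(len(survivors))  # 220 distinct shades instead of 256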