I've owned a 5700 XT Red Devil for over a year now, and it's one of the cleanest image-producing GPUs I've ever owned. It makes my monitor a joy to use. Now, with the new 6800 XT I purchased, there is a clear difference in image quality, and I can't understand what is causing it or why it exists. I've mulled over several possibilities. My monitor is HDMI 2.0 and the card outputs 2.1, so maybe the monitor simply isn't compliant? But then I watch many videos on YouTube comparing the 5700 XT and 6000-series cards, and it's plain as day that there is a clarity difference between them. Assuming, of course, that the person making the videos didn't oversharpen one and not the others (RIS is clearly not something Nvidia has to offer), the 5700 XT and a 6800 or 6800 XT should look the same, right?
Has anyone else who owns both cards noticed this? I'm going crazy swapping these back and forth, but whenever I put my 5700 XT back in, it's like my eyes just relax and everything feels so much better.
It's almost like AMD is doing an FSR-style upscale with these new cards and not telling anyone (like not running at full 1440p). I'm super sensitive to image clarity, both in motion and static, so I know INSTANTLY what an upscale looks like.
A video comparing GPU differences:
Focus on the 5700 XT, then the others, and you should notice obvious differences in image quality. Pausing it helps too.
Things I've tried:
I've installed and uninstalled the drivers every time, in the correct order.
I've tried new cables. I've tried DisplayPort as well and still notice the difference; it's much more apparent over HDMI, however.
The only thing left I could update is my motherboard BIOS, but the update notes mention nothing about graphics compatibility.
System specs:
MSI B450 Mortar titanium
AMD 5600x
16 GB Gskill RAM
MSI 6800 xt trio
Powercolor 5700 xt red devil
Corsair 750 watt PSU
An assortment of SSDs (M.2 NVMe and SATA), which shouldn't really factor into anything.