I just got a 6950 XT, coming from Nvidia (I had a 2080 Ti), and the performance uplift is HUGE. That being said, even with the massively increased performance, I am extremely angry and agitated at AMD's incompetence.
Old and some other games look like absolute garbage. If a game doesn't natively support anisotropic filtering, ambient occlusion, texture filtering, etc., and it's not DX9 (even DX9 games barely seem to work with these settings forced in the control panel; I swear they do nothing at all), there's nothing you can do about it: you have to play a disgusting-looking, sad excuse for a game. On Nvidia you could force AF, AO (AMD doesn't even have that option), LOD bias, texture filtering, etc., and on DX9, 10, 11 and 12 make any game look amazing.
Is this a joke? Is AMD trolling? I finally give them a chance and I can't even force AO in a game to make it not look like dog dirt, while Nvidia's been doing it for a decade. It's almost like it's on purpose: extremely low power draw compared to Nvidia's top contender with around the same performance, half the price, yet it doesn't come with the basic functionality and driver freedom/customization Nvidia has. I knew it was too good to be true.
Simply put an Nvidia GPU in and turn on and compare said settings, and you will immediately see the stark contrast. It's simple: you won't have to go off anyone's word, and you can see for yourself.
We simply have different views as well. If I'm paying a premium, $1000+, for a so-called "direct competitor" GPU, I expect it to have ALL of the features of the one it's competing against, not half-baked dog **bleep**e. If I want to force HBAO in a game so it doesn't look like a miniature set made out of plastic and cardboard, I should have the ability. Even when a game does natively support these things (not many do), Nvidia's forced settings usually look way better than most in-game settings, and I've seen many others say the same.
Although I do notice that forcing AF in both the Nvidia driver and the game gives you almost 32x AF, because I can tell the difference when turning one of the options off.
Anyways, just put an Nvidia card in and test these settings and you will understand.
Nvidia's drivers and features are all 50-year-old AMD stuff.
Try typing into the registry, or turning on, TRUELIGHT, TRUEAPERTURE, TRUEPHOTO, TRUEDSLR, TRUECINEMATIC and TRUERENDER; if it doesn't work, you may need to put "mode" on the end. You see, the OS is fake, and you have to make the AMD display driver and functions separate, modular and discrete from the operating system, or you can't use the function; if it does work, it pretends to, at like trillions of trillions of times worse.
Also, because Nvidia steals, copies or borrows AMD's code when it's 50 years older and open source (see GPUOpen.com, as linked to by the AMD website), they disable as much of AMD's drivers and software as they can and make it ridiculously hard to turn on or enable, or else you'd notice they don't have the hardware and that Intel and Nvidia aren't a computer. For example, Nvidia's 10-bit isn't 10-bit; it's all fake. AMD has true 10-bit, like DSLR RAW.

It's to do with bit depth and computer language specifically designed for computers; you can't have a calculator or a computer without it, really. It's called ASCII. Nvidia and Intel use like 8-bit ASCII, 0-255, when in the '60s there were thousands of them, today probably millions. They're a unique code for assembly and compilers for specific tasks, instructions or functions. So if you take a regular calculator, it's got all these buttons above the numbers for maths and functions and equations and graphs or whatever, like pi or sine and cosine or squared. Nvidia and Intel use the cheapest of cheap not-quite-a-calculator and run the same code as microwave ovens.

10-bit is 1024, not 255, and my 256-bit DDR4 RAM bus bandwidth can't even be used, let alone my thousands-of-bits-of-depth AMD 5700 XT. They went from 8-bit to doubling it and pretend it's 16-bit; instead of each bit multiplying the possibilities to the power of the number of bits, Intel and Nvidia just take like 8-bit x2 and pretend it's 16-bit, but it IS NOT! So if your AMD CPU has heaps of cool physics and instruction sets and true light or whatever built into it, it needs way more complex maths than a regular or even a scientific calculator uses; it needs thousands of ASCII and bit depth and numbers. If your computer and graphics card can't do computer language, you know it's fake! To compute is to use a tool to perform maths!
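To be fair, the one checkable claim in the bit-depth talk is the level count: an n-bit channel encodes 2^n distinct levels, so 8-bit gives 256 values (0-255) and 10-bit gives 1024 (0-1023), and 16-bit is 2^16, not 8-bit doubled. A quick sanity check in Python:

```python
# Number of representable levels per colour channel at a given bit depth.
# Each extra bit doubles the level count: levels = 2 ** bits.
def channel_levels(bits: int) -> int:
    return 2 ** bits

print(channel_levels(8))   # 256 levels, values 0-255
print(channel_levels(10))  # 1024 levels, values 0-1023
print(channel_levels(16))  # 65536, not 256 * 2
```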
It's in the dictionary. When a mathematician puts words like "infinity" on the box and advertises it as being better than a supercomputer, with INFINITY FABRIC and Infinity Cache, advertising laws mean you're a criminal retard if you think a mathematician with a PhD in mathematics would use fake numbers and fake advertising on the box. Just the higher core count and double the bit depth with 4x the bandwidth for the same stick of RAM means AMD is trillions of trillions of (I can't even begin to form words for the numbers) times better, and it literally, truly has infinity for anything that fits into the cache. You can cache the CPU and bus hardware communications, make them waveform, compress the true-light sine waves for visual and audio, massively improve quality with additive light, and make your system's bandwidth skyrocket!

You can game at trillions of times higher quality on an AMD mobile phone from years ago. The device must support FreeSync, and if it's from within the last 20 years or so, it will have 10-bit HDR and Dolby Vision. I got a cheap LG V60 off eBay; they were 400 retail new 4 years or so ago, and you can get one for about 200 USD online or cheap refurbished. You will ray trace on a battery-powered AMD mobile phone all damned day while real-time rendering glassy glass and wet glassy water and candles that smoke, with ATMOSPHERE, like air in the air. The reason that AMD doesn't give a **bleep** is that their hardware device is a hardware device, like a toaster oven or a Blu-ray disc player: you put the file in and the hardware spits out crispy, toasty graphics and sound. Seriously, DirectTransport or DirectTransfer exclusive mode: copy that data into your device in the correct formatting and be amazed! It works, for the first time in history, hopefully.
So yeah, anything and everything Nvidia's software does is old AMD stuff, PlayStation 2 era at the most recent, or pre-1960s; I really don't want to have to say how old some of this stuff is. Pythagoras and Newton are like 2000 BC and the 16th century, and Nvidia literally can't. AMD can!!!

For your anti-aliasing and image quality on AMD, set TRUETEXTURES, TRUEMATERIALSX, and use IMAGEQUALITY2 and TEXTUREFILTERINGQUALITYHIGH. Use things like 8xEQSUPERSAMPLING and 536576tesseractionx64, or tessellation, whichever way you want to word it; worldspacereservoir is a good one too, use maybe 256 worldspacereservoir. You will also want to do 536576compressonator, and copy and paste compressonator, smartaccessmemory, infinitycache and infinityfabric several times each, both in a text file named config.ini and in the registry. Well, just doing that sort of stuff, you might begin to gain a clue.
AMD had all this pre-DirectX 9; you can see aperture is a DirectX 9 function. The PS2 used ray tracing for things like light shafts and god rays (ray marching is more intense); nowadays it's used by mobile phones for 3D colour-matching those 3D face-swap filters over the webcam feed, so you can have 3D cat ears looking into the room, or VR VTuber stuff. Ray tracing is 1960s stuff. AMD has done the DSLR, Photoshop and printing stuff since magazines existed. All colour is light; analog TVs used a lot of light. Figure it out.

AMD is way older than Intel and Nvidia, and technically existed as different companies, which suffered attacks from criminals and legal issues and for other reasons shut down, like BELL computers and old-school IT tech firms; from IBM and many others, an all-star nerd dream team formed to save the planet from criminal retards. See, people heard computer boards are lined with gold and they're worth millions and the size of a house, because supercomputers used those vacuum tubes with superconductor inside them, USD 100k per gram at today's prices. A luxury house used to cost 20k, and you could buy cars for a few hundred, so of course dumbos were going insane thinking they could put a million dollars under their arms and run away with it, not realising the expensive stuff was as big as a house, in part so you couldn't shove it down your pants, and so they could follow it by plane or satellite. You can make hoverboards with the superconductor stuff; it defies normal physics, which is why it's super, and it needs mountains of ores mined for a tiny bit. It's the unobtainium **bleep** from the blue-people Avatar movie. People even murdered countless people for TVs and the database DUMMY TERMINALS at the post offices and banks and other places, because they heard it's a computer, or it's got computer parts in it, thinking it's worth millions.
But the truth is the news and media were hyping it up, hoping to gain interest in learning coding and studying computers and to help demand in a growing industry, as people could do cool Photoshop printing stuff and other things, save the planet's trees from becoming paper, and all kinds of neat stuff like satellites and live broadcasts. So AMD made the best, most expensive supercomputers without needing liquid-nitrogen cooling and superconductors; they did supercomputers at room temperature and gave them all the best free software, which criminal retards all shoved up their rectums and let a tiny bit leak back out to us every decade or two. So yeah, enjoy your overpriced bull**bleep** with more made-up fake software for you to toggle various mediocrity.
@thanatoast , personally I would say anti-aliasing matters a lot; aliasing is especially bothersome in modern-ish games that do not have an AA option. For example, Unreal Tournament 3 looks much better and clearer with forced AA.
@B00mBOXX , I think there might be some graphical bugs in the RX 6000 drivers then, because 70% of games have clearer textures to me on Radeon. But again, I have the 3GB GTX 1060, and I have read Nvidia has used texture compression techniques in the past.
Although I do agree that Nvidia's contrast looks better in games which use a lot of self-shadows.
Furthermore, have you tried Virtual Super Resolution? It really looks quite a bit crisper than Nvidia's DSR while still maintaining very smooth edges. You can get driver-forced supersampling working in DirectX 9 games using the methods I described earlier in the post, but it is very heavy on CPU single-thread performance, like the QHD option in The Witcher 2 was/is.
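For anyone unsure what supersampling like VSR/DSR actually does: the game renders at a higher internal resolution, and the driver then averages blocks of sub-pixels down to the display resolution, which smooths jagged edges. A toy Python sketch of the idea (simple box filter on made-up grayscale pixel values, not the actual driver algorithm):

```python
# Toy supersampling illustration: render at 2x per axis, then average
# each 2x2 block of sub-pixels down to one display pixel (box filter).
def downsample_2x(img):
    h, w = len(img), len(img[0])
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
             for x in range(w // 2)]
            for y in range(h // 2)]

# A hard black/white edge at the 2x internal resolution...
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]

# ...averages to intermediate values at display resolution, so the edge
# comes out anti-aliased instead of jagged.
print(downsample_2x(hi_res))  # [[0.0, 255.0], [127.5, 255.0]]
```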
Remember, when trying in-driver anti-aliasing, first choose "override in-game settings" for the anti-aliasing method. Then I recommend not using more than 4x EQ and trying adaptive multisampling, but if you have a beast CPU, you can choose supersampling.
Also remember to disable fullscreen optimizations and to disable in-game anti-aliasing.
Personally, I would not force ambient occlusion if a game does not have its own option.
Lastly, it is definitely not guaranteed that AMD will fix these issues. I do not think it is due to laziness, but rather negligence and orders from the top. I suspect the developers are not allowed a lot of freedom to work on bugs, and management thinks it knows best, which is why I have an urge to try modded drivers at some point.
@hitbm47 Yeah, I don't think it's all due to the lack of settings, though that is a major factor. I immediately noticed Destiny 2 looked like absolute ass, for example; it was the first game I played on this new GPU. It has worse ambient occlusion, and on Nvidia you could force HBAO+, but the textures, sharpness and clarity also just looked much worse and fuzzier.
I'm sure it's a mixture of driver bugs and lacking forced options in the control panel, but even Nvidia on default control panel settings looked better in Destiny 2, while also having the render scale at 100%, when I had to put it at 120% to make it look comparable on AMD. Which lends credibility to it maybe being some driver bug? It's like the LOD/clarity/sharpness, even up close, is far inferior in some games, and in others it looks better than it did on Nvidia, lol.
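To put rough numbers on that 100% vs 120% gap (assuming the render-scale slider scales each axis, which is common but game-specific), 120% at 1440p is a substantially higher internal pixel count:

```python
# Internal render resolution for a given display resolution and render
# scale, assuming the slider scales each axis (common, but game-specific).
def internal_res(width, height, scale_pct):
    s = scale_pct / 100
    return round(width * s), round(height * s)

print(internal_res(2560, 1440, 100))  # (2560, 1440)
print(internal_res(2560, 1440, 120))  # (3072, 1728), i.e. 44% more pixels
```

So needing 120% on one vendor to match 100% on the other is not a subtle difference; it is nearly half again as many shaded pixels per frame.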
@B00mBOXX , I understand, and it sounds like it might be something not rendering correctly. This has happened to me twice before, but thankfully I was able to convince AMD to fix the shadow issue in RAGE 1, and a lot of people complained about brightly lit-up rooms in Wolfenstein II, which AMD also thankfully fixed that time.
So maybe let us find a game some of us have in common with you, then we can compare screenshots and report it to AMD. Unfortunately, I do not have Destiny or APEX installed.
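If we do get matching screenshots, an objective per-pixel difference beats eyeballing JPGs. A minimal sketch of the idea on raw grayscale pixel arrays (for real screenshots you would first decode the image files into arrays with an imaging library; the sample pixel values here are made up):

```python
# Mean absolute per-pixel difference between two same-sized grayscale
# captures; a higher score means the two renders diverge more.
def mean_abs_diff(a, b):
    total = count = 0
    for row_a, row_b in zip(a, b):
        for pa, pb in zip(row_a, row_b):
            total += abs(pa - pb)
            count += 1
    return total / count

shot_radeon = [[10, 200], [30, 40]]   # stand-in pixel data
shot_geforce = [[12, 198], [30, 44]]
print(mean_abs_diff(shot_radeon, shot_geforce))  # 2.0
```

A comparison like this only means something if both shots are the same scene, same resolution, and uncompressed (PNG, not JPG), which is exactly why the 100% scale request matters.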
@hitbm47 Yeah, we could try that. Also, now that I'm looking, even in Wallpaper Engine the wallpapers look blurrier and less detailed in general as well. It's hard to determine if this is a texture issue or a resolution issue. Outside of 3D-rendered scenarios it seems to have decent clarity and sharpness. I found a few threads of people saying similar things.
Exactly as I described, like the resolution is worse? Hence why 120% in Destiny 2 seems to be on par with 100% on Nvidia. Hard to tell if it's the resolution or the textures on models, though. But the whole image seems blurrier as a whole in some games, like some kind of downscaling.
(P.S. It seems like this website suffers as badly as AMD's drivers; it keeps saying I'm post flooding, that I've posted X amount of messages in X amount of time, and it won't allow me to post for X amount of time. It happens randomly, and I have to come back like 1-8 hours later before I can post again. That's why it took me like 2 days to reply after I sent those pics.)
@B00mBOXX, yeah, that's not normal. I can already tell from the first link you referenced that it seems to be applying Radeon Super Resolution from a lower resolution in those Hunt: Showdown images. I assume you know it is an in-driver forced form of FidelityFX Super Resolution.
Therefore, I would start by playing with those settings, force RSR off, etc., and try to force native resolutions. That definitely does not look like downsampling, but rather upscaling with side effects, like DLSS 1.
Ok I do not mean to post violent pictures (so sensitive viewers, please look away/past), but this is the best example I could find to show you how the resolution, etc. should look. The game is theHunter Call of the Wild:
@hitbm47 I even increased the one on AMD by like 2% to show you just how bad the baseline is. Zoom in on the chest/bullet-pouch and scarf area in general and you will see how blurry it is on AMD, even at 115%. I'll get some more pics later. Now, Nvidia is using HBAO+, but that wouldn't make the textures look that much worse.

Also, in motion, AMD looks even more like dog **bleep**e: shimmering and noise everywhere, like the "downscaling combined with upscaling" we talked about before, and more aliasing in general. Again, the shimmering is more noticeable on transparent textures. There is a setting in the Nvidia Control Panel for transparency AA, and I always set it to 8x supersampling. I believe it only works with MSAA, though, which this game doesn't use, so that shouldn't have an effect. Also, SSAO causes some textures to look grainy, and I can't force HBAO+, so that is probably contributing to some of the graininess, but not to the extreme lack of sharpness on textures in general, or the aliasing on edges and transparent areas.
@hitbm47 Keep in mind, it looks even worse in person. This is 1440p native resolution with 140% on the scale. Everything looks like a muddy, low-res mess and just blends together. Objects in the scene just seem to blend in because of the aliasing/low res/low-res textures, etc. It looks like garbage, lmao. On Nvidia I didn't even feel the need to up the resolution scale: good clarity, and textures didn't look like Play-Doh. I'll probably put my Nvidia card in later and compare more, but even with no comparison in this example, it's clear as day.
I'd love to see more comparisons; that Destiny 2 one is INTERESTING for sure.
@thanatoast Just replicating pics I already had from Nvidia with AMD; haven't switched the cards out again yet.
Zoom in on the fine details, especially the green area, and even look at the writing that says "Veist". Nvidia just has much better clarity, with no fuzziness/blurriness from lower resolution, lol. And that's not counting the better shading from forced HBAO+. The smaller pictures don't even do the difference justice.
Like I said, I'll get more comparison pictures of an entire scene later; it's much worse on distant objects, combined with a seemingly way worse LOD bias on textures and shimmering/aliasing/noise everywhere.
Hey @B00mBOXX , ok, but please post 100% (1:1) native-resolution shots, because otherwise they cannot be compared properly. Have you tried force-disabling RSR? Remember, my RX 480 does not support RSR, which is why I do not seem to be affected by this issue. You can also use the AMD bug reporting tool in the Radeon settings to report the issue to AMD.
I went through the effort of downloading APEX Legends again, and the game has improved a lot since I played it shortly after launch. It performs and looks superb on my overclocked i7-870 and stock RX 480, except during the airplane jump, where it dips into the 40s; as soon as I land, I have a stable 75 FPS at 1080p. But it took me 15-20 minutes to find a match.
Here are my screenshots at 1920x1080 mostly high settings and graphical effects medium due to default preset:
Compressed to jpg 1:
Compressed to jpg 2:
Compressed to jpg 3:
EDIT: Unfortunately, I had to compress them to JPG to be able to upload them here, but everything looked in order on my system, the texture detail as well. The screenshots do not quite reflect what it actually looked like on my screen, due to compression, etc. But the anisotropic filtering is working properly on those grids, and the texture is not blurring out.
@hitbm47 I mean, 100% res just looks disgusting, so there's no point, lol. Did you zoom in on the Destiny images and see the obvious blur/fuzziness, almost like a layer lathered over the screen? It looks almost like performance-mode DLSS on Nvidia. In motion it's even worse: aliasing and noise everywhere, almost like zooming out of a high-resolution picture (not digital zoom but true zoom) and the resulting decrease in pixel quality/clarity and increase in aliasing that you get.
How do i "force disable" RSR? It's disabled in my Control Panel.
Hey @B00mBOXX ,
No, I have not zoomed in on the screenshots I took, since that defeats the purpose: they have already been compressed to JPG for uploading to this website. Also, they don't even look the same quality in fullscreen as they do natively on my screen, and zooming in past native resolution will cause upscaling and logically reduce quality due to duplication of pixels.
I did notice the lathering on your screenshots though. Furthermore, I understand you will not be playing at 100% scaling, but we need to do it to properly compare for this test, otherwise our results will be invalid.
I do not experience the increased aliasing while moving around at all, this can be a result of Radeon Boost, or if your game is using temporal anti-aliasing.
Ok, so RSR is disabled then; then just make sure to set it to disabled in your game profile as well, instead of the "AMD Optimized" setting.
Also keep in mind that AMD and Nvidia will render and look different/better on some features. For example, I prefer AMD's HDAO over Nvidia's darker HBAO.
EDIT: Sorry I read over the "Destiny" part, and thought you were referring to APEX Legends.
@hitbm47 I'll try to get some 100% render pics, but if the render scale is higher and it still looks bad, something's wrong. Supersampling tends to look better, never worse.
Ok, nice with the edit, hahaha. I was like, "I didn't tell you to zoom in on your own pics?" Lol. I was like, **bleep**.
Like I said, look at the chest, belt and scarf area, and on the guns look at the barrel, the green area and the writing. Not counting the textures looking bad, etc., everything just looks like it's not running at native res/aspect ratio. It's even worse on distant objects; it looks like Xbox 360 console graphics in some instances. It looks like a scaling/resolution issue, to be exact. Like, go zoom out (not digital zoom, but a true "zoom out", as in decreasing the pixels of the image in question). It looks very similar to that.
Go download Wallpaper Engine, zoom out of a wallpaper in the "Free" screen-fit mode, and compare it with zoomed in.
Why does a screenshot from 2014 look better than my Battlefield at 1440p with 140% res scale? Lmao. Zero noise, and the foreground, background, etc. don't blend into one another like a mangled mess. This is on an AMD card, btw.
Edit: Keep in mind, it seems like a mixture of not running at native res/bad scaling/aspect ratio and actual LOD issues. In some games, textures up close sometimes look kind of decent, but at any sort of distance they look disgusting, which would hint at LOD issues.
This is vastly simplifying it, but I'd say AMD (the entire image overall) looks like the right image and Nvidia looks like the left, not counting the weird texture/LOD/aliasing issues, etc.
Hi @B00mBOXX ,
Excuse me for only replying now. I suppose the second image is the Radeon one? Because I can see a lot of aliasing. But again, this is not helping: you need to post a 100%-resolution-scale image, say at 1920x1080, so that I can take the same screenshot. Then, if we can see my RX 480 looks better, you can submit the comparison images to AMD in the bug report.
Furthermore, another thing you can try to do in the meantime is to do a clean install of only the drivers without the Radeon Settings Application just to test if it resolves the issue and if it is a bug with the Radeon Settings Application.
I understand completely what you are referring to, you do not need to convince me, personally, further. I am convinced it is related to AMD's new upscaling technology and the settings are getting messed up somewhere in their configuration.
I too found VSR to look slightly better than DSR, even with 0% smoothness, and it costs less FPS. But overall, VSR lacks the customisation that DSR has, and the image-quality advantage VSR had was outdone by DLDSR.
I have 2 old, old games that I have played through generations of CPUs and GPUs.
AoE2: The Conquerors and Firaxis' Beyond the Sword.
Both work and look better than ever with Adrenalin Edition 22.5.2 and running on Windows 11.
However, if you try to run or even install these games without first installing .NET Framework 1, they are unlikely to run at all.
I don't remember where I picked up that info, but it worked.
I'm generally more concerned that basic per-application graphics settings are not included in the minimal install, and I have to use the full install for that and get bloated and spammed with AMD Link, performance metrics, recording and streaming, and other BS I don't need and don't want.
They should seriously just integrate per-app graphics settings into the minimal installation.
I agree that the AMD driver is garbage. However, I'm not a fan of driver-handled enhancers. ReShade, SweetFX or game-specific injection are better than driver-handled options.
Bull**bleep**. The AMD driver simply copies files to the hardware and applies values to them for the hardware.
The AMD driver lets you type in values like trillions of times Nvidia's ultra quality, and it just works.
The ReShade and SweetFX bull**bleep** trash you're referring to simply uses 8-bit garbage in a single overlay filter thing; it lowers performance greatly. The AMD driver runs with about the same performance loss, but it's altering the quality to be trillions of times higher when you tell it to, by modifying the air and atmosphere and water and lighting and things, with 1024 10-bit values for the pixel display and like 64-bit RGB values or higher, not compressed 4:2:0 YCbCr garbage. I'm talking uncompressed DSLR 10-bit, better-than-log photography-quality gaming in real time. Take your ReShades and shove them up your Nvidia ports.
This reviewer compared AMD vs Nvidia image quality and thinks AMD was better: https://www.youtube.com/watch?v=R1IGWsllYEo
I have not had issues with AMD's graphics drivers with my 5700 XT, nor the R9 390 before that. If you have the Radeon software installed, you can set things game by game or globally.
Gaming -> Global graphics -> Graphics -> Advanced
To compare graphics, you need to look at an identical scene, not just the same game, as every game has better and worse textures all around.