
B00mBOXX
Adept II

AMD has limited driver functionality and customization compared to Nvidia

I just got a 6950 XT, coming from Nvidia where I had a 2080 Ti, and the performance uplift is HUGE. That being said, even with the massively increased performance, I am extremely angered and agitated by AMD's incompetence.

Older games (and some newer ones) look like absolute garbage. If a game doesn't natively support anisotropic filtering, ambient occlusion, texture filtering, etc., and it isn't DX9 (even DX9 games barely seem to work with these settings forced in the control panel; I swear they do nothing at all), there's nothing you can do about it; you have to play a disgusting-looking, sad excuse for a game. On Nvidia you could force AF, AO (AMD doesn't even have that option), LOD bias, texture filtering, etc. on DX9, 10, 11, and 12 and make any game look amazing.

Is this a joke? Is AMD trolling? I finally give them a chance and I can't even force AO in a game to make it not look like dog dirt, while Nvidia has been doing it for a decade. It's almost like it's on purpose: extremely low power draw compared to Nvidia's top contender, around the same performance, half the price, yet it doesn't come with the basic functionality and driver freedom/customization Nvidia has. I knew it was too good to be true.

 

Any way to make these settings work correctly or force AO in games? Honestly contemplating taking this clown card back and buying one with basic driver/game customization. A lot of games look like ass. I don't understand how you don't have these basic functionalities after a decade plus...

...are you serious? i can understand wanting more driver features, for sure, but...you're crying because old games don't look as good as modern games, to the point you have to make a post on a support forum malding over it? are those games now rendered unplayable just because you can't force AO into them via the driver? if you read this forum at all you'll see that radeon drivers have a LOT of problems that genuinely deserve to be pointed out, and that AMD absolutely deserves to be put on blast for...and yet your post still comes off as unreasonable and whiny as hell.

god help you as you get older and look in the mirror, brother.

if this is bait, you got me good

You are a clown. I guarantee my opinion is widely supported. If you overlook fine details/functions and neglect them completely (compared to your major competitor), as I described here, you don't deserve to be "a top brand". Even if it (AA, AF, etc.) seems insignificant to you, which it isn't, you're just a shill moron... it shows they don't really care or put effort into user experience/freedom like Nvidia does. Anyone with a brain or a basic eye for detail will notice how BAD certain games look compared to Nvidia once they switch over. These features are so basic and fundamental; you shouldn't pay a premium for a "top tier", "competitive" GPU and be stuck without them. It's unacceptable.

 

Visuals are one of the major factors that get people into PC gaming in the first place. On top of that, if they'll slack on these minor functionalities, they'll slack on anything... and simply put, consumers like the ability to customize and choose. My points are more than valid.

my thoughts came from the fact that i have never once used any of those features, on either nvidia OR amd. genuinely never thought their use would be *that* widespread even in the worst case i could think of. guess i was wrong. my L to be sure, sorry about that.

the only feature i use that you could *maybe* lump into that group would be image sharpening, which is **bleep**ing great stuff. honestly, there isn't a game where i *don't* use it. credit where it's due, amd implemented this first, and its functionality in practice is better than nvidia's; i've tried both. it's not as simple as "amd bad hurr durr", there's a reason they still exist after all.

also...not a shill at all! i've had both brands. look at any of my previous posts on here and you'll see what i mean. amd's drivers are very, very terrible under all that veneer. i've had my vega 64 since 2018 and there are specific bugs related to that card that STILL exist in 2022. new bugs every driver update. i'm on a driver from february because a couple of the newer ones wouldn't even install; it doesn't end. radeon needs to do MUCH better.

 


Yeah, if you have an Nvidia GPU, go compare an older game (even newer games, tbh) with Quality ambient occlusion, high-quality texture filtering, anisotropic filtering, etc. forced on; you will be dumbstruck. Games like Left 4 Dead 2, Half-Life, GMod, etc., and even MODERN games with bad implementations of these features, look absolutely amazing. Left 4 Dead 2, for example, looks like a modern remaster lmao (not counting the old animations, etc.). Thank me later. These settings also work in every DirectX version, and OpenGL as well.

Meanwhile, on AMD these only function in DX9, and even that barely works as it is. It's about the principle as well; it's just straight-up laziness. Even if these settings went unused, it's always better to have the choice. It just shows a lack of care. It can't be too hard to implement these things like Nvidia has, and if you're going to implement them, HAVE THEM WORK IN THE FIRST PLACE.

 

Seriously though, go on an Nvidia card, put those settings on, and watch every game look 5x better. Compare the two. Then put your AMD card back in and try to cope with the extreme loss in image quality (especially in older games); that's where I'm coming from lmao. I would say do it on AMD, but it's too unreliable, obviously.

Also, look at a game like Apex Legends with no anisotropic filtering.

apex defo has texture filtering. wouldn't shock me in the least if it got broken though.


Not for me, set to 16X in the settings, tried everything. Check arrows.


I even went to the game config files to try and fix it there; no dice. It literally worked fine like three days ago when I had my 2080 Ti, so it has to be some kind of AMD bug.


do you have the AA/AF stuff in radeon settings turned on globally? if so, it might be messing with it. 

honestly even if you have that stuff turned off, it might not have applied properly. certified radeon moment. can definitely say texture filtering works on my end though.

NICE FIND. I thought about trying it with that off, but I told myself there's no way AMD's drivers are so bad that they would break the game like this; surely it would just use the in-game option if it can't apply the override to a DX11 application. I was wrong. AMD drivers are THAT bad.


Hey @B00mBOXX ,

Integer Scaling is under the Display tab in Radeon Settings; you also have to enable GPU Scaling in conjunction with Integer Scaling, but it will cause black borders in most scenarios.

Personally, I find that 50% of games look better on Radeon than on Nvidia, and vice versa for the other 50%. In the former cases the Radeon image has better clarity to me, but I have found shadows, and sometimes anti-aliasing, to be more complete on Nvidia in some games.

It is difficult to pinpoint the issue because, as I said, sometimes it is due to the way Windows 10 runs everything in borderless fullscreen while pretending to the game that it has exclusive fullscreen rights, so I put some blame on Microsoft as well.

Furthermore, I have found, for example, that in Hellgate: London's Steam release the in-game anti-aliasing doesn't work on my RX 480, whereas it does on Nvidia; on my RX 480 I have to disable AA in-game and then enable AA in-driver, likely with Fullscreen Optimizations disabled for the game.

Back to the GPU scaling scenario: make sure you do not tick the "Center Scaling" option, as this will use 2 VSync intervals (when VSync is enabled in-game; although it is a very smooth 30FPS). Otherwise, I do not experience any extra input lag. Furthermore, on a relatively good monitor, the display's upscaling looks just as good as the GPU's, if not sometimes better. The reason to select "Maintain aspect ratio" is so that 4:3 resolutions such as 1024x768 display with black borders and the image does not get stretched to fullscreen.
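To make that last point concrete, here is a minimal sketch (my own illustration in Python, not anything the driver exposes) of the arithmetic behind "Maintain aspect ratio" scaling: the source is scaled uniformly until one axis fills the display, and the leftover space on the other axis becomes the black borders. Integer Scaling is the same idea with the scale factor additionally rounded down to a whole number.

# Sketch of "maintain aspect ratio" scaling: scale the source uniformly
# so nothing stretches; the leftover display area becomes black borders.
def fit_with_borders(src_w, src_h, disp_w, disp_h):
    scale = min(disp_w / src_w, disp_h / src_h)   # uniform scale factor
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    side = (disp_w - out_w) // 2   # left/right border width (pillarbox)
    top = (disp_h - out_h) // 2    # top/bottom border height (letterbox)
    return out_w, out_h, side, top

# 4:3 content on a 16:9 panel, e.g. 1024x768 on a 2560x1440 display:
print(fit_with_borders(1024, 768, 2560, 1440))    # -> (1920, 1440, 320, 0)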

I see what you mean. The way the entire image is rendered seems different to me. It's like, even if Nvidia and AMD had the same in-game settings and control panel settings, the entire image in its base form would be different. AMD looks sharper to me (all other settings equal; but that isn't the case with most titles, and in games where you can't brute-force better AO/AF/texture filtering it can be lackluster), and it's also like the meshes/polygons and geometry are rendered differently.

Yeah, I did some research, and it's known that depending on your GPU/monitor, either one can be slower and/or ineffective at scaling, causing input lag or worse picture quality. I knew I wasn't imagining things: https://www.technewstoday.com/perform-scaling-on-gpu-or-display. That's a whole other set of issues I'm dealing with as well; my sensitivity/input lag changes almost constantly based on what program my mouse is currently hovering over or has open. It's getting quite old.


Nah, their drivers are fine. Software though.... yeah it's pretty bad.

Delete Adrenalin and install the driver only; you'll run into far fewer issues.

14900k • Z790 Apex Encore • 4090 • 2x24Gb-8000 • 1200 Corsair Shift • 420mm Arctic II Push-Pull • Noctua • Fractal Torrent • Guilded.gg/justifiers

Gaming graphics!

Was this a bundled game with your Nvidia GPU? Did it ever occur to you that Nvidia may have incorporated code into that game to make it not work well with AMD GPUs? If not, open your eyes. AMD is just as guilty. They do this for a reason: to get someone like you to bitch and bellyache about how bad the opposition's product is.

How many programming languages are you proficient in? Have you ever written code of any kind? Are you running the AMD card on an Intel motherboard chipset? I run AMD throughout my systems, and I don't experience a lot of the problems that I see on this forum. I honestly think that a lot of the problems are self-inflicted. AMD and Nvidia run on different engines. Take an open-source graphics test, where neither card has the advantage, and compare the two before you start bashing one company or the other. In other words, show proof from a reliable test.

If it ain't broke, don't fix it!

I felt the same way. As more people bring the mess over from the Nvidia forums, I see people go from toxic to understanding that folks here actually want to help, and that all they need to do is spend time trying all the settings, or watch something that explains them instead of reviews that just slide all the sliders to the right. I've been on both forums and I see the difference. Yes, AMD has problems, but they give you the tools to find a solution and tinker, which I personally think is fun; and if that doesn't work, you submit a trouble ticket and wait lol. It doesn't "just work" every time. That being said, I haven't had any problems like the ones I see on the forum in the past few years. Six years ago it was way worse, but people used this place to let others know what works and what doesn't, with tips and troubleshooting to make things better, not to say "this sucks and I'm buying the competition next time because they're #1."

I have seen an increasing number of posts like these that just say this product sucks and the competition is better, and that's the majority of the post, without saying what's wrong or giving game information or anything. That leads me to believe there are more trolls and bots showing up here, and people unwilling to learn the different ecosystem they've now invested in. It takes time and patience. Take into consideration that lots of games have problems, especially older ones (Crysis, Deus Ex) and even popular older ones (Fortnite; yes, it's old). Some games were also unnecessarily stuffed with issues to make it seem like you can't run them, so you need to buy a beefier card.

So all I'm saying is people should learn the plethora of settings and ask the community before talking garbage. I've seen two great suggestions in this thread, and otherwise 50% helping and 50% agreeing that they should just get a different card. Understand that we help each other here first; no trash talk and anger.

this is true, kneejerk reactions help no-one. but i also understand in this case where the user expected feature-parity with nvidia upon switching, and was disappointed.

personally i definitely get lost in the sauce sometimes when it comes to **bleep**ting on these drivers for all their bugs. there are a lot. many of them have stuck around for way too long...LIKE THIS ONE (i had to, this one is unbelievably bad) https://community.amd.com/t5/drivers-software/wattman-still-randomly-applies-a-cpu-overclock/m-p/528....
but i can't really ever say i've been disappointed with the feature-set available. there's a lot of good stuff, like i mentioned before, you just gotta get through all the taint surrounding it.

i can also say personally, from my previous gpu (gtx 780, 2013-2018), that nvidia's drivers are far from perfect. aside from the lacking performance of that card as the years went on, one of the main reasons i upgraded was to get away from some of the driver-related issues. nier automata was and still is completely broken on those cards, black ops 4 did not work at all, all on a card that was ONLY 2 gens old. my vega is also now 2 gens old - nearly 3 (nearly 4 if you count GCN 5.1) - and is absolutely ripping. zero issues with game compatibility so far.

it's not as black and white as we (myself included) sometimes make it out to be in anger. doesn't mean i can't wish for better though.

 

 

Perfectly said. Yeah, I love NieR: Automata very much, but my god, the graphics and stuff were bad. I had to download a ReShade lol.

EFermi
Miniboss

I understand you so well, bro. I also used to run an Asus Strix 2080 Ti, and then I traded it for a 6900 XT because it was like 25-30% cheaper than a 3080 at the time, due to the mining craze. I'm going to switch to a 4080 the first day after launch that it's available in local stores. There's no proper override for anything; it's a mangled mess under the fancy trunk. It was my first AMD card, and it will be my last. Better to pay a 30% premium than use Radeon.

hitbm47
Forerunner

@thanatoast no, he has a good point; AMD has a tendency to not fix things and to focus only on frame-rate results in adverts for new games.

@B00mBOXX I wrote a post on this forum probably about three weeks ago explaining how to get in-driver anti-aliasing working for DirectX 9 games, and it will be the same for anisotropic filtering. I don't know about the ambient occlusion you are abbreviating; my GTX 1060 3GB also does not have an option for that.

It basically comes down to the fact that you have to disable "Fullscreen Optimizations" in the properties of the affected game's executable. This is the case on my GTX 1060 as well, and the mistake is more related to Microsoft forcing borderless windowed mode while letting games think they are running in exclusive fullscreen.

Also make sure to choose the override options in the driver and to disable the corresponding settings in-game (I noticed I had to do this for Hellgate: London), and then restart the game.
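If you have many games to fix, the "Disable fullscreen optimizations" checkbox just writes a per-exe compatibility flag to the registry, so it can be scripted. A minimal sketch in Python; the game path is a placeholder, and DISABLEDXMAXIMIZEDWINDOWEDMODE is, as far as I know, the flag that checkbox writes on Windows 10:

# Sketch: apply "Disable fullscreen optimizations" to one game exe by
# writing the same per-user compatibility flag the Properties dialog sets.
import winreg

GAME_EXE = r"C:\Games\SomeGame\Game.exe"  # placeholder path, adjust as needed
LAYERS = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, LAYERS) as key:
    # The value name is the exe path; "~" starts the flag list, and other
    # flags (e.g. RUNASADMIN) could be appended after a space.
    winreg.SetValueEx(key, GAME_EXE, 0, winreg.REG_SZ,
                      "~ DISABLEDXMAXIMIZEDWINDOWEDMODE")

Restart the game afterwards for the flag to take effect.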

I'll see if I can share my post here later, but it also works for some DX8 games and sometimes DX7 too. I don't think it will work for OpenGL games.

EDIT: In conjunction with this, you can enable Integer Scaling to ensure the game runs at native resolution, preventing the "foggy" picture that upscaling produces, or you can enable Virtual Super Resolution to see if you can run the game at a higher resolution than your monitor's.

Kind regards

My god, thanks for this, especially the last part. Is Integer Scaling the same as GPU Scaling in the Radeon options? I thought I was imagining things, but I've tested it over and over, and turning on GPU Scaling (with aspect ratio intact) makes every game look much better, even if it claims it's already running at 1440p. I notice a significant amount of input latency with this option, though, compared to off. Can't ever win; there's always some trade-off.

 

I thought display scaling only reduced image quality if a game wasn't running at the native aspect ratio? It seems to reduce image quality regardless.

jSoad
Adept II

Welcome to AMD, buddy. This is just the tip of the iceberg. I could stay here all day typing up all kinds of far worse issues that AMD cards have, especially RDNA and RDNA2.
Like you, I'm waiting for the Nvidia RTX 4000 launch to get rid of this dysfunctional card of mine.

@jSoad, this is why I am afraid to spend even more money to upgrade to newer AMD hardware. This makes me wonder if the consoles have completely different driver developers even though they are using AMD hardware.

Because we have barely experienced performance issues on the Xbox One and PS4 base models, which both use AMD hardware. I believe AMD's hardware is great, but there is a clear difference between driver performance on console and Radeon driver performance on PC. In its current state, my RX 480 gets worse minimum-FPS drops in Unreal Tournament 3 than the ATI GPU in the Xbox 360 did, which indicates to me that there must be more caring teams working on the console drivers.

Furthermore, there were even emulators/wrappers released on the Xbox One to patch and play Xbox 360 games on its modern-ish AMD hardware, and in some games it still does better than what an FX 8350 & RX 480 can manage in Unreal Engine 3 (DirectX 9 version).

And the Xbox One had something similar to an HD 7790, with eight Jaguar cores clocked at around 1.7GHz. But I have noticed the Nvidia driver on PC is more even-handed in DirectX 11 games than AMD's Radeon driver; for example, if I use either my FX 8350 or my i7 870 with my GTX 1060, it will scale to all eight threads, whereas on my RX 480 it is 50/50 whether a game will use all eight threads of the FX 8350 or the i7 870 (even when I overclock my i7 to 3.9GHz).

honestly, compared to amd, nvidia seems to do almost everything better, other than power consumption lol. nvidia seems to have far superior settings with geforce experience; i was trying to test my in-game input lag the other day, just to find out that amd doesn't track it at all. also, amd anti-lag is nothing compared to nvidia's anti-lag feature (can't remember the name), which is much better. i hardly notice the difference on amd, whereas with an nvidia card, even a 1050, the difference is very noticeable. i feel like they need to stop pushing useless updates, focus on fixing the current issues, and maybe add features to amd adrenalin that make it worth using. i personally am a fan of amd, but you have to give credit where credit is due: nvidia just does it better, in most aspects.

@ttocchi I mostly agree; then again, there is also the scenario where game devs primarily develop for Nvidia hardware and just make sure it works on Radeon. For example, most of the time when I submit support tickets to Ubisoft, they ignore the fact that my RX 480 8GB is a stronger card than my GTX 1060 3GB and tell me to use the GTX 1060, instead of deferring the issue to the Ubisoft developers so that they can attempt to improve utilization.


that would mostly be due to it only being a small fraction better, whilst the 1060 is the most used card on the gaming market overall per the steam stats etc.; the optimization for that card outdoes any other, hence why they would suggest it. honestly the 8gb of vram means little on that card, and in a lot of cases you would actually probably benefit from using the 1060, purely because of the optimization for those nvidia cards. the only time you would really see a difference in fps is on dx12 while using over 3gb of vram. personally i think switching to the 1060 would be the better option unless you need more than the 3gb of vram.


@ttocchi Actually, the 8GB RX 480 is mostly stronger than the 3GB GTX 1060. The 3GB GTX 1060 has quite a few fewer CUDA cores than the 6GB GTX 1060. I initially bought the GTX 1060 refurbished to use for dedicated PhysX, and it works very well in Killing Floor 2.

Furthermore, the 8GB of VRAM actually makes a massive difference in games that use more than 3GB of VRAM. I have experienced this on the GTX 1060, where it hits 100% utilization at low wattage (utilization from tasks other than graphics processing) when the VRAM maxes out, causing huge FPS drops. For example, Forza Horizon 4 runs at ~40FPS on the GTX 1060 due to the VRAM limitation, whereas my RX 480 does 70FPS and higher. I would play on the GTX 1060 more regularly, but surprisingly the 3GB of VRAM bottlenecks quite often, and I wish I had bought the considerably more expensive 6GB variant instead.

I do agree with you that games are unfortunately more optimized for Nvidia in general, not specifically for the GTX 1060, but I must also mention that many multiplayer games run well on the RX 480, and I actually prefer using the RX 480 for e-sports since I personally experience a quicker response on it.

Furthermore, developers simply recommending the Nvidia card is a very poor attitude; it is unacceptable for them to optimize only for Nvidia when it is not an exclusive market and they have clients reporting serious performance issues. They should rather focus on optimizing for a general standard; then you get good results across the board, as in Battlefield 3-4, Resident Evil 4-7, Metro: Last Light, etc.


i get that, i guess ubisoft is just a bit weird. try disabling game mode in the windows settings; i disabled it and got like a 40-60fps increase on my 6600xt. from experience, something about amd cards seems to make most of them work better without game mode. i had an r7 240 a while back, and disabling game mode took games like overwatch to like 90fps on that thing at 1080p. the vram bottleneck is only gonna get worse as time goes on though, hence why nvidia is making this 1630 card or whatever for people to shift to. hope you find a solution though; i'd honestly just recommend going to youtube, watching videos and testing other people's methods, hopefully you find one that works.

@ttocchi that's interesting, thanks for the tip. I tried disabling game mode recently and it did not improve my specific performance issues, but maybe I'll try it for Assassin's Creed Origins. That game runs horribly on an RX 480, with low utilization.

Other people on YouTube are experiencing the same problems as me in Unreal Engine 3 (DirectX 9) games, and I wrote a post to AMD explaining that it started directly with the release of 17.7.2, where they very likely lost optimizations, since 17.7.1 was the last driver to contain "Radeon Additional Settings", where a lot of Catalyst optimizations were kept. Unreal Tournament 3 drops to 28FPS in areas where it used to do 70FPS.

But I have to say good night for now; have a good weekend, people.


@ttocchi Man, I'm at a major crossroads here. My 2080 Ti consumed 350-370 watts maxed out; this 6950 XT consumes about 400W maxed with barely any undervolt, and in most games I'm at 320-330W at a 2700MHz core clock and 2350MHz memory, with a 144Hz cap.

Much more than double the performance of the 2080 Ti in most games (with Smart Access Memory as well) at the same or even much lower power consumption... I even notice less "raw" input lag. But the lack of support and customization on the software/driver side is really bothering me.

At the same time, it's going to be hard going back to Nvidia, paying $1000 more and having my legs burn off from the heat for around the same or EVEN LESS rasterization performance. A true dilemma. I have around 20-27 days to decide whether to return this or keep it.

They seem to be taking the driver/software side more seriously lately with the boost to DX11; MAYBE they'll fix the other things later too?

the dx11 thing is pretty sick but realistically, should never have been needed in the first place if you think about it.

i've had my vega since late 2018 and, in terms of the card itself, it's done everything i've asked of it. runs everything i throw at it, tons of room to tweak and mess with it, fun to work on...but the drivers are just all over the place.

for every genuinely good feature, there's like three bugs to go along with it. the fact that you have wattman as a built-in overclocking tool is AMAZING...when it works. radeon image sharpening is GREAT...if it actually applies (it used to be much worse). ReLive is MUCH better than shadowplay in my experience...when it actually decides to work. it's been like this forever. the only difference is the UI.

if you do end up keeping your card, find a driver that works (well enough) and stick to it for as long as you can, until there's a new feature that you deem worthy of the risk of updating.

@B00mBOXX what I can say is that AMD actually seems to care a bit more about those Navi 6000 series cards than they did/do about us Polaris users in terms of driver compatibility.

Furthermore, AMD is always going to be a bit more technical to use than Nvidia, but when it actually works well in a scenario, I prefer it over Nvidia; on the other hand, Nvidia works well more often.

Personally, I think AMD's user interface is better at this stage than Nvidia's. The major bothers to me are their driver performance in 50% of scenarios and their unwillingness to add features or notice bug reports; for example, they still do not have a half-refresh-rate VSync option for fixed-refresh displays, and the Chill or FRTC options are not sufficient, since FRTC does not work with VSync and Chill stutters when set to half the refresh rate.

@B00mBOXX two more highlights I can give you for your decision on AMD:

- Personally, Virtual Super Resolution looks a bit better than Nvidia's Dynamic Super Resolution.

- Open source seems to be picking up on AMD; old R9 200/R9 300 users can run relatively up-to-date drivers via the modded NimeZ drivers. In addition, OpenGL is apparently performing acceptably on Linux for AMD, and DXVK also brings some improvements for AMD; I think you can possibly force anti-aliasing and filtering options in the DXVK wrapper's config file (see the sketch below).
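For what it's worth, roughly what that looks like: a minimal dxvk.conf sketch. The option names are from DXVK's sample dxvk.conf as I remember it, so double-check them against the DXVK version you are using:

# dxvk.conf -- place next to the game exe, or point DXVK_CONFIG_FILE at it.
# Override the anisotropic filtering level the game requests:
d3d9.samplerAnisotropy  = 16
d3d11.samplerAnisotropy = 16
# If memory serves, there is also an option to force MSAA on the
# D3D9 swapchain, e.g.:
# d3d9.forceSwapchainMSAA = 4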

@hitbm47 I hate using the dynamic resolutions; they just look inferior to supersampling and native, with an almost artificial, grainy look. I wish actual supersampling could be applied to any game, instead of being a developer-dependent addition that some games have and others don't.

Guess there's something preventing this from being forced from the driver side? Seems like it'd make sense to allow that by now. 

 

Yeah, it really just bothers me that some older games (and even MODERN ones) just look so much worse than on Nvidia; the contrast was stark, and I have no freedom to change it. Ambient occlusion and good texture filtering are a must. Even AMD's default LOD bias seems to be inferior in certain games. Nvidia spoiled me with its brute forcing/reliability.

Do you think they'll ever remedy these things, or is it a lost cause? I wonder: are they just too lazy/uncaring, or is it some underlying limitation where they'd have to really dissect the software and architecture to fix it?

I don't see them addressing/expanding/adding those features any time soon, if at all.

CERTAINLY not Ambient Occlusion.

I have to ask, though, at the risk of sounding like a **bleep** like I did earlier: does it matter how an older game looks?
Surely forcing something like AO through a driver has caused less-than-desirable results? Similar to doing the same thing through ReShade, for example, where it only has access to the depth buffer, and therefore the AO can be seen through transparencies and sometimes looks terrible. And what modern game needs AF and AO forced into it? What games truly look that different on AMD vs Nvidia? I'm just not seeing it.

EDIT: I say this because I've seen people say the opposite in the past, that games on AMD cards look better, for a number of reasons. Not saying one or the other is true, just that claims like this crop up all the time without any actual evidence behind them, just conjecture.

@thanatoast Visual quality doesn't seem to matter to you; you are in the minority. It's not conjecture, it's simple fact. I don't think you really know what you're saying in regards to driver-forced graphical settings. I'm sure Nvidia knows what they are doing, since they are the "#1 graphics" manufacturer and games are literally designed to run on their hardware. Comparing them to some open-source, fan-funded software like ReShade is illogical.

Also, most of the "AMD looks better" claims I've seen are in relation to color compression, not textures or other graphical algorithms/procedures.

I've been using Nvidia for 11-plus years; my perception and brain have become accustomed to the visual "bias" of Nvidia, so switching to something "inferior" in certain cases is going to be noticeable. I'm not biased towards Nvidia; I wanted AMD to be better in every aspect. You think I want to go back to having my leg hairs singed and paying 2000 dollars for similar performance? No.

Again, go put an Nvidia card in and force AF, AO, texture filtering, etc. in a game like Left 4 Dead 2, or even in modern games that have bad implementations of these things. You will understand immediately. Tomb Raider, for example, just looks bad; the textures just aren't sharp on this card compared to Nvidia, even with better in-game settings lmao.

It's also not simply about the graphics; it's the fact that Nvidia allows you to CHOOSE, and it's consistent: you can apply all of these features in any API (for the most part; there are exceptions), while AMD claims DX9 only, and it's hit or miss with everything; even some DX11 games seem to be affected. If I think a game looks **bleep**, I should be able to force graphical settings, like Nvidia has always allowed.

You come off as the type that will go into a config file and put your LOD bias at +8 to get better performance, regardless of how the game looks. I feel like when you had Nvidia you never even touched the control panel settings haha. That's not me. I want both, and that's what I should get if I pay a premium.


I'm really, GENUINELY just trying to understand your point of view, not trying to attack you or say you're wrong. I just kinda don't understand where you're coming from. And I would say that MOST people do not care about forcing stuff like this at a driver level. Certainly nobody I know, anyway.

You brought up L4D2 specifically. Played it on 360 when I was younger. Played my fair share on PC. It's a Source game from over a decade ago. To me, it looks as it should for that age and engine. Doesn't come with AO natively, this much I know. Can see why you'd want to force it via the driver. I never did though, because I just...don't care? I don't know how to properly word it without it seeming insanely simple, but like, I'm not gonna get twisted up over how L4D2 DOESN'T have AO or something like that. A popular case I saw for the AO stuff was CSGO, where you could force it on and see the AO through smokes...which was cheating.

I'd rather just play the game. But saying that I don't care about graphics is EXTREMELY untrue.

Either way, you're starting to sound like a right snob again. Chill out a bit, yeah? None of this is that serious.

 

"You come off as the type that will go into config and put your lod at +8 to get better performance, regardless of how the game looks" also what the **bleep** does this even mean lmfao, jesus christ brother. (please dont explain i really dont give a **bleep**)


@thanatoast Let's not forget you verbally came at me first, attacking me like a with your first post, because you simply lacked understanding of my situation/complaints.

Again, like I said, it's not just the graphics that I care about; it's the principle. If I'm paying premium money for a so-called direct competitor, I want all of the features of the competitor, and I want them to be reliable. That's what I'm paying for. Premium money for a premium product.

Also, let me give an example: imagine you went from a 3080 to a 1060; the stark difference and decline in performance would be extremely noticeable, would it not? Now take another example: say you've never experienced a 3080 and your best was a 1060. All you'd know is that the 1060 is the best performance you've ever had; you don't have anything better to compare it to. This example is relevant to so many things in life.

Now, I've been forcing these settings in the Nvidia control panel for a decade; I come to AMD, and these settings are unreliable and barely work. (Like was said above, though, perhaps it's some driver bug, because close-up textures look so much worse in most games than on Nvidia, like the LOD is bugged; I don't think texture filtering quality alone would have that huge an impact.) So the baseline is Nvidia, and the huge drop/contrast in visual quality is extremely noticeable and stark because I've experienced better beforehand. In your case, if you've never experienced better, then all you know is how AMD looks. That's all you have to compare. That's the "height" of your experience.

 


Come on, guys, it is really not necessary to curse and sling insults in your comments.

I do agree that the current options in the Radeon settings should be improved to work at least in all DX8-DX11 games.

Furthermore, ambient occlusion would be cool, but it is a bit excessive in the sense that you are starting to add graphics to a scene instead of smoothing out existing rough edges. For that, one might rather consider using ReShade.


*Attacking me like a **bleep**ing clown


how was i attacking you in that previous comment...at all? let's forget that though.

long and short of it: amd won't be addressing this any time soon. if they're only just fixing dx11 on rdna2 with rdna3 right around the bend? i just don't see it happening. those settings have been untouched for a really long time, and i don't see that changing.
