

CrossFire vs SLI – Are They Worth It?

Tags: amd, crossfire, nvidia, sli

Getting two or more graphics cards to do one job sounds like a great idea in theory, but in practice it may not yield such great results.

A host of issues has plagued both AMD's CrossFire and Nvidia's SLI. Both are valid approaches to the multi-GPU idea, yet neither company can really claim to have solved those problems properly.

When these technologies made their debuts in the mid-2000s, many were excited about the implicit promises they made. However, the lukewarm start both companies gave their respective multi-GPU solutions would, unfortunately, persist throughout their entire lifespans.

CrossFire was officially put to sleep in 2017, and while SLI still technically has a pulse, it has been replaced by NVLink or simply by common sense, depending on who you ask. Even though Nvidia's SLI upgrade does offer a massive improvement, one could argue that previous generations soured the tech world on the idea.

0 Likes
8 Replies

I have a GTX 690 in the junk box, so a pair of them would be quad-SLI.

Back when the card was more of a thing than it is now, it was great for playing games of the era.

So now I use high-end cards and make do with one card.

0 Likes

Disagree with their conclusion. Since DirectX 12 and Vulkan killed CrossFire and SLI, instead making multi-GPU a feature of those APIs that requires explicit developer support, the number of games that even work properly with multiple GPUs has dropped dramatically. See Tom's Hardware's article citing Uniko's Hardware's 5700 XT + 5600 XT mGPU test.
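To make concrete what "explicit developer support" means here, this is a minimal, purely illustrative sketch (not from any of the articles in this thread) of how an engine has to opt in to multi-GPU under Vulkan 1.1's device group API:

// Minimal illustration: with explicit multi-GPU, the application itself must
// enumerate device groups and decide how to use them. No driver-side AFR magic.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;           // device groups are core in 1.1

    VkInstanceCreateInfo instInfo{};
    instInfo.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    instInfo.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&instInfo, nullptr, &instance) != VK_SUCCESS) return 1;

    // Ask the loader which GPUs can be linked into a single logical device.
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount);
    for (auto& g : groups) g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i)
        std::printf("device group %u: %u GPU(s) linkable\n", i, groups[i].physicalDeviceCount);

    // A real engine would now create one VkDevice spanning the group and split
    // the work (alternate frames, split frames, etc.) by hand.
    vkDestroyInstance(instance, nullptr);
    return 0;
}

None of that happens unless someone on the game's side writes and tests it, which is exactly why the list of titles with working mGPU keeps shrinking.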

So their conclusion is misleading; see also GamersNexus's conclusion from their 2080 Ti SLI test. And going further, since the original article talks about 4K, I'll point to TechPowerUp's 4K chart and the relevant part of their conclusion as well.

I use an Nvidia example exactly because of what they state in their conclusion: Nvidia's developer relations are solid, and have been for years with the TWIMTBP program, so developers in their pocket are more likely to support SLI/mGPU than CrossFire.

So just to clarify for anyone who may not be technologically adept and comes across this thread...

DO NOT USE CROSSFIRE!!!!

0 Likes

Not knocking you or anything, just clarifying and giving a TL;DR for the surprisingly large number of people who seem to think buying 2 RX 580s is worth it.

0 Likes

A pair of RX 480s/580s is not solving my problem of gaming at 3840x2160; that is a job for the RTX 2080, which has more performance for that kind of work.

I will wait and see what surfaces, if and when it does.

SLI is limited to the very high end of Nvidia's hardware.

0 Likes

No worries, I get what you are saying. I think it did a pretty good job of dispelling the benefits before that point, too. I like articles like this because too many people waste their money and time thinking it is a valid path.

I especially liked how they pointed out that AMD pretty much dropped support, as many of us have said, back in 2017 when the Wattman drivers came out, yet they still claim support. The current drivers don't even have the profiles, according to a couple of reports in the forums; I think Ray was passing that along to engineering. There are still several games that support SLI or NVLink if you have the money to burn, but it just isn't worth the headaches it brings in other areas.

0 Likes

The real "problem" is the real possibility that  nVidia's Ampere based replacement for the 2070 Super will have the same power as the 2080 Ti. It would only have to be 25-30% faster to match that speed, and considering both the node shrink as well the architecture revision from first generation Turing, that's not anywhere out of the realm of reason. The same thing applies to AMD's RDNA2 based card which will replace the 5700XT, and AMD is really touting those kind of performance improvements when they tease out information about it, so the new baseline by the end of the year is that a card costing $500 or less (much less if AMD's claim of lower prices holds water) is going to be a solid 1920x1080 144fps, 2560x1440 75fps, and 4k60 card. If the same performance improvement figures apply to the 2080 Ti's replacement, say it's 30% faster and say it retails for about double with the (presumed) 3070 retails for, you're not going to have the argument for "use 2 inexpensive cards to match the performance of an expensive one" argument because of imperfect scaling and cost, as well as the inherent issues of using multiple GPUs, such as microstuttering. 
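To put rough numbers on the "imperfect scaling" part of that, here is a small illustrative calculation; the prices, frame rates, and the 75% best-case scaling factor are made-up assumptions to show the shape of the argument, not benchmark results:

// Illustrative only: hypothetical prices, frame rates, and a hypothetical 75%
// best-case mGPU scaling factor -- made-up numbers, not measurements.
#include <cstdio>

int main() {
    const double cheap_fps  = 35.0;   // assumed: one mid-range card at 4K
    const double cheap_cost = 250.0;  // assumed price per mid-range card
    const double fast_fps   = 60.0;   // assumed: one high-end card at 4K
    const double fast_cost  = 500.0;  // assumed price of the high-end card
    const double best_scale = 0.75;   // assumed best-case mGPU scaling

    const double dual_best  = cheap_fps * (1.0 + best_scale); // game supports mGPU
    const double dual_worst = cheap_fps;                       // game ignores the 2nd GPU

    std::printf("two mid-range cards ($%.0f): %.0f fps best case, %.0f fps when a game\n"
                "has no mGPU support, plus frame-pacing/microstutter risk\n",
                2.0 * cheap_cost, dual_best, dual_worst);
    std::printf("one high-end card   ($%.0f): %.0f fps in every game\n",
                fast_cost, fast_fps);
    return 0;
}

Same money either way in this toy example, but the dual-card setup only ties in the games that bother to support it and falls far behind everywhere else.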

Another big drawback to multiple GPUs since the beginning is that lower-end cards are always handicapped in some way compared to the higher-end ones, be it memory interface, capacity, overall bandwidth, back-end cuts, etc., and that is really apparent on the Navi product stack. This isn't as much of a problem at 1920x1080, but at 4K you need the bandwidth and capacity, or those frames physically cannot be generated because the memory just isn't there to store them. It's pretty logical to assume mid-range next-generation cards will have at least an 11/12 GB VRAM model given their power. Even assuming mGPU technology is implemented perfectly and the 6 GB of VRAM on a lower-end card scales perfectly into a 6+6 = 12 GB shared framebuffer the way it's supposed to, you still don't get the 384/512-bit interface of the higher-end model; you're handicapped to 192/256-bit, which hurts performance on top of the imperfect scaling and microstuttering.
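For anyone wondering why the bus width matters so much here, a quick back-of-the-envelope bandwidth calculation; the 14 Gbps GDDR6 data rate is just an assumed example speed applied to every bus width:

// Rough GDDR bandwidth math: bus width (bits) / 8 * effective data rate (Gbps).
// 14 Gbps GDDR6 is used as an assumed example speed for every bus width.
#include <cstdio>

int main() {
    const double data_rate_gbps = 14.0;            // assumed per-pin data rate
    const int bus_widths[] = {192, 256, 384, 512}; // common memory bus widths

    for (int bits : bus_widths) {
        double gbytes_per_s = bits / 8.0 * data_rate_gbps;
        std::printf("%3d-bit bus @ %.0f Gbps -> %.0f GB/s\n",
                    bits, data_rate_gbps, gbytes_per_s);
    }
    // 192-bit: 336 GB/s, 256-bit: 448 GB/s, 384-bit: 672 GB/s, 512-bit: 896 GB/s.
    // Each card's interface stays at 192/256-bit even if the framebuffers combine,
    // which is the handicap described above.
    return 0;
}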

And this is where Radeon Image Sharpening and DLSS really come into play. Nvidia is really pushing DLSS 2.0, as they should, because the way they describe it, it's much more accessible to game developers and should drastically increase adoption. AMD has a version of this in their developer tools which should be a true successor to RIS and match DLSS's functionality. Yes, you may lose some image quality, but if you're targeting ultra-high refresh rates, 144 Hz and 240 Hz, at 4K resolution, software solutions like these really are the key. And realistically, aside from the e-peen factor of showing off a benchmark result, given how the base quality of games has increased along with texture resolutions, the details that get cut to dramatically increase FPS, such as shadows and reflections, may not even be that noticeable.
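The arithmetic behind why upscaling helps so much is simple pixel counting; the internal render resolutions below are typical examples I picked for illustration, not modes that DLSS or RIS specifically guarantee:

// Why render-low-then-upscale wins so many frames: shading cost tracks pixel
// count. The internal resolutions below are example values, not fixed modes.
#include <cstdio>

int main() {
    const long out_w = 3840, out_h = 2160;                    // 4K output target
    const long internal[][2] = {{2560, 1440}, {1920, 1080}};  // assumed render sizes

    const double out_pixels = double(out_w) * out_h;
    for (const auto& r : internal) {
        double px = double(r[0]) * r[1];
        std::printf("render %ldx%ld, upscale to 4K: %.2fx fewer pixels shaded\n",
                    r[0], r[1], out_pixels / px);
    }
    // 1440p -> 4K shades ~2.25x fewer pixels; 1080p -> 4K shades 4x fewer.
    // That headroom is what makes 4K high-refresh targets plausible on one GPU.
    return 0;
}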

I hope that makes sense, typing this after a bad night's 3 hours sleep and a long day.

Thanks, what you said is spot on to me. If a 3070 is similar to a 2080 Ti and I can get one at 500 bucks, I would be a buyer. DLSS was not good when it first arrived, but in reality they fixed it really fast, and I use it in all the games I have that support it. DLSS and RTX are worth using even on my lowly 2060; on my 2070 Super they shine. I hope more cards exceed the 8 GB point, but I suspect AMD will be far more likely to provide this than Nvidia, which has been stingy at times on the RAM. I hope RDNA2 is a powerful competitor, but it has to have good drivers too, and I see that being the big question everyone will ask.

0 Likes

True with the 580s. You would be better off selling the one you have and putting that toward your next card purchase. I personally think now would be a good time to start unloading GCN cards. If Big Navi is even halfway successful, I would not be surprised if GCN support is dropped within a year or so after RDNA2 drops.

0 Likes