I have just got 2 Vega 64 cards............when will Crossfire be supported?
Ok, just got a response from AMD.
"Thank you for the email.
I understand you are not having Crossfire option in Radeon settings with 2 Radeon RX Vega 64 cards.
Our driver team is currently working on it and this will be in future driver releases. As of now Crossfire option is not available for Vega cards. I do not have any information on ETA. Please subscribe to the below link to get notified about upcoming technology announcements:
Thanks for contacting AMD"
But.. multi-GPU and Crossfire should work now.
It should? I have no Crossfire option in the Settings. I'm on version 17.8.1.
Have you read this? Radeon™ Software Help Center | AMD
This link says the Crossfire is under "Global Settings" : AMD Radeon Vega Frontier Edition CrossFire Testing | PC Perspective
"Installing and enabling CrossFire with our Radeon Vega Frontier Edition hardware was as simple as we would expect. The current driver from AMD’s website was used, and in both the Game Mode and the Professional Mode, the CrossFire option exists under the Global Settings."
A query for amdmatt or @ray_m
This is interesting.
AMD Distancing from CrossFire with RX Vega | GamersNexus - Gaming PC Builds & Hardware Benchmarks
Yeah, I am wondering the same thing. I would understand if the Crossfire scaling performance was bad, but not having the option available at all is disappointing. One of my Vega 64s is just gathering dust right now...
Yeah, I saw that......but the Frontier Edition driver doesn't work with the RX Vega, and the Crossfire option is not there in the RX Vega driver.
Just wondering...but why would you buy two of the cards for X-Fire before discovering whether or not X-Fire would be supported on Vega? But maybe you can use the other card in another machine--or refund it, etc. Another advantage to waiting for support besides the support itself *cough* would be that you might pay a lower price for the second card a few months after Vega hits mainstream quantities...Just thinking out loud.
As D3D12 directly supports X-Fire/SLI in the API itself, this would seem a strange time to think about dumping X-Fire support. Right now X-Fire requires custom IHV work in the drivers for all the games that support it, but in another few years X-Fire should find its way into D3D12 and later game engines, alleviating the need for custom driver support going forward. All of that is theoretical as of now...;) As individual cards become more capable and better performers, though, I'd imagine the overall demand for X-Fire will continue to drop.
My reasoning for buying 2 cards was simply due to shortage.....nothing more. I was able to obtain 2 (which was extremely lucky), so I ordered 2.
I am fully aware of how new these GPUs are and am happy to wait for the drivers to mature......my question was simply whether Crossfire will be supported or not, as I have heard the same rumour about AMD dumping Crossfire, which would be silly.
If it won't be supported for RX Vega, I certainly have a use for the 2nd card.....don't worry about that. Cheers.
Seems like I heard similar comments last year when Nvidia introduced the GTX 1060. SLI/Crossfire market share should keep on slipping as graphics cards in general get more powerful. There are legitimate reasons to have/need two graphics cards....but for 99.9% of the people in the world it's a waste of money, and Crossfire/SLI have always had issues. If you took all the posts in the driver forum over the years, the largest percentage would concern Crossfire.
Every manufacturer has listed crossfire support + it even says so on the bloody box mate.
I have attached a screenshot from the MSI site for their Vega 64 Black edition.
It's great that you know how to read and use MS Paint.........I too can read and had done so prior to posting my question.
Just because it's listed on the MSI website and the Box does not make it an option in the Radeon Software. MSI don't make the drivers and Software.
And before you say something about my configuration, I tried the Beta Mining drivers where the option for Crossfire is available. It's the current Vega software that does not have the Crossfire option.
The power draw relative to performance on Vega 64 is looking too high at the moment, which is possibly why Crossfire DX11 has been "de-emphasised", along with the comments I read about developers not getting MultiGPU DX12 running before the final RX Vega launch.
Testing on Prey has shown that running a single RX Vega 64 Liquid Edition in Turbo mode pulls 569 Watts of power, based on the limited testing time available to the reviewer. The reviewer ran the testing at 2K resolution because he does not feel a single card is powerful enough to run at 4K.
The review is here:
The RX Vega Liquid Edition Review - YouTube
You can probably double that number as a rough estimate of how much two RX Vegas in Crossfire at 2K resolution could take, provided the game scales well in Crossfire. The CPU use will likely increase as well, assuming you had a CPU powerful enough, with enough threads.
At a first guesstimate, running two RX Vegas on Prey in Crossfire at 2K would likely pull 1.14 kW of power. You would need, at the absolute minimum, a single Corsair AX1200i 1200W power supply, but to be honest that seems too close to the edge for me. You would probably need to go out and find a new 1500W power supply to be safe.
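As a quick sanity check of that estimate, here is a minimal sketch. The 569 W figure comes from the Prey test cited above; the simple doubling and the ~80% sustained-load rule of thumb for PSU sizing are my own assumptions, not anything measured.

```python
# Rough power-budget sketch for two RX Vega 64s in Crossfire.
# 569 W is the single-card system draw from the Prey test cited above;
# simply doubling it is a pessimistic guess assuming good Crossfire scaling.
single_card_system_watts = 569
crossfire_estimate = 2 * single_card_system_watts
print(f"Estimated Crossfire system draw: {crossfire_estimate} W")  # 1138 W, ~1.14 kW

# A common rule of thumb is to keep sustained load under ~80% of the PSU rating.
for psu_watts in (1000, 1200, 1500):
    ok = crossfire_estimate <= psu_watts * 0.8
    print(f"{psu_watts} W PSU: {'enough headroom' if ok else 'too close to the edge'}")
```

Doubling overcounts, since the non-GPU parts of the system are included twice, so the real figure would likely be somewhat lower; even so, it supports the conclusion that a 1200 W unit is borderline and a 1500 W unit is the safer choice.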
Prey was demoed running in Crossfire at Computex though so ... clearly this sort of power draw is fine, and Crossfire DX11 is not dead.
More info here:
Maybe the power draw of the GPU reduces at 4K? I do not know. You could always try to limit frame rate in FRTC I guess.
There is always Radeon Chill as well to save some more power, but it currently shows problems in The Witcher 3 in Crossfire for me, and to be fair, Chill Crossfire support has only just arrived.
I can understand that no new games are supposed to be coming out using DX11; Prey may well be the only new game on DX11, and all new AAA games may well be on DX12 and Vulkan in the future.
However, from what I have heard, MultiGPU DX12/Vulkan will have to work moving forward, as connecting many smaller GPUs together on a single substrate is needed because of the low yields of large GPUs on newer small-geometry processes, and I think the Navi "Scalability" entry on the roadmap is referring to this.
Hope this helps.
Thanks for your in-depth reply. That was extremely helpful :-)
That's some serious power draw......I have a 1000 Watt Corsair which certainly won't cut it if the draw really ends up being that high.
I'm hoping that through driver maturity they may be able to address the high power draw......but it certainly makes sense as to why Crossfire is not an option currently.
You should not take my answer as the truth.
Really someone from AMD Support should be answering this for you.
You should take my answer as what I am assuming based on a lot of investigation into what happened with this Vega GPU launch. I have been briefly in contact with some reviewers/developers, attended livestreams, watched many reviews, and followed this launch in great detail. I wanted to purchase an RX Vega 64 at launch, but I am not in a position to make the purchase because, quite frankly, I am stunned at how bad the performance versus power draw is on these cards at the moment. The compute performance on the RX Vega/FE and the introduction of access to Radeon Pro software for "Creators" is great. The advertised launch price was great. FreeSync versus G-Sync is great.
However, given the performance/power draw at the moment, the cost-of-ownership difference cannot be ignored, especially if you want to run Vegas in a MultiGPU setup. I would have to either purchase another Corsair AX1200i just to run the extra card(s), or replace the existing AX1200i unit with a larger 1500W power supply (probably from Corsair, because my systems need to run 24/7 and downtime for me = no $$$). I am not a miner... before you ask.
So, given all of this, the poor performance per Watt means a higher cost of ownership and possibly an extra 400-500 spend on a new PSU in the future if I manage to purchase a second RX Vega 64 Liquid... (I can't even get my hands on the first one even if I were confident enough to make the purchase; they are still on pre-order).
Now... having said all of the above, it is not all bad news on the power versus performance front. I have been following what is happening with people who are trying to reduce the power draw on the RX Vega/Vega FE and also what overclockers are up to.
You may look for my other posts if you want more detail on this but a summary of what I think is happening could be this.
My theory is that the HBM2 chips supplied for Vega came out at a lower speed than expected. The Vega architecture's performance seems to be highly dependent on the HBM2 speed (frequency). To compensate, the GPU clock speed may have been increased to improve performance, at a serious cost in terms of power draw. HBM2 power draw is very low: increasing HBM2 speed costs an insignificant amount of power versus increasing the GPU core clock.
If I am correct (it is a big if...), then it should be easy in a future release of Vega to replace the existing HBM2 with faster HBM2, reduce the GPU clock, and improve the performance versus power draw significantly. This is why I am still very interested in Vega.
There could be other reasons why the performance/power draw is not as good as hoped for. The process used to fabricate the Vega GPU might have come out at an unlucky process skew (the silicon lottery), or there might be some circuit design issue/improvement needed... maybe more work is needed to reduce clock tree power consumption. I have no way to know.
Also, the Draw Stream Binning Rasterizer is new and might not be working correctly yet. It is supposed to help power/performance.
All of the above will not help anyone buying a Vega GPU at the moment though ... there is always the possibility of something better around the corner.
Back to improving Performance Versus Power on Vega 64 right now based on what I have seen so far.
My recommendations, based on everything I have managed to glean from reviews and the Internet... which may be total rubbish and is obviously based on a very small number of samples from multiple sources, are as follows.
Edit: I forgot to mention... before doing any overclocking/undervolting you should set your Power, Temp, and Fan Targets to maximum initially, as a starting point. Start with the core and memory clocks and voltages set to default. Then maximise the Power Target; that will give you the largest jump in performance. See if you are happy with that and measure the power draw. If you have a UPS (e.g. an APC Back-UPS Pro 1500) or a cheap (but accurate) plug-in power meter, that is probably the best way to check. Also, some "intelligent" PSUs like the Corsair AX1200i have software (Corsair Link, for example) which gives a pretty accurate representation of the power draw into your PC.
(1). Don't run it in Turbo mode at all, run it in Balanced mode or Power Saver Mode out of the box. If you watch the Adored TV Video above you will see in Prey how little additional FPS is achieved by going from Balanced to Turbo Mode. A few FPS for a massive jump in Power Draw.
(2). Currently Wattman is reported to be very unstable and difficult to use but ... if you can get it to overclock, run the HBM2 Memory clock as fast as possible.
(3). Again, Wattman issues aside, undervolt the GPU at the same clock frequency and then see if you can increase the GPU clock a little. The power savings you can get are not insignificant; I have seen reports of a 40-80 Watt power reduction for the same or better performance level. Terms and conditions apply: you will have to apply your overclock on a per-application basis; it will not work on everything, and it might not work at all on some titles.
(4). Here are some rough figures for performance improvement versus clock increase for the HBM2 memory and the GPU core clock (GCLK). (A). Increasing the HBM2 memory clock is most effective: for every 3% increase in HBM2 clock you get about a 1% performance increase for an insignificant increase in power. (B). Increasing GCLK seems to behave as follows, depending on the HBM clock:
With GCLK running between 1600- 650MHz, you see a 0.6% performance improvement per 1% increase in GCLK.
With GCLK running between 1600-1650MHz, you see a 0.68% performance improvement per 1% increase in GCLK.
Again, from the above it appears that pushing the HBM2 clock up also improves the benefit of increasing GCLK.
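To make those rough figures concrete, here is a toy model. The additive, independent treatment of the two contributions is my own simplification of the community-sourced numbers above, and the function is purely illustrative, not anything from Wattman or AMD.

```python
def perf_gain_percent(hbm_increase_pct, gclk_increase_pct, gclk_scaling=0.6):
    """Estimated % performance gain from combined clock increases.

    Uses the rough figures quoted above: ~1% performance per 3% HBM2
    clock increase, and 0.6-0.68% performance per 1% GCLK increase
    (pass gclk_scaling=0.68 for the higher HBM2 clock case). Treats
    the two contributions as independent and additive, which is a
    simplification.
    """
    hbm_gain = hbm_increase_pct / 3.0     # ~1% perf per 3% HBM2 clock
    gclk_gain = gclk_increase_pct * gclk_scaling
    return hbm_gain + gclk_gain

# Example: +9% HBM2 clock and +5% GCLK at the lower scaling figure.
print(perf_gain_percent(9, 5))  # 3.0 + 3.0 = 6.0
```

The takeaway matches the advice above: the HBM2 clock is the cheaper lever, since its power cost is negligible, while GCLK increases cost disproportionate power for each percent of performance gained.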
I really hope this helps you out. I really want to see Vega be a success, and I wish I could get my hands on one to do some testing. Prices of the cards are high at the moment, or they are on pre-order at reasonable prices. If I were as lucky as you to own two of them, and I had the time, I would probably apply to the AMD Vanguard program and try to help out that way if accepted.
You might want to try that out: https://gaming.radeon.com/en/radeonsoftware/crimson-relive/vanguard/
Bye.
Personally, I think it's a brilliant idea that Crossfire is not available (cut out the miners given the limited availability) and let the majority of cards get to general gamer users.
Hopefully the Vega 56 is also Crossfire-blocked for some time.
Crossfire DX11 is 3D Graphics technology for gamers.
I don't think Crossfire helps mining; I think it may hamper mining performance. Maybe there is something new about Vega that needs Crossfire turned on when mining, but I am not aware of it.
Even so... if you are worried about Crossfire users using more than one GPU, I think that is insignificant versus the number of GPUs used in mining.
In reality most Crossfire DX11 users will only run with 2 cards, as scaling with 3 or 4 cards has rapidly diminishing returns and most games are not optimised for it. Three or four way Crossfire is rare, and where it is run I think it is usually for synthetic benchmarks like 3DMark Firestrike/TimeSpy etc.
If you want to run an RX Vega in AAA titles on a 4K display on High or Ultra settings, you are likely going to need a pair of RX Vegas running in Crossfire DX11 or in MultiGPU DX12, especially if you want to run on some of these very high refresh rate FreeSync monitors. Understood, those are very high end specs, but the RX Vega is a high end GPU.
Crossfire has nothing to do with mining.......when you are mining you have to disable crossfire.
Fully understand.....thanks. What you have said make sense to me.
Ultimately we can only wait and see what AMD does I guess.
I appreciate your in depth, detailed responses. :-)
Well, if MSI, Asus, Sapphire etc. list this on their sites and on their product boxes, they probably got their information from AMD, don't you think? And if AMD says that their product will support it (which they have, and they have shown RX Vega in Crossfire), but then don't make the option available on their products, I would say that is a good reason for them to be sued both by the manufacturers and by the consumers. It's simply misleading if that were the case. Nvidia were sued for far less when it came to the GeForce GTX 970 4 (3.5) GB scandal.
I do believe that Crossfire will be available; I am not worried about that. What is a little disturbing is that we are 1 week into the launch of Vega 64, and AMD is being very quiet about it when asked (not only here, but by tech sites, reviewers etc.). A simple statement, or mentioning that it will be available at a later date, would simply calm things down.
And on top of that, there is no real reason why the Vega FE has had Crossfire available since launch (and the mining drivers, as you mentioned) while the gaming-focused RX Vega does not, even if they are on completely different driver branches. That's just the driver team(s) being lazy; the 290, Fury and Polaris series have all had Crossfire available with the launch drivers (if I'm not remembering this wrong).
The power thing I am not really buying, if that is AMD's reasoning. The Vega FE is just as power hungry and has Crossfire in its drivers.
Yes, I agree......it is very deceiving and I have sent MSI an email regarding it.
I also believe that Crossfire will be available.......Vega is brand new and so are the drivers.
I will be patient and wait.
Yes, you are right......Polaris had Crossfire available, as I purchased two RX 480s on launch. It was extremely buggy......but the option was there.
Whereas now it is not, which leads me to believe they have some tweaking and refining to do.
RE: The power thing I am not really buying, if that is AMD's reasoning. The Vega FE is just as power hungry and has Crossfire in its drivers.
Did you see this review: AMD Radeon Vega Frontier Edition CrossFire Testing | PC Perspective. I am really pushed for time, but maybe you can have a look and see what PSU they tested it with. They do not seem to mention power consumption in Crossfire, though.
In any case, I believe that RX Vega 64 gaming performance has improved significantly versus the performance of the Vega FE.
The Vega FE was only performing between a GTX 1070 and a GTX 1080, I think, whereas the RX Vega 64 (Air) is now roughly trading blows with the GTX 1080. The RX Vega 64 (Liquid) looks to be better than the GTX 1080, but probably about 15-20% slower than the GTX 1080 Ti when run in Turbo mode. I don't know how they did that. Do you?
Is it just the Racing Red colors on the logo on side of the card?
It might be that the DSBR was turned on in the RX Vega 64 drivers (it was off in the Vega FE) and it gives a significant performance uplift and power reduction, as intended. In that case, though, the power consumption on the RX Vega 64 should be lower than the power draw seen on the Vega FE at the same clock speed in the same application, and the FPS should be higher on the RX Vega 64 than on the Vega FE. What we really need to see is the Vega FE running Prey on the original drivers.
It could be that the clocks run faster on average on the RX Vega when it is run in Turbo mode (voltages, power limits and fan speeds may be changing as well), so the card can hit the highest benchmark numbers versus a GTX 1080. If this is the case, then the power draw on the RX Vega 64 and the FPS will be higher than on the Vega FE.
Perhaps they have binned out faster, more power hungry Vega GPU IC's for the gaming cards?
Perhaps they did not measure the power draw in Crossfire at all and did not realise it could get so high in some games?
Again I do not know for sure. All pure speculation from me. No other way to know. - That is a problem in itself.
There were new drivers released for the Vega FE just recently, the first ones I have seen since its launch. You can find them here: AMD Drivers. You want to look at the Beta version.
The release notes do not seem to mention gaming performance at all, so I do not know whether they have picked up the RX gaming driver settings when the card is in Gaming Mode. No mention of it. I do not know if Crossfire runs on that driver, although the release notes seem to imply it does.
I have not seen any more reviews about the Vega FE Gaming Performance or Gaming Benchmarks run on the Vega FE, and I doubt we will for some time. I am getting the impression that many reviewers are tired of Vega at the moment and are moving on to review other products.