You remember how California banned plasma screen TVs because of the power draw? I have a feeling Vega64 may be next...
Imagine running two of these Vega 64 cards in Crossfire! If they would even fit on your motherboard ... It would be good in winter (somewhere near the poles) to sit in a room with >1 kW power consumption, I guess. That's the power consumption of a single bar on an electric heater! I am not even sure my AX1200i Power Supply could take it.
The power draw and the 2.5 slots required are too much for me. 2.5 slots means you lose yet another PCIe expansion slot on your motherboard.
I much prefer a single Vega Liquid Edition card. At least it is 2 slots wide, and it should perform just as well.
If it doesn't, you could always increase the size of the radiator on the Liquid Edition and add a few more fans to it. Maybe AIBs for the Vega 64 will concentrate on "Extreme Liquid Coolers" like this.
I really hope the AIB card manufacturers will instead concentrate on selecting especially fast HBM2 chips, overclocking and undervolting Vega 56, and modifying its BIOS to increase the power limit.
A watercooled Vega 56 with faster HBM2 and an increased BIOS power limit would probably be a good choice.
Gamers Nexus/Buildzoid are investigating that.
My theory is that 6 months ago AMD discovered that Vega 64 was only going to be a bit better than the GTX 1070. There was no way it was going to be as good as the GTX 1080, never mind the Titan X. So they overclocked the cr*p out of it, resulting in high power draw and temperatures.
I can't imagine the AMD Partners are too happy with them. There's pretty much nothing left on the bone for ASUS or Sapphire to work with. It's the kind of card you want to come with a 3 year warranty. I can still picture my 2600XT going up in smoke.
Draw Stream Binning Rasterizer was supposed to help both Performance and Power
There was lots of talk about it. Where is it?
Where are the examples of it running in a game and showing any performance improvement at all?
Is it even in the drivers used to run the benchmarks for reviews?
No one seems to really know.
PCPer knew it was not enabled in Vega FE, and indicated they were told it might give a 10% performance improvement in games running at 4K resolution.
The fact that there seems to have been an "Emergency Last Minute Switch" in the way the Vega 56 cards were released to Third Party Reviewers does not bode well for my interest in the Vega 64 Liquid Edition cards. At the last minute the emphasis was pushed over to the Vega 56 reviews.
Third Party Reviewers are really being pushed to the limit it seems to me.
Wattman was plain broken in the Vega 56 Review Testing for example.
I just watched a YouTube Video from some card reviewers and I got the impression AMD might just get a "Radeon Rebellion" for real from them.
I am an "Overclocker for fun".
Now I don't mess with LN2 Cooling ... but I do run lots of GPU benchmarks, try out more practical "special" cooling options if they are around , and I am keen on Crossfire / MultiGPU. I was really looking forward to trying out Vega 64 Liquid Cooled. Been "Waiting for Vega".
The more I am seeing on the Performance versus Power Draw on Vega 64 Liquid Cooled though, my heart just sinks.
Either Vega 64 is broken, or AMD actually thinks power draw like this is acceptable.
The possibility of having to upgrade your Power Supply to power the RX Vega 64 Liquid seems to have been dropped from the whole FreeSync versus G-Sync argument.
AdoredTV stated in their review that they would run a Vega 64 Liquid Edition card in "Power Saving Mode", because the performance increase from moving to Balanced Mode, and especially the 200W power increase from Balanced to Turbo mode for a few FPS more, is just not worth it.
Just think how bad that is. You spend lots of money on a High End Liquid Cooled RX Vega 64 card and you find out that there is really no point in running it in Turbo Mode.
AdoredTV was testing Prey running on 1 card.
One RX Vega 64 Liquid card in Turbo mode was pulling 569W running at 114FPS versus a GTX1080Ti running at 385W hitting 136 FPS.
The GTX1080Ti was shown to be 30% higher performance at the same power draw in Prey. That is simply way too far behind.
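Running those quoted numbers through a quick Python sketch of my own (using the system-level readings above, not my own measurements) shows the raw efficiency gap:

```python
# Back-of-the-envelope efficiency check on the reported Prey numbers.
# These are the system-level readings quoted above, not my own measurements.
vega64_lc = {"fps": 114, "watts": 569}   # RX Vega 64 Liquid, Turbo mode
gtx1080ti = {"fps": 136, "watts": 385}   # GTX 1080 Ti

def fps_per_watt(card):
    return card["fps"] / card["watts"]

# Raw efficiency gap in FPS per watt. It is even larger than the
# "~30% faster at the same power" figure, since Turbo mode sits so far
# up the voltage/frequency curve:
advantage = fps_per_watt(gtx1080ti) / fps_per_watt(vega64_lc) - 1
print(f"GTX 1080 Ti perf/W advantage: {advantage:.0%}")  # 76%
```

So on pure FPS per watt the gap in that one test is even worse than the headline "30% at the same power" figure.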
Prey is the same Title that AMD were showing a pair of RX Vega 64's running in Crossfire at Computex!
So if those cards were running in Turbo Mode, they could have been pushing on for ~1.14 kW (2 × 569 W) of power for the demo alone.
It seems like that is a good candidate for an Internet Meme for Nvidia once they catch on.
As for the Vega 56 it looks good against the GTX 1070 for the launch price, but not so good against the GTX1080 if the Vega 56 price goes up.
I keep asking for more information about the DSBR situation, drivers, and any known issues that may affect these results.
What's happening with the Vega FE Gaming Drivers, for example?
Unless things change soon, I think many high end gamers will just go out and buy a GTX1080Ti which costs the same as the Vega 64 Liquid Edition anyhow.
I still hang on for more information about the gaming performance and improvements because I am interested in the Compute Performance as well, many others do not care about that though.
This is not what I want to see, and I still hope that AMD will officially respond to the initial Third Party benchmarks and the reviews. The more I think about it the more I feel they really need to respond.
HBM2 let them down for sure. There was news about that back at the start of February, 6 months ago I think.
I think this is one reason why Vega is late.
Look at the memory bandwidth of Vega 64 versus the R9 Fury X: it is actually lower.
I think that might have cost maybe 3-6% in performance.
So what do you have left to push up performance? Push the GPU core frequency up, or turn on new architecture features.
Well, we know about the GPU clock increase ... information on what the new features are doing for gaming right now seems to be MIA.
I was planning on buying Vega even if it was only equal in performance. When early benchmarks finally leaked and I saw the performance, I gave up and decided to go "all in" on Nvidia and buy a G-Sync monitor, which I really like btw. The Vega power draw is a bit absurd. We need AMD to keep Nvidia in check with competitive products, and it's not looking good.
I mean, look at what AMD did to Intel. Intel is releasing the i7 8700K on Monday, which is a 6/12 CPU, plus an i5 6/6 version. This would not be happening without Ryzen. I will be building a new computer based on a 1700/1700X or 8700K. I am just waiting on the benchmarks.
Nothing wrong with going Nvidia at all. Especially given the Benchmark Data and evidence on RX Vega 64 right now.
If you are not that interested in GPU Compute and you are mostly interested in gaming, then I think a GTX1080Ti or GTX1080 is the best power versus performance choice. The extra power pulled by the RX Vega 64 (an additional cost of ownership) versus the GTX1080 / GTX1080Ti also weakens the FreeSync versus G-Sync argument, unless something changes with drivers or a new Vega feature is enabled to change that.
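To put a rough number on that extra cost of ownership, here is a small sketch; every input below (extra wattage, gaming hours, electricity rate) is my own hypothetical assumption for illustration, not a measured figure:

```python
# Rough extra electricity cost of a higher-draw card under gaming load.
# All inputs are hypothetical assumptions for illustration only.
def extra_cost_usd(extra_watts, hours_per_day, usd_per_kwh, years):
    kwh = extra_watts / 1000 * hours_per_day * 365 * years  # total extra kWh
    return kwh * usd_per_kwh

# e.g. ~180 W extra draw, 3 hours of gaming a day, $0.15/kWh, over 3 years:
print(f"${extra_cost_usd(180, 3, 0.15, 3):.0f}")  # $89
```

Not huge per month, but it is real money that narrows the FreeSync monitor price advantage over the card's lifetime.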
You can't really say anything about the graphics market right now: cryptocurrency miners are grabbing up cards no matter the cost as soon as they are available, AMD and Nvidia are catering to this market with mining drivers, board partners are increasing prices to capitalize on the boom, retailers are increasing prices because they can, and every world regulatory body is refusing to bring down the hammer on anyone in the chain. Really your only choices for graphics cards today are the lower end, like the RX 560, which is still around $160, the slightly higher GTX 1060 for $250, the high end, like the $500 GTX 1080 or $700 RX Vega 64 (which is a no-brainer not to choose vs the $500 GTX 1080), and the ultra high end, the $700+ GTX 1080Ti. AMD and etailers are exploiting this fact with "Radeon Packs", which are essentially a legal way of overcharging. It's insane, and it's only going to get worse.
The way things are going, if my Fury Nano dies, I'm going to pop in my old HD 5450 and deal with it.
I am surprised that anyone could make their money back with those prices. But I am largely ignorant when it comes to mining. Never really looked into it at all, as it never interested me. Although, I do know the effect it has had on prices.
I found this to be a great introduction to the World of a Miner.
Radeon RX Vega 64 Unboxing and Crypto Mining Ethereum (ETH) at ??? Mhs - YouTube
I think it's great because:
(A). It is a cat video.
(B). It shows me that Miners are humans too.
(C). It tells me where all the RX480/580 8GB cards have gone.
(D). It tells me why AMD must love miners: 6 RX480s per motherboard ... He has more AMD cards than I have ever owned.
Here is the 6RX480 8GB Mining in more detail: 6 RX 480 8GB GDDR5 Overclock Graphics Cards All in One Computer! Cryptocurrency Mining Rig Part 3 - YouTube
TomsHardware did an article on whether an individual could use old parts and make a profitable mining rig at 97 MH/s, depending on where you live, using a Fury, 390X, 380X, and 380. They were able to do it, ending up with a machine that stays profitable even if Eth were to drop below $125, if you live in Canada or Texas where power is dirt cheap. It's not much if Eth gets cheap, but a profit is a profit for something you set and forget: http://www.tomshardware.com/reviews/spare-part-ethereum-mining-rig,5143.html
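The break-even sums that article does can be sketched like this; every number here (payout per MH/s, rig wattage, power rate) is a hypothetical placeholder of mine, not a figure quoted from the article:

```python
# Sketch of a set-and-forget mining profitability check, in the spirit of
# the TomsHardware piece. All numbers below are hypothetical placeholders,
# not figures from the article.
def daily_profit_usd(hashrate_mhs, usd_per_mhs_day, rig_watts, usd_per_kwh):
    revenue = hashrate_mhs * usd_per_mhs_day            # payout scales with hashrate
    power_cost = rig_watts / 1000 * 24 * usd_per_kwh    # kWh per day * electricity rate
    return revenue - power_cost

# Example: a 97 MH/s rig drawing an assumed 800 W at Texas-style $0.08/kWh,
# with an assumed payout of $0.06 per MH/s per day:
print(round(daily_profit_usd(97, 0.06, 800, 0.08), 2))  # 4.28
```

The point of the model is the structure, not the exact figures: revenue scales with hashrate while the power bill scales with wattage and your local rate, which is why cheap electricity decides everything.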
Looks like the new AMD Blockchain Driver has pushed the HashRate up to 41 MH/s ...
41 MH/s Hashrate: RX Vega 64 Mining Review Update - YouTube
RE: Radeon Packs.
I think if you are doing a new build, the idea of the Bundles / Packs is fine. I am sure many people have been waiting for Vega to be released and may have decided to do a new AMD only build at the same time.
There must be many people who have taken the decision to move from Intel to Ryzen or Threadripper for their next build. If the combinations of new CPU / motherboard and Vega GPU actually save you some money over buying them separately, then that's great.
Regarding the new bundles with the new Samsung monitor I have seen people complaining about issues with flickering on that monitor when it is set in Ultimate mode. Gamers Nexus just tested it and it looks like the monitor works fine with Vega 56, at least with the testing they did. That is good news I think. The monitor looks nice to me. Here is the Video it is worth a watch: Samsung CF791 Flickering Criticism Tested (Vega, Polaris) - YouTube
Same thing was found on the Fury series as well by TomsHardware. Power consumption and heat would plummet while performance remained the same or improved.
Undervolt AMD's Radeon R9 Fury With MSI Afterburner - Tom's Hardware
Yes, I undervolt my R9 Nano (Fury X) a little and it helps a bit.
Still, it is encouraging to see there is some power saving and increased performance, even in a situation where Radeon Wattman and the other overclocking tools do not yet seem to work correctly with Vega. Some applications may not be able to be undervolted at all, some will, and the power saving and performance increase will vary...
I have not seen anyone trying to undervolt the GPU while pushing the GPU and HBM clocks up as high as possible, but maybe that is not possible? Also maybe ... just maybe ... DSBR is still not working properly, although that might give a 10% performance increase at 4K? I have not seen clear information on how much power it should save. It is supposed to be on in RX Vega though.
The RX Vega Wattman seems to be in a bit of a mess right now. Every review seems to complain about it. Hopefully it will get fixed soon.
All of the above, plus replacing the radiator on the Liquid Edition with a larger triple-fan version, and maybe this RX Vega 64 might nearly make it to a GTX1080Ti running at stock speed for ~ the same power draw. It might get from 30% behind to maybe 15%, being optimistic? Keeping the GPU core temps below 60°C seems to be very important on Vega according to the Buildzoid video. No one knows why just yet.
Anyhow, I am hoping that AdoredTV may look at running Prey etc. with an HBM2 memory overclock and/or a GPU undervolt/overclock attempt soon, or Gamers Nexus/Buildzoid/PC Perspective may push on with their work. I would really like to see some of the Third Party reviewers investigate DSBR if they can. I guess I should become a Patreon supporter of theirs.
I think that is pretty much all I will be looking at on Vega for a while now. My conclusion on the situation is that it seems to have been launched before the drivers were ready, the VBIOS potentially needs an update, right now no-one really knows exactly where it sits versus the GTX1070/1080/1080Ti, and the price is not clear at all. Back to wait and see mode for me.
Thanks for all of your help and info and comments. Bye.
OK, so I have been hoping to see improved Performance/Power on the Vega 64 Liquid Edition ... since I am thinking of purchasing. I think overclocking the HBM2 will be of the most benefit; combined with undervolting, it could put the Vega 64/56 in a better competitive position w.r.t. Performance/Power.
Here is a summary of the data I have seen so far. I took my own notes and I thought I may as well share ...
(1). Gamers Nexus on Vega FE.
Undervolting Vega FE: Fixing Performance & Power - YouTube
Undervolting on Vega FE with a fixed 1600MHz clock and the Power Target set to +50%.
- Undervolting needs to be set on a per-application basis, so setting the undervolt in the Wattman profile settings would be best.
- Increase Power Limit by +50%.
- 110mV undervolt.
- Temp drops from 73°C to 63°C - that's a 10°C drop.
- Clocks are more stable @ 1600 MHz, which might improve performance.
- It might then be possible to push clocks even higher.
- 86.353 W power saving at the same clock speed, and therefore the same application performance.
- The power saving is worked out as follows:
12.3 V * 30 A = 369 W
12.289 V * 23 A = 282.647 W
Delta = 86.353 W less at the same clock speed (and performance).
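That delta is just P = V × I at the 12 V input, before versus after the undervolt; spelled out:

```python
# The Gamers Nexus power-saving figure is just P = V * I at the 12 V input,
# measured before vs after the undervolt (volts * amps = watts):
before_w = 12.300 * 30   # 369.0 W at stock voltage
after_w  = 12.289 * 23   # ~282.6 W with the -110 mV undervolt
saving_w = before_w - after_w
print(f"{saving_w:.3f} W saved at the same clocks")  # 86.353 W
```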
(2). Work done by Buildzoid on his Vega 56 sample. =====================================
Buildzoid Videos are here: Buildzoid rambles about the RX VEGA cards - YouTube
Ramblings about VEGA 2: BIOS modding, power play tables, 2GHz core clock and performance scaling - YouTube
BuildZoid : Testing with TimeSpy DX12 but on the Vega 56.
Buildzoid looks at increasing the Power Limit versus what is set in the Vega 56 BIOS. The BIOS is locked down on the Vega cards: you cannot change the power settings by flashing a modified BIOS, because there is a security lockdown which prevents POST.
There is a workaround: setting the Power Limit in the Windows Registry (PowerPlay) tables. This workaround was used to change the Power Limit on the Vega 56. The card was tested running TimeSpy on DX12.
For GPU Core and Overclocking:
Running the Vega 56 at GPU Clock = 1680MHz and HBM2 Clock = 1050MHz (already 13-14% higher than what a Vega 56 normally comes with; the Vega 56 has a 165W BIOS power limit, versus a 220W power limit for Vega 64 and FE).
For the Vega 56, a 255 W power limit (on a VRM rated for 300 A) was taken as the base in the following table, and the Power Limit was then increased from 0% to +100% to test performance versus power limit scaling.
TimeSpy DX12 Graphics Test score alone.
- 1680/1050 MHz at 1.225 Volts.
Power Limit set in Registry tables | TimeSpy Graphics Score:
255 W (+0%)   = 7000
288 W (+13%)  = 7241
320 W (+25%)  = 7481
352 W (+38%)  = 7610 ---> This is the sweet spot. The score stops increasing above this.
382 W (+50%)  = 7630 ---> Within margin of benchmark variation.
446 W (+75%)  = 7609 ---> Within margin of benchmark variation.
510 W (+100%) = 7617 ---> Within margin of benchmark variation.
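The diminishing returns are easier to see as score gained per extra watt over the 255 W baseline; the data is Buildzoid's table above, the calculation is my own:

```python
# Score gained per extra watt of power limit, relative to the 255 W / 7000
# baseline, using Buildzoid's Vega 56 TimeSpy results quoted above.
results = [(255, 7000), (288, 7241), (320, 7481), (352, 7610),
           (382, 7630), (446, 7609), (510, 7617)]

base_w, base_score = results[0]
for watts, score in results[1:]:
    pts_per_watt = (score - base_score) / (watts - base_w)
    print(f"{watts} W: {pts_per_watt:+.1f} points/W")
```

The points-per-watt figure falls from roughly 7.3 at 288 W to about 2.4 at 510 W, which is exactly why 352 W is the practical sweet spot.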
HBM2 Vreg. =========
Vega 64 and Vega FE.
The HBM2 Vreg pulls 20W max. It is a 1.35 volt VRM, and that power draw is nothing compared to the GPU core.
The recommendation is to maximize the HBM2 frequency: run the HBM2 as fast as possible, because it has the most impact on improving the performance of the Vega GPU.
800MHz - 1100 MHz = for every 3% increase in HBM2 clock you get about 1% Performance Increase.
You can definitely get a 10-15% HBM2 memory clock increase on the 1.35 V cards = 3-5% performance increase for free! Very little power increase.
HBM2 VRM only pulls about 19-20 Watts Peak.
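That rule of thumb is easy to turn into a quick estimator; note that the 945 MHz stock HBM2 clock in the example is my own assumption for illustration:

```python
# Estimator for the observed ~1% performance per 3% HBM2 clock rule of thumb.
# The 945 MHz "stock" clock in the example is an assumption for illustration.
def est_perf_gain(stock_mhz, oc_mhz, perf_per_clock=1 / 3):
    clock_gain = oc_mhz / stock_mhz - 1      # fractional clock increase
    return clock_gain * perf_per_clock       # estimated fractional perf increase

# A hypothetical 945 -> 1100 MHz HBM2 overclock (~16% clock increase):
print(f"{est_perf_gain(945, 1100):.1%} estimated performance gain")  # 5.5%
```

So a successful ~16% HBM2 overclock lands right in the quoted 3-5% "free performance" range.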
For HBM2 Memory Overclocking on the Vega 56:
===================================
The Vega 56 ships at a lower HBM2 voltage (1.30 V) to save a little power, and hence a lower HBM2 frequency, so you will not get >1GHz HBM2 on the Vega 56.
Core Clock Overclocking Margin.
Increasing the GPU Core Clock gives less performance improvement per increase in clock frequency than increasing the HBM2 clock.
He tried the Vega 56 and had masses of problems getting a stable overclock due to software issues, but here are the results:
Above 1GHz the performance gain starts to drop off.
GPU Clock range 1600-1650 MHz:
0.6% performance improvement per 1% core clock increase.
GPU Clock range 1600-1650 MHz:
0.68% performance improvement per 1% core clock increase.
LN2 would be needed for a higher HBM clock, but he tried pushing the core clock anyway:
1600 -> 1800 MHz = 8.6% performance increase for a 12.5% clock increase.
0.69% per 1%.
The core clock performance improvement definitely benefits from faster HBM2 Clock.
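As a sanity check on those core-clock figures:

```python
# Sanity check on the quoted 1600 -> 1800 MHz core-clock scaling figures:
clock_gain = 1800 / 1600 - 1   # 0.125 = 12.5% clock increase
perf_gain = 0.086              # 8.6% performance increase, as quoted above
print(f"{perf_gain / clock_gain:.2f}% performance per 1% clock")  # 0.69%
```

The ratio matches the 0.69%-per-1% figure in the notes, comfortably below the ~1%-per-1% you would want from a clock bump that costs this much power.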
I just found this article ... it shows some interesting and encouraging results with undervolting the Vega 64 and 56. It is written in German so if you cannot read German I suggest you open it in Google Chrome and use Google Translate.
They set both the Vega 64 and 56 to 1000MHz HBM in this article.
Looks like ASUS was able to do a bit more tweaking; power draw is (slightly) lower than the reference edition, though it's nowhere near anything which could be called "efficient": ASUS Radeon ROG RX Vega 64 STRIX 8GB review - Introduction
I do not see how the AIB cards can do much better in terms of power/performance by just using a bigger/better air cooler. Maybe some minor VBIOS undervolting might improve the power draw a little, but they would have to be careful with that. The reference card for all of the Vega GPUs (I have not seen the Nano yet) uses a very good voltage regulator with very high quality components, and it is already highly efficient based on the reviews/teardowns I have seen. I think improvements to the voltage regulator will be minimal on AIB boards. The only thing the AIB partners could really do on the hardware side is get faster HBM2 and maybe specially binned High Performance / Low Power Vega 64/56 chips.
I am still not convinced that DSBR / other new features are working properly on the Vega Drivers yet. I think it might be the case that AMD are concentrating on fixing and improving performance on drivers for Professional/Workstation/Instinct cards first, Miners Second, and Gamers last. That is the likely $$$ priority.
If what I read about Vega 64 before its launch is true, then it does have groundbreaking new architectural features, lots of Compute Power, and features that should have improved Performance/Power over what we see today.
Already I am starting to see some very impressive Blender GPU-based rendering performance numbers roll in when running the Blender 2.79 Release Candidate 2, versus the R9 Nano (Fury X) and Nvidia cards on the same release, for example. Hopefully we will see Blender rendering with the High Bandwidth Cache Controller engaged next.
I just wish AMD would hold a Video Conference or Presentation to explain where Vega 64 is right now w.r.t Power/Performance in gaming and give feedback on what is and is not working in drivers at the moment.
I'm not sure there will be a lot of new improvements, including those new features, coming to Vega. I mean, it's not like the cards were rushed out. Engineering samples were sent out months before the product release. And I'm not even sure how long it was in development for. Two years? AMD knew where Vega was going to stand in the graphics card line-up. You'd think if there was stuff they could have done to improve it performance- and power-wise, they would have. They had time.
Kind of like the Crossfire drivers people are waiting on. What's the point of having a driver if game devs aren't going to implement support when they make games? Most of the new titles that have it run better on a single card anyway.
I do not know what happened in the run-up to the launch, or why RX Vega / FE is where it is right now.
I am still interested in the Vega 64 Liquid because of OpenCL Performance and Rendering, not just gaming. I am not personally interested in Mining ... due to electricity costs here more than anything. However, for RX Vega / FE to be a success overall, it will have to do well in gaming as well I think. Why would a game developer buy and develop on a Vega FE if the RX Vega does not do well, for example?
Regarding Crossfire, it might be disabled at the moment on Vega because of the Performance/Power of a single card. I do not know. New AAA Crossfire (DX11) and DX12 MultiGPU titles have mostly been working well for me over the ~2 years since the RX480 launch.
There are a couple of games I am looking at at the moment with Crossfire issues. Mass Effect Andromeda has not run well for me since the initial Crossfire driver support. There have been ongoing issues since 17.2.2; 17.4.4 was almost there; now in 17.7.2 the Crossfire Profile is gone and the game runs badly for me when I try to use AFR Friendly mode. I am looking at 17.8.2 and the Crossfire Profile is still gone.
The latest patch of Mass Effect Andromeda runs quite well on both Windows 10 and 8.1 64-bit with the 17.4.4 driver, so at first sight it looks like changes in the 17.7.2 driver are the cause. The Witcher 3: Wild Hunt, when Chill (a brand new feature in 17.7.2) is turned on, shows minor flickering on water textures; otherwise I can game at 4K 60FPS with a pair of R9 Nanos with no significant problems seen yet.
Back to RX Vega/FE ... I think it would be good if AMD gave us all a status update on the drivers, including what is and is not working well right now. That way I would have more confidence to purchase one or two RX Vega 64 Liquid or Vega FE (Air) cards for my next build.