A Vega 64 is about 40% faster when tested against an R9 295X2. That's pitting two GPUs made in 2014 against a 2017 Vega, and the memory bandwidth alone blows the doors off them.
That being said, this is really what you're going to want to consider: HDMI 2.0b will have nothing on 2.1, which will bring true 4K-8K gaming. With that on the table, the R9s max out around 2560x1600 over HDMI 2.0/2.0b, which carries a total of 18 Gbps, whereas 2.1 introduces 48 Gbps. Now, does the Vega 64 have 2.1?
No, it's not even commercially available yet outside a small niche of maybe 1-2% of products. HDMI 2.1 will be arriving in stores soon; they just need to stretch it out some, probably for the money.
I would wait on HDMI 2.1 before upgrading anything. You have 1080p + FreeSync without a hitch using the R9s, so why switch? If it's for the 4K, I explained that above. Three years ago people thought they were going to have 4K gaming; three years later they still don't. But if you're switching for another reason, I can understand.
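For a rough sanity check on those bandwidth numbers, here's a quick back-of-the-envelope calculation. It's only a sketch: it counts the raw uncompressed pixel data and ignores blanking intervals and link-encoding overhead, so real-world requirements sit somewhat higher than these figures.

```python
# Raw (uncompressed) video data rate: width x height x refresh x bits-per-pixel.
# Ignores blanking intervals and link encoding, so real links need extra headroom.

def raw_gbps(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

HDMI_20 = 18.0   # HDMI 2.0/2.0b total link bandwidth, Gbps
HDMI_21 = 48.0   # HDMI 2.1 total link bandwidth, Gbps

for name, w, h, hz in [("4K 60Hz", 3840, 2160, 60),
                       ("8K 60Hz", 7680, 4320, 60)]:
    rate = raw_gbps(w, h, hz)
    print(f"{name}: {rate:.1f} Gbps raw "
          f"(under 2.0 limit: {rate < HDMI_20}, under 2.1 limit: {rate < HDMI_21})")
```

So 4K60 is comfortably within HDMI 2.0's 18 Gbps on raw pixels alone (real signaling overhead is what makes it tight in practice), while 8K60 at ~48 Gbps raw is completely out of reach for 2.0 and only fits 2.1 with compression or chroma subsampling.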
I have the dual-GPU card. I don't know how much of a difference that actually makes; I refuse to run CrossFire or multi-monitor anymore because of the problems they can cause while gaming, so I buy dual-GPU cards and ultrawide monitors.
The 40% more power does make sense; I figured that was probably the case, especially with the HBM. But I was looking up some benchmarks, and in some my card was keeping up with Vega while in others it was being blown away, so I was looking for confirmation.
I'm actually not that interested in 4K; I prefer ultrawide monitors. I'm only asking because I'm looking to upgrade my monitor sometime between now and 2018 (if they release what I'm looking for, anyway), and I'm at the point where the rest of my computer could use an upgrade to keep up with the newer stuff coming out at max settings. I currently run 1080p 21:9 and I want 1440p 21:9.
8GB G.Skill Sniper 1866, I think
SSD boot drive, Win 7
WD Black 5TB hard drive
If you're trying to future-proof your PC build, forget about HDMI. Wait and look into DisplayPort; HDMI will become the plug that's just there in case you don't have anything else.
DisplayPort has been one of the two main PC display connections for a long time, always trying to displace HDMI. DisplayPort 1.2 already carried 21.6 Gbps, and 1.3/1.4 pushes that to 32.4 Gbps; the details of what comes next are up in the air, but the word is HDMI on PCs will never keep pace with where DisplayPort is headed.
On paper DisplayPort is a technical masterpiece. Consumers usually know of HDMI, but few know of DisplayPort; it's the port everyone assumes is legacy, but in reality it is not. Maybe on the motherboard, but the newest cards sport full DisplayPort, and it's gonna cost you. The cabling alone might cost half as much as the monitor to run it. In electronics you get what you pay for; there was never a dollar to save, only transistors.
Vega 64 is about 75% faster than a 295X2 with CrossFire disabled, or about on par with CrossFire enabled. I would also strongly advise against buying a Vega card right now: they are still overpriced and scarce thanks to the cryptomining arsehats, running ~$150 higher than a GTX 1080 while being effectively equivalent (higher benchmark numbers, but few to no cases where Vega provides 60 FPS and the GTX 1080 does not). Honestly, you should stick it out if you can, because GPU prices from both companies are still at ripoff levels (AMD because packaging issues limit supply and arsehat miners grab them up, and nVidia because they have no reason to lower theirs).
Yeah, I knew about DP being superior; I'm running it now. HDMI seems to be closing the gap a tad with 2.0 and 2.1, but data transfer is still quite a bit higher with DP, so I wasn't planning on switching.
And Zion, you're saying my 295X2 is on par with Vega? Because if that's the case, I probably won't upgrade until the next GPU comes out, or maybe a dual-GPU Vega if they make one, which I assume they will.
Correct me if I'm wrong, but there is no "CrossFire" on my card since it's two GPUs smashed together on one card. If I'm wrong, I'm gonna have to see if CrossFire is enabled; I might be hindering my performance.
Yes you're wrong. Be it two 290Xs or one 295x2, it's the same thing. Also, there's never going to be a dual Vega64 GPU card for the consumer market, as a single one drinks over 350w of power, and because it'd cost on the order of $1300 which nobody would pay except an arsehat miner.
I also fail to understand why you want a dual-Vega setup when you don't even want to use CrossFire now.
Well, you're right, I don't wanna deal with CrossFire, and someone misinformed me on how dual GPUs work. That being said, I wouldn't be getting a dual-GPU Vega now that I know that. And I didn't realize the power requirements for Vega; that's crazy.
My 295X2 was $2k when it came out, by the way. Not that I paid that much; I'm just saying some people are willing who aren't arsehat miners.
Thanks for all the info, I know what I'm gonna be doing now.
Dual-GPU cards do not work like a RAID 0 array: the software sees two physical GPUs, and performance depends on how well (or poorly) the software is coded to use multiple GPUs. Also, starting with Vulkan and DirectX 12, multi-GPU performance is solely the responsibility of the developer.
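To illustrate that last point, here's a toy sketch (plain Python, no real graphics API; the numbers are made up for illustration) of why scaling is up to the developer: with two GPUs visible to the software, the frame rate you get depends entirely on how the code distributes the work, e.g. alternate-frame rendering versus naively submitting everything to one GPU.

```python
# Toy model: each GPU takes `frame_ms` to render one frame, so one GPU
# alone manages 1000/frame_ms FPS. Alternate-frame rendering (AFR) hands
# even frames to GPU 0 and odd frames to GPU 1, so ideal throughput
# doubles -- but only because the *software* chose to split the work.
# A naive render loop that submits every frame to GPU 0 gains nothing
# from the second GPU sitting idle on the same card.

def frames_per_second(gpus_actually_used, frame_ms):
    return int(gpus_actually_used * 1000 / frame_ms)

FRAME_MS = 20  # hypothetical per-frame render time: one GPU alone = 50 FPS

naive = frames_per_second(1, FRAME_MS)  # everything submitted to GPU 0
afr   = frames_per_second(2, FRAME_MS)  # developer splits frames across both

print(f"naive single-GPU path: {naive} FPS, AFR path: {afr} FPS")
```

In older APIs the driver (CrossFire/SLI profiles) tried to do that frame-splitting behind the game's back; under Vulkan and DirectX 12 the game itself has to enumerate the adapters and divide the work, which is why a dual-GPU card can run at exactly single-GPU speed in titles that never bother.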