Even if you could, I doubt the experience in CrossFire would be pleasant. GDDR5 and HBM are going to be very different animals.
My graph at the bottom of my page on game performance has the R9 400 series reference numbers. The R9 300 series is going to be used for OEM components.
The upcoming R9 490X will be very competitive with the GTX Titan X, especially when AMD launches at a more aggressive price point.
I must say I'm very disappointed with the 300 series.
I've been "defending" AMD/ATI since the HD4000 series, and I tried to inform people on other tech forums about the quality of AMD GPUs: that they are neck and neck with Nvidia's except on the TDP and heat side, but that they are also cheaper in general...
Now I just saw that AMD went into suicide mode and raised the TDP to a whopping 375 W!? After a delay of months, that's what you managed to do!? Now I can't even argue with the "green" side, who say that their GPUs are more expensive but that you'll make the difference back on the electricity bill in 6 months... Now it doesn't matter if the AMD graphics cards are cheaper, because you will close that gap in the electric bill in 2-3 months, and they will end up much more expensive than Nvidia in less than 6 months.
Every "green" side fanboy is now mocking AMD owners on the forums, and the AMD fans are really disappointed. (At least where I come from.)
I was really hoping that AMD would kick those "green" snobs this time, but unfortunately AMD just committed suicide with this one. And after almost 8 years on the AMD side, I must quit and change sides.
Something tells me that you won't last long after this one.
I really liked AMD's consumer, gamer and open source policies... I think that we all lose when you fall.
Where did you get 375 W? Per the leaks, the 300 series consumes about 20% less power than the 200 series, and we have zero gaming benchmarks. I don't get it: all these people show 3DMark scores, but none are willing to show gaming scores. If the leaks are right and there are zero performance improvements, that would mean the 300 series is on par with or more efficient than Maxwell... and that's before HBM.
So you are complaining about an unreleased ultra-high-end card using 375 W? And you claim that Nvidia cards would "pay themselves back in electric bills in 2-3 months"?
I am not really sure where you are getting ANY of your information. From what can be seen online, even the R9 Fury X (the highest-end R9 300 series card) will only use 300 W.
Also, I guess you must be looking at the power consumption of the really low-end Nvidia cards.
Just from looking at online information, you can see tests (by an AnandTech reviewer) showing the GTX 780 & 980 Ti using well over 375 W, and the 690 & Titan X going over the 400 W mark in many tests. Even the GTX 680 & 580 are shown to use over 300 W under maximum loads.
If you have proof which says differently from the above information I would be most interested.
All I know for sure is that the speculation will be silenced when AMD brings its latest GPU to market. I am aware that the new GPU is very consistent with my current values for stock performance; obviously OEM overclocking will offer slight differences.
According to the E3 calendar, AMD is up tomorrow, so I am hoping for a new GPU at that time, which should drive the price of older parts down to where I like it.
I bought an 850W PSU after noticing that extreme video cards required an extreme PSU to operate them.
My HD 6970 is old, but make no mistake, the flagship was so extreme it beat consoles. It's a power pig, though: cranked all the way, it guzzled over 300 W.
There is no speculation when a vendor OFFICIALLY lists the release and sale of a graphics card and states that the power consumption is 375 W:
Just click on the R9 390X specifications and see for yourself.
That site is nowhere near trustworthy... Most of that site's "news" is clickbait. Just check some of its older stories and you'll find that 80-90% are wrong and false.
This one... Well... I strongly recommend that you read things properly.
It clearly says: "TOTAL SYSTEM power consumption"
So, just to go down your thought path: how much would you say the power consumption difference is between this page's "highest-end" graphics card and an Nvidia card of the same range? Would you say 75 W? 100 W? OK, let's call it 100 W, just to be on the extreme side of things.
Now, I don't know how much you pay per kWh in your state, but I do know that the US average cost is 12.35 cents per kWh (the lowest being North Dakota at 8.78 cents, and the highest being Hawaii at 31.2 cents). We will use the average cost for my calculations here.
We will also run our "test" system at 100% load 24 hours a day, 365 days a year.
Now the calculation is: Wattage (100W) x Hours Used (24 Hours) / 1000 = kWh
Therefore 100 W x 24 hours = 2400 Wh, and 2400 / 1000 = 2.4 kWh per day (or 29.64 cents per day at 12.35 cents/kWh). We will round that up to 30 cents per day, just for fun.
Multiply 30 cents ($0.30) per day by 365 days a year: 365 x $0.30 = $109.50.
So even if you were to run your "super high end card", 24 hours a day, 365 days a year, with the average cost of electricity you would still not "Close The Gap" (as you put it) with the increased cost of a high end Nvidia card.
But in reality, no one (except maybe coin miners) would ever use a graphics card to this degree. And unless your "job" is playing video games (don't we all wish we could), most people use the graphics card for high-end gaming at most, say, 6 hours a day (even this is a lot for the average person).
So 6 hours of high end gaming, every single day would cost (at present prices) on average $27.38 PER YEAR.
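For anyone who wants to plug in their own numbers, the arithmetic above can be sketched in a few lines of Python. The 100 W gap and 12.35 cents/kWh rate are the assumed figures from this post, not measured values; note the 24/7 result lands slightly under $109.50 because this version skips the 30-cents-per-day rounding:

```python
# Sketch of the electricity-cost arithmetic above, assuming a hypothetical
# 100 W power-consumption difference and the 12.35 cents/kWh US average.

def yearly_cost(extra_watts, hours_per_day, price_per_kwh):
    """Extra electricity cost per year for a given wattage difference."""
    kwh_per_day = extra_watts * hours_per_day / 1000  # W x h / 1000 = kWh
    return kwh_per_day * 365 * price_per_kwh

# 24 hours a day, 365 days a year (the extreme case): roughly $108/year.
print(yearly_cost(100, 24, 0.1235))

# 6 hours of gaming per day: roughly $27/year.
print(yearly_cost(100, 6, 0.1235))
```

Swapping in your own local kWh rate and daily hours shows how little the gap actually moves.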
So please explain your "you get your money back with an Nvidia card" claim.
Electricity Cost for USA States - http://www.eia.gov/electricity/monthly/epm_table_grapher.cfm?t=epmt_5_6_a
Simple Electricity Calculator - http://michaelbluejay.com/electricity/cost.html
First of all, I'm not from the USA, and neither is most of the world as far as I know... (insert astonished meme here)
I'm from the EU... and the target consumers for the 380 and 390 series are gamers, and I can assure you that they/we play at least the 6 hours you refer to (more on weekends).
Never mind the GPUs from the 380 down... Their wattages are lower, and therefore the difference is marginal.
Where I come from (Portugal), a GPU that uses 50 W more translates to about €3 per month, or €36 per year (realistically speaking, according to my own math and real monthly usage).
The thing is... I wasn't talking about the difference between an R9 290X vs a GTX 980, which is in fact about 50 W.
I was talking about the 290X's actual equivalent, the GTX 970, where the difference is about 80 W (80 W less than the TDP of a 290X): GeForce GTX 970 | Specifications | GeForce. Which makes it at least as good, and about €5 per month (€60 per year) cheaper to run.
Now imagine the difference between the 240 W of an Nvidia GTX 980 and the 375 W of a 390X. Assuming that the 390X is as good as a GTX 980, of course. (We'll see about that.)
Assuming you and most people change GPUs every 2 years (keep dreaming), the R9 390X MUST be under €450-500 to be competitive, and it MUST be AT LEAST equivalent to the GTX 980.
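The payback argument above can be made concrete with a small sketch. All the inputs here are illustration values I'm assuming, not figures confirmed in this thread: a hypothetical €100 price premium, the 135 W gap implied by the 240 W vs 375 W comparison, 6 hours per day of gaming, and an assumed €0.22/kWh EU-style rate:

```python
def payback_years(price_premium_eur, extra_watts, hours_per_day, eur_per_kwh):
    """Years until a pricier but more power-efficient card pays for itself
    through electricity savings alone."""
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    savings_per_year = kwh_per_year * eur_per_kwh
    return price_premium_eur / savings_per_year

# Hypothetical: card costs 100 EUR more but draws 135 W less,
# gamed 6 h/day at an assumed 0.22 EUR/kWh. Roughly 1.5 years to break even.
print(payback_years(100, 135, 6, 0.22))
```

Halve the price premium or double the hours, and the break-even point moves accordingly, which is why the kWh rate and daily usage matter so much to this argument.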
When it comes to video cards' power consumption, this amuses me. Think how much power a simple microwave or washing machine needs.
I guess we can just wait for a couple hours/days before we have real information about the real power consumption of these cards. We can discuss the whole thing then. But I am pretty sure that the difference will not be as much as you are saying.
Also, what are the calculations that you are basing your €3-per-month figure on? How much wattage, and for how many hours? Or even better, what do they charge you per kWh in your area? That would add far more "weight" to your statement than just saying "x € more per month". Sorry, but for me, if you don't have any real numbers going into these calculations, you are just stating an opinion.
According to the official information on the AMD page, it looks like the R9 390X will use 300 W, not the 375 W that page showed.
R9 390X specs shows it uses:
1x 6-Pin + 1x 8-pin
6-Pin supplies 75W + the 8-pin supplies 150W + PCI-e supplies 75W = 300W
Now the R9 Fury X featured on the same page would use 375W:
1x 8-pin + 1x 8-pin
2x 8-pin supplies 300W + PCI-e supplies 75W = 375W
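The connector arithmetic above follows directly from the standard PCIe power limits (slot 75 W, 6-pin 75 W, 8-pin 150 W). A tiny sketch of the same sums:

```python
# Maximum board power implied by a card's PCIe power connectors, using the
# standard PCIe limits quoted above: slot 75 W, 6-pin 75 W, 8-pin 150 W.
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(connectors):
    """Sum the PCIe slot budget plus the listed external power connectors."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in connectors)

print(max_board_power(["6-pin", "8-pin"]))  # R9 390X layout -> 300
print(max_board_power(["8-pin", "8-pin"]))  # R9 Fury X layout -> 375
```

These are spec maximums, of course; actual draw under load can sit well below (or, with overclocking, creep past) these budgets.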
But again, these cards are meant to far outperform any current card, which means you would not need the same "percentage" of power usage to run the same game settings. Basically, you would not need 100% card usage to get the same graphics performance as another card running at 100% power.
Well... I've been trying to reply for one hour at least.
But apparently i was "blocked due to potentially offensive language"... lol
I've read the post I wanted to put here a dozen times, and I can't find anything that could remotely be considered offensive.
Well... Kudos to AMD.
I was looking at the pictures on AMD's site earlier; the cards seem to be shorter, but at least they are using dual-fan coolers.
I expected OEM cards to have different part numbers, but it now seems that OEM and retail versions will use the same part numbers, which will complicate resale down the road.
I have updated the chart at the bottom of Hardcore Games™: Optimizing Game Performance.
I will change it once more card models are known
The Fury X is slated for 275 W. It'll come out in a few days; I don't think you'll have much to worry about.