5 more months of praying my Fury doesn't die...
The actual "roadmap" graphic doesn't show Navi at all.
The "official" roadmap shows Navi in 2019 ever since AMD changed it from 2018.
There are, and have been, more than a few reasonably priced, good graphics cards out there since Bitcoin crashed. Why wait for Navi?
Because there aren't really any reasonably priced graphics cards if you want to play games above 1920x1080 (and some games at full detail even at that resolution)? Vega 56 is $400, Vega 64 is over $500, RTX 2070 is over $500, RTX 2080 is over $700. Honestly, until Navi comes out, the only real solution may be the "leak": the nVidia GTX 1660 Ti, supposedly 5% slower than the Radeon VII but $280, which is scary.
The Vega 56 Nano is $350, not a bad deal really.
""Leak" - nVidia GTX 1660Ti - 5% slower than Radeon VII, but $280, which is scary"
You really believe that is going to be a solution? That is a massively disingenuous comparison if you ask me. A better title would be: GTX 1660 Ti, GTX 1070 performance for $280.
It is worth pointing out that the GTX 1660 Ti is slower than the GTX 980 Ti, a GPU that typically performs very closely to the R9 Fury X. So in most titles, this GPU will likely land between the Fury X and Vega 56, depending on the title.
The solution is Navi coming to market and providing competition so prices drop to reasonable levels again, since nVidia is gouging, which is typical given the lack of competition. As for how the 1660 Ti performs, all will be revealed, supposedly, by the end of the week.
Also, you can't rely on the Fury series' old benchmark numbers anymore, since recent benchmarks show the mid-range RX 580, despite being a slower card on paper, outperforming them. Take the oft-quoted Forza 4 benchmark: the RX 580 is faster than the GTX 1060, as expected, but it is also faster than the supposedly superior Fury X, a card that is, by the numbers, 39% faster, yet here ends up 22% slower.
Navi could also come to market slower, more expensive, and more power-hungry than its Nvidia counterpart. Like Vega. What then? Wait another two years for the next AMD release?
Vega was Raja's screwup, one AMD couldn't fix because 75% or more of RTG's resources are dedicated to the custom market, of which Navi is a major part, so they're taking their time with it. Navi is going to power the next-generation Xbox and PlayStation, both of which are targeting the 4K60 market, so it is going to have to be powerful and efficient.

We know from previous statements from AMD/rumors/blather that the first Navi cards will be the midrange cards, as they will be derivatives of what's in the consoles, and "leapfrogging design teams" means this is the midrange year, targeting GTX 1080 levels of performance for around $250. That's not exactly slow, and it will likely be a case of Navi vs. the GTX 1660 Ti in a fairly even matchup. Navi will be GDDR6-based, so it's not going to be handicapped by insane HBM prices. I saw a report today where RamXchange expects RAM prices to drop by nearly 50% by the end of the year, a good omen. Should TSMC's chemical screwup not slaughter production too badly, even I, the most pessimistic person on the planet, foresee $200 Navi vs. 1660 Ti fights by Christmas.

Power draw is a question mark. Reports say that nVidia's 10nm and AMD's 7nm processes are quite alike, but considering all the time spent designing and refining Navi (AMD has had cards in its labs since the second half of last year) and the debacle of Vega, it would be hard to imagine these cards pulling over 150W: not much more than nVidia's cards, and not in the realm of insanity that Vega is.
So while nVidia may be uncontested in the ultra-high end until next year (with cards people aren't exactly flocking to buy), AMD should have far more ammo to throw against them than they do now. Also remember AMD is banking -a lot- on this release: with Zen 2 matching or even, if AMD's benchmarks are to be believed, exceeding Intel, and with PCIe 4.0-capable X570 motherboards, they're going to be aiming for Navi to make huge waves as well, especially if those cards do not feature ray tracing.
I don't feel that Vega was a "screwup", really. AMD needed a GPU to crack into the machine learning market and academic data-analysis workstations, beyond standard pro rendering. Vega does all of these things, and it triples as a gaming GPU. The fact that AMD can compete in three market segments with a single GPU, segments where NVidia has produced dedicated GPUs for each, is nothing short of miraculous. The downside is that Vega isn't as great at gaming as it could have been. If there was any misfire, it's that some of the touted features haven't been implemented to this day. Like primitive shaders: whatever happened with that?
It's still debated whether Vega's failure was due to Raja's choices or Lisa Su's RTG allocation decisions, and I don't think there's ever going to be a definitive answer (I blame Raja), but there are multiple problems with Vega: an inefficient, incomplete design; reliance on HBM2, which is necessary in high-end professional applications but not in any way necessary for games; and, most of all, the lack of the deep learning AI features nVidia included, which aren't in the silicon and can't be added. Vega is like a worse version of the Titan. Yes, you can game on it, but it's not really built for gaming, like a CPU flogging away at x86 PhysX instructions: it works, but not well. Plus, the reliance on HBM2 meant it was never going to be competitively priced in the consumer market. If not for the cryptocurrency boom, Vega GPUs would still be covered in about a foot of dust in the corners of every warehouse.
There were problems, sure. Some of Vega's features, like NGG fastpath, DSBR, and primitive shaders, were highly touted as features that would improve its gaming performance and efficiency. To this day, I don't think any of them have been fully implemented. Some of the transistor budget was spent adding these features to the silicon, but something clearly went wrong with the implementation. Not sure whose fault that is, but every Vega essentially had wasted transistors that were used for nothing.
I wonder if Navi has implemented NGG, DSBR, or primitive shaders? Or if AMD abandoned those, and the features didn't even make it into the silicon.
AMD never received support from the APIs (DirectX 12 and Vulkan) and game developers for primitive shaders, so there was no one-click option to turn that feature on in the drivers. You can blame Raja for the rest. Looks like Navi will support them though at least in Vulkan (AMDVLK), no word I've found that it will work in DirectX 12 yet.
Interesting. The worrisome thing is that NGG and DSBR are labeled as features for "next-gen". Is that next-gen as in the next generation of GCN, or next-gen as in the next-generation architecture? I wonder if there is a fundamental problem with GCN that makes implementation difficult. It is extremely strange that they were marketed heavily for Vega and were never implemented. Those features are a big part of Maxwell's performance and efficiency gains over Kepler.