

SAPPHIRE - Navi XT $500, Pro $400, will have RTX 2070 & 2060 performance, NO ray tracing, NO high end cards until 2020

As we suspected with leapfrogging design teams, Navi is this year for the mid range, Vega's replacement is next year for the high end. Something I don't like seeing is the word "watercooled", so it's likely AMD's TDP will be, well, insane.

https://www.techpowerup.com/255768/sapphire-reps-leak-juicy-details-on-amd-radeon-navi

29 Replies

I guess it remains to be seen what kind of OC headroom the 2060 competitor has. I have a 2060 and they OC like crazy, plus I got mine on sale for just over $300.

It is going to be interesting to see what these Navi products actually do in the real world. I don't think the lack of ray tracing is going to be a big deal if you truly need a card right now, as many gamers won't care since few games use it. But if I were on the fence and could hold out on buying a card, the lack of this feature would make me wait a year if I didn't NEED a card today.

0 Likes

I think ray tracing will be the deal breaker if you have a modern card, but if you're coming from an R300 series or GeForce 900 series card, these will be more interesting. You will have to consider, though, how fast AMD dropped some cards from receiving meaningful updates, namely the R300 and Fury series.

It's kinda one of the things I am scared of: we saw how fast they dropped the previous architecture when GCN came on the scene. When the true new architecture hits next year, assuming it's good, I would not be shocked if the whole GCN line quickly got left behind. It is pretty obvious they are not capable of catering to a large card base, hence why so many GCN cards are having issues with the drivers. I can't imagine how much worse it could be if they had to support two architectures at once. Unless they really are not that dissimilar?

0 Likes

Navi is RDNA based, which will "coexist alongside GCN" according to AMD. But considering the much increased performance of Navi (the highest model announced is about 50% faster than a Vega 56), the age of GCN, and AMD's strong need to be competitive with nVidia in that segment, I doubt the two are that similar, or that GCN will keep receiving updates, at least as far as performance goes. Granted, it's a lot easier to swallow losing support for a midrange card after three years than it is to lose support for a high end card after fewer than two. It's also why I expect AMD to officially retire support for anything older than the RX 400 series shortly after Navi launches: the R300/Fury down to the HD 7000 series have not received any performance improvements in some time, just new drivers to keep Steam and some games happy and prevent the "You cannot play this game as you do not have the most recent drivers" garbage.

It's hard to say what RDNA actually is, or isn't, simply because GCN refers to both AMD's GPU instruction set and the series of microarchitectures used to implement that instruction set.

Typically, as new instructions get added to the GCN instruction set, new hardware is added to the GPU to implement them. So GCN 1.2 compatible GPUs have new hardware to execute the new 1.2 instructions, but they still have the older hardware to execute the older instructions as well.

So with RDNA, did AMD completely rework the microarchitecture to better execute the GCN instruction set? Maybe after all the iterations it was worthwhile to redo the hardware from the ground up. Or is it a brand new instruction set with a brand new microarchitecture to run it?

To me, the former seems more likely. Navi likely still uses the GCN instruction set, maybe an updated version vs Vega, but probably pretty similar. They may have done a complete overhaul of the microarchitecture, as elements originally designed for Hawaii, Fiji, etc. may not be that efficient alongside elements that were added later. Or RDNA could just be a GCN design that aims to execute the instruction set with gaming in mind, as opposed to the hardware being built around professional/machine learning applications and gaming shoehorned in after the fact.
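To make that instruction-set-versus-microarchitecture distinction concrete, here's a toy Python sketch. The opcodes are invented for illustration, not real GCN instructions: two "implementations" run the same base instruction set, and the newer one adds an instruction while still carrying the paths for the old ones.

```python
# Toy illustration of "same ISA, different implementations".
# Opcode names are made up; they are not actual GCN instructions.

class IsaV1:
    """A hypothetical v1.0 implementation of the instruction set."""
    def execute(self, op, *args):
        if op == "ADD":
            return args[0] + args[1]
        if op == "MUL":
            return args[0] * args[1]
        raise ValueError(f"unknown opcode: {op}")

class IsaV1_2(IsaV1):
    """Adds a hypothetical v1.2 instruction, keeps the older paths intact."""
    def execute(self, op, *args):
        if op == "FMA":                        # new in the 1.2 revision
            return args[0] * args[1] + args[2]
        return super().execute(op, *args)      # older instructions still work

old_chip, new_chip = IsaV1(), IsaV1_2()
print(old_chip.execute("ADD", 2, 3))       # 5 -- both chips run v1.0 code
print(new_chip.execute("ADD", 2, 3))       # 5
print(new_chip.execute("FMA", 2, 3, 1))    # 7 -- only the newer implementation
```

The open question with RDNA is whether it's another "subclass" of the same instruction set with reworked internals, or a genuinely new instruction set with new hardware underneath.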

Hopefully we'll know more after E3.

We'll have to wait until E3 for details about RDNA, but for months I'd been hearing (and saying) that they were looking to introduce a Hardware Thread Management Engine to GCN, i.e. SMT for graphics. That would make a HUGE difference in terms of general performance in games.

GCN has two key weaknesses with current game engines:

1 • Lack of geometry / primitive processing engines. GCN has exceptionally strong compute/shader performance but very weak geometry performance, at least compared to GeForce, which uses a disassociated geometry engine that NVIDIA scales based on WHERE in the product stack the hardware sits; on AMD, being part of the Compute Units means it simply scales linearly with the CU count.

2 • Lack of a thread balancing engine. This meant it was down to developers, or more often the drivers / graphics API, to handle barriers, threads, state changes, etc., which typically meant that even under "full load" only 50-60% of the actual hardware would be in use at any given time (a toy sketch of one such effect follows below).
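As a back-of-the-envelope illustration of that utilization point, here's a toy Python model I'm inventing for this post (not actual GCN scheduling): waves in a workgroup all have to reach a software-managed barrier before any of them can continue, so the faster waves sit idle waiting on the slowest one.

```python
import random

# Toy model: each workgroup has 4 waves doing 50-150 cycles of real work,
# then everyone waits at a barrier for the slowest wave. Utilization is
# useful cycles divided by (waves x time-to-barrier). Numbers are invented.
random.seed(0)
WAVES_PER_GROUP, GROUPS = 4, 10_000

busy, capacity = 0, 0
for _ in range(GROUPS):
    work = [random.randint(50, 150) for _ in range(WAVES_PER_GROUP)]
    barrier_time = max(work)                 # all waves wait for the slowest
    busy += sum(work)
    capacity += WAVES_PER_GROUP * barrier_time

print(f"average utilization across barriers: {busy / capacity:.0%}")  # ~75-80%
```

That's just one effect; stack a few of them (state changes, dependency stalls, poor thread balancing) and the 50-60% figure above becomes easy to believe.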

And in terms of the power difference between NVIDIA and AMD, keep in mind AMD has an "all or nothing" approach, whereas NVIDIA (like Intel) uses a "dynamic" approach to per-core clocks and idle.

The positive aspect of AMD taking this approach is that their CPUs / GPUs don't suffer from "micro-jitter": when you NEED that power, it's instantly available, whereas NVIDIA/Intel have what could be referred to as "turbo lag", where they have to spool up over a few frames before delivering the performance required.

...

Now in terms of compute workloads, neither of the above issues is... well, an issue.

Thus they make for exceptionally powerful compute / machine learning / etc. solutions; but for games, well, development studios are a lot slower to adapt, adopt and switch.

In fact, as a key point, games typically take about 3-5 years to adopt new technology, whereas custom solutions will adopt it as soon as it's stable for the purpose. This is why many server solutions currently use ROCm / Vulkan / etc., whereas we're still looking at fewer than a handful out of thousands of games using them.

If AMD can resolve (or has resolved) the above issues, then instead of only showcasing the 'exception' in terms of performance with things like Strange Brigade, their hardware will be capable of similar performance across the board, something that I'd argue will make them look FAR more competitive even if their actual performance peak doesn't improve over current offerings.

• 

As for RTX / real-time ray tracing... eh... on the one hand, sure, it's a gimmick; but on the other, it's also going to be added to the standard toolbox.

Both the PlayStation 5 and the Xbox Two are going to support some form of DXR / real-time ray tracing, although in what capacity, again, we'll have to wait to see from what's showcased at E3.

Still, for those who think we'll be seeing ray tracing / path tracing become the standard approach for game engines... eh... think again. It'll be popularised, of course, because it's an EASY approach to implement and provides better visuals, but honestly, I've said it before and I'll say it again: Ubisoft's Snowdrop (PRT) approach is going to be the future in regards to engine development.

It's more scalable, versatile, physically accurate and better suited to hardware without custom approaches, which will allow NVIDIA, Intel and AMD to continue their current focus on compute without sacrificing gaming performance... plus it's easier to adapt for VR, which is here to stay, but as I said 5 years ago, it probably won't be "mainstream" until the mid-2020s when there's hardware affordable enough for the average consumer.

i.e. when we can do it with APUs / Laptops and a $100-150 HMD w/Controllers.

Until then it'll remain niche, and while we're getting there, we're still a good generation away from that happening; but I can see THIS console generation (PSVR-2, XBMR, HoloLens 3) almost certainly encouraging it by the end of its lifespan.

The same is true for ray tracing... by the time everyone has the hardware for it, there will be a more commonly accepted alternative (as noted, PRT) that just provides similar fidelity at better performance levels.

Very well said, and great information. With current hardware, ray tracing is mostly a gimmick unless you can truly afford the very expensive level of hardware required to run it at acceptable frame rates.

Some may feel it's a gimmick or not needed. As an RTX 2060 owner, I would use it, but I don't, as it isn't worth the frame rate trade-off. I am not going to play at 35 FPS when I could play at 60 FPS or better just for more eye candy. That being said, that eye candy is a lot more than pretty reflections in puddles. If you have not seen it for yourself, it really is pretty awesome, and no YouTube video does it justice in representing what it does, IMHO. The truth is that I would love the feature if the cost in dollars and performance were not so high.

I agree with you. I think this tech will become a standard, but likely not exactly as it is implemented today. I think it will be a future evolution of what it is now that truly becomes mainstream, and that will only happen when it is something that even low to mid range cards can utilize.

I love reading your comments, always good stuff!

0 Likes

The key weakness with GCN and games historically is that AMD simply can't afford to make more than one GPU design.  When you look at NVidia, and their pro, machine learning, and gaming lines, the GPUs are physically different.  They engineer a hardware solution optimized for the software it will interface with.

AMD cannot afford to do that, so they create a single GPU design and use it for everything. Because the professional space has far higher margins, they design the hardware to give competitive performance in that space first, and then make that work as best they can for gaming with drivers. That is also why there is the power efficiency disparity: Vega, Fiji and the like have GPU dies with a whole bunch of FP64 units that use power but aren't utilized for gaming.

NVidia actually ran into the identical problem with Fermi. All that compute on the card made it vastly more expensive and inefficient compared to AMD's HD 5000 series, which didn't have much in the way of compute at the time. NVidia then started fabbing completely different dies for different purposes, partially because those early Titan cards were cannibalizing their own pro sales.

AMD has never designed a GPU microarchitecture in the GCN era where gaming was the primary focus (with the Radeon VII, gaming was practically an afterthought). And that is possibly what Navi represents. If the hardware is laid out to execute gaming code, I'm sure the GCN instruction set will need updates, as the underlying hardware may look vastly different.

As for Ray-tracing, this has always been possible via standard compute without the need for specialized hardware.  NVidia wrote a paper to that effect back in the Kepler/Fermi days.

https://research.nvidia.com/sites/default/files/pubs/2012-06_Understanding-the-Efficiency/nvr-2012-0...    
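For anyone curious what "ray tracing via standard compute" means in practice, here's a minimal sketch of the core operation, a ray-sphere intersection test, written as ordinary Python rather than shader code. It's purely illustrative (not taken from the linked paper): the point is that the math is plain arithmetic any compute unit can execute, with or without dedicated RT hardware.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.

    origin / direction / center are 3-tuples; direction is assumed normalized.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c            # quadratic discriminant (a == 1 here)
    if disc < 0.0:
        return None                   # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# One primary ray shot down -Z at a unit sphere 5 units away.
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # -> 4.0
```

What the RTX hardware accelerates is doing millions of tests like this per frame against a bounding volume hierarchy; the arithmetic itself has always been expressible as normal compute/shader code.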

amdbooger
Adept II

I think RTX is a bit of a gimmick, like PhysX or HairWorks. I do not see it being mainstream in gaming for years, that is, if it EVER does become mainstream. The performance hit just to have pretty puddles is not worth it. In PC gaming, it is more about FPS/refresh rates than it is about HDR and pretty puddles. Most gamers, imo, would choose FPS over visual eye candy. If they could find a way to utilize ray tracing without killing performance, or get games to actually include it, then it would be a viable technology for gamers to really care about, let alone invest in.

 

We have a 2080 and 2080ti, and you literally have to stop and stare at a puddle to see this feature.  If you are running around, you know, actually playing the game...you do not even notice it. 

 

As Lisa Su said, Nvidia included RTX just for the sake of being able to lay claim to a "new technology" with the RTX line. A selling point, or, dare I say: A GIMMICK. Yes, indeed, just a gimmick to entice gamers to "upgrade" to the RTX series, because other than the 2080 Ti (which is a luxury item at over $1,100 USD), there was no significant improvement in performance. Gamers who purchased high-mid to high end cards had little reason to even think about switching to RTX, as the 1070, 1080, 1080 Ti and Vega line all still offered solid performance/price value vs the RTX line, up until the 2080 Ti, which is financially out of reach for most gamers.

 

So, what was Nvidia to do?  How do we get gamers to "upgrade" to our RTX line when performance alone is not really all that compelling VS. our 10 series or Vega??  

AhA, that's right!! RAY TRACING!!! Behold the beauty, the FUTURE of gaming!!! A must have for ALL gamers!

 

As the slick Nvidia marketing machine often does, it has convinced gamers that this is the "future" and a must-have technology (it is not). All this while it is barely supported in any games, and the majority of future titles do not even plan to support it.

 

DLSS is a feature that I don't hear talked about much, not nearly as much as ray tracing, and unlike ray tracing, it is actually a very interesting and promising Nvidia technology.


0 Likes
qwixt
Forerunner

Just think of how much silicon is dedicated in the RTX series without delivering useful ray tracing. About the only card that comes close is the 2080 Ti. Ray tracing is a checkbox-only feature for now.

Also, the only card that AMD doesn't compete against is, once again, the 2080 TI. 

Radeon VII == 1080 TI == 2080 

For the most part. Sure, in some games AMD really tanks due to game optimization, or because the game architecture is simply better suited to nvidia. Then in a couple of other games AMD comes out on top because the game needs more memory bandwidth.

Now when you look at the 2080 TI, in general you need a higher resolution to take advantage of its capabilities. It has issues separating itself from the 1080 TI at 1080p; I think that's due to CPU bottlenecking. So that means you need 1440p or higher, and according to Steam stats, that's a really small, niche portion of the general market. I am guessing most of us here fall into that niche category, but then we don't really represent the bulk of the market.

0 Likes

An ever-growing niche market as UHD screens fall in price, although "gamers" won't use anything other than 240 Hz screens no matter how poor the viewing angles and color reproduction, and to get 240 FPS in games, especially with decent detail levels, you need something with the power of a 1080 Ti. Aside from that, playing games at high detail at UHD requires a card with the power of the RTX 2080, or a 2070 if you're willing to compromise on details.

0 Likes

And me being a graphic artist, I am content with 75 FPS and ACCURATE COLOR. There are still a rare few of us who are just fine gaming at that level. It sure is cheaper to do so too!

0 Likes

And me being someone who appreciates details and consistency, I'm happy with my LG UHD 8+2 bit IPS monitor.

0 Likes

Yeah, mine's a 10-bit IPS, and I am very happy with it. It games just fine for me and is good for my Adobe apps.

0 Likes

I have the Acer XF270HU monitor and run it at 144 Hz. The color has to drop from 10-bit to 8-bit at 144 Hz, but the color reproduction is still fantastic. I calibrated the display with an i1Display Pro and the DisplayCAL software; the results are shown below. The delta E is how far each color measures from nominal when reproduced on the screen. The performance of this display, even at 144 Hz and 8-bit, is rock solid.

(attached: DisplayCAL calibration report)
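For anyone unfamiliar with the delta E mentioned above: it's just a distance in the CIELAB color space between the color you asked for and the color the panel actually produced. A minimal sketch of the classic CIE76 form (DisplayCAL can report several delta E variants, such as delta E 2000; the values below are invented examples, not my measurements):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB (L*, a*, b*)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Invented example: a target color vs what the panel actually displayed.
target   = (53.2, 80.1, 67.2)   # nominal L*, a*, b*
measured = (52.8, 79.0, 66.5)
print(f"delta E (CIE76): {delta_e_76(target, measured):.2f}")  # ~1.36
```

The usual rule of thumb is that a delta E below roughly 2 is very hard to spot by eye, which is why a well-calibrated 8-bit mode can still look rock solid.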

It is pretty tough to get above 144 Hz on IPS without ghosting and without the luminance of the colors fading. But for me, I would rather have accurate color than a 200 Hz+ refresh rate anyway.

0 Likes

Looks like a nice monitor. Mine is an Acer too, but I guess you save a bit limiting it to 75 Hz; mine is a 1440p 32" but was only $200 on sale.

0 Likes

Mine. Got it on sale last year,

Haven't made the jump to 4K yet myself, but with the Radeon VII installed I could probably make decent use of one at this point.  I do max out the 144Hz of my current 1440p display in some titles, but with a 40-144Hz Freesync range, I feel like I'm sitting right in the sweet spot.

0 Likes

Oddly, it did take some getting used to coming from the more square 1920x1200 (16:10) monitor I had before. I wouldn't have changed, but the backlight was starting to age to the point where it would cause migraines.

black_zion wrote:

Mine. Got it on sale last year,

 


I have the LG 27UL500-W, which is IPS and HDR10 compatible, and it makes games look good. Mine is the 2019 model, which has some nickel-and-dime improvements.

  1. LG 27UL500-W
  2. 27 inch diagonal
  3. IPS 10-bit per color 
  4. 1.06 billion colors
  5. 3840x2160 UHD
  6. Freesync compatible
  7. etc
0 Likes

For an IPS monitor, I was amazed at how accurate the color reproduction still was at 8-bit 144Hz, vs 10-bit 60 Hz.  10 bit 60Hz was still better, but overall IPS can deliver decent refresh rates without sacrificing too much in the way of accuracy.  But yes, as you say, that is usually only on the higher end panels. 

0 Likes

Unfortunately, for a lot of the work I do, it isn't just the color that is the issue. I deal with a lot of adjustments to images for flexographic printing. The 8-bit panels don't display vignettes or highlights anywhere near as well as the 10-bit ones do. It is enough that they show the color break differently, and not really where it is. So yes, some are pretty good these days at just color, but there is more to it than just color.

0 Likes

Hmmm, I guess I could calibrate and test my display in both 8-bit 144 Hz mode and 10-bit 60 Hz mode. With a calibration profile for each, you also get sRGB coverage, contrast ratio, etc., beyond just color accuracy. I wonder if there is much of a difference on any level.

0 Likes

pokester wrote:

Unfortunately, for a lot of the work I do, it isn't just the color that is the issue. I deal with a lot of adjustments to images for flexographic printing. The 8-bit panels don't display vignettes or highlights anywhere near as well as the 10-bit ones do. It is enough that they show the color break differently, and not really where it is. So yes, some are pretty good these days at just color, but there is more to it than just color.

IPS panels have long offered 10 bits per color, which was called Deep Color before the HDR marketing surfaced with Windows...

0 Likes

Yes that is why I use them.

0 Likes

The biggest advantage of IPS is the consistent viewing angle, though. My old Samsung 24" TN display had an insane color shift from top to bottom, so bad you'd lose the ability to see the skill bar in an MMO. Then I got my Dell IPS monitor, and although it was the same size and resolution, and both were 8-bit panels, the IPS's quality was just a night and day difference, about as different as this 8+2 bit monitor is compared to that 8-bit one.

That color shift makes me think of nVidia's "cheat" shadow adjustment that reveals players hiding in shadows by tweaking the black levels. IPS users don't need that trickery; we can see, literally, all 256 shades of grey in the 8-bit space (or 1024 shades of grey in the 10-bit space).
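Those shade counts fall straight out of the bit depth: per channel, an n-bit panel can address 2^n grey levels. A trivial check:

```python
# Grey levels per channel at a given panel bit depth.
for bits in (8, 10):
    print(f"{bits}-bit panel: {2 ** bits} shades of grey per channel")
# 8-bit panel: 256 shades of grey per channel
# 10-bit panel: 1024 shades of grey per channel
```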

Yes you can turn the brightness up a touch higher than the recommended settings in games and see people sneaking up on you!

0 Likes

Not to mention the gamma slider. How many games over the years have been cheesed by the gamma slider...

0 Likes

I have adjusted the gamma in many games because they were too dark to play on a UHD panel.

0 Likes