September 1st marked the presentation of the new RTX cards.
As a long-time AMD user, it was really cool to see what Nvidia has done with RTX and AI.
Yes, they are on that Samsung 8nm process, and yes, the cards consume a lot of power and, as always with Nvidia, they are not cheap.
I don't know, but it seems like AMD has a lot of work to do. I am not happy with the removal of FRTC, and I recently had a black screen "again". They have done some fine work and implemented some features into the driver. Picture quality is really nice when I compare it to Pascal.
I hope AMD/Radeon has something fine in the works with Navi 2x. I will wait to compare the new GPUs, but I want to say that I was really impressed with RTX 3000.
What do you think?
The lineup has some big gaps that should be filled. I'm disappointed not to see DisplayPort 2.0. 8GB for the 3070 and 10GB for the 3080 seem somewhat low for a next generation, which then balloons up to 24GB. I'm concerned about the dense, high-power assembly on the reference models; putting a lot of power in a small space must be tricky, or it can get ugly. Hot air from the reference models will blow right into an air CPU cooler, which could degrade performance of the PC. Maybe not a significant issue, but I look forward to tests analyzing that aspect.
The suggested performance increase looks good but isn't really confirmed; no gameplay was shown with actual FPS. The 3090's three-slot solution is a non-starter; the EVGA AIO and water block solutions are pretty cool, except there's no known release date for those. Nvidia claims a 1.9x efficiency increase, yet Jensen says the 3070 (a 220W card) is performance-equivalent to a 2080 Ti (a 250W card). How the hell does one do the math for that kind of efficiency increase? Marketing BS: an unclear, wordy, misleading-at-best statement. Maybe at idle the 3070's fan stops while the 2080 Ti's keeps on.
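To sanity-check that efficiency claim with only the numbers quoted above (the equal-performance assumption is Jensen's; the arithmetic below is my own):

```python
# Rough perf-per-watt arithmetic using only the quoted figures:
# a 3070 at 220 W is claimed to match a 2080 Ti at 250 W.
perf_3070, watts_3070 = 1.0, 220      # normalize performance to 1.0
perf_2080ti, watts_2080ti = 1.0, 250  # equal performance, per Nvidia's claim

ratio = (perf_3070 / watts_3070) / (perf_2080ti / watts_2080ti)
print(f"perf/watt gain at rated board power: {ratio:.2f}x")  # ~1.14x, not 1.9x
```

At rated board power those two cards only work out to about 1.14x, so the 1.9x figure presumably comes from a different comparison point, such as measuring power at equal performance low on the voltage/frequency curve, which is a far more flattering way to frame it.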
We really need the cards out there for real evaluation to understand the pluses and minuses. I would not mind replacing my 1080 Tis with a 3080, but I am hoping AMD also has some very viable next-generation cards coming. My other machine with a 5700 XT AE is sweet, but it will not be powerful enough when I upgrade to a higher-resolution display.
I like the somewhat more reasonable pricing on the cards, at least on the 3070 and 3080. The 3090 seems bloated not only in size but in cost. It does have 24GB of GDDR6X, but it is not clear whether it has a cache controller to extend that amount, like HBCC on the Vegas, which made rendering options much more versatile. Still, that is a lot of RAM.
I am hoping AMD has a card that competes with and basically beats a 3080, two slots at most, and I would prefer a higher-end version with liquid cooling. The Vega 64 LC design was awesome. More RAM, as in 12GB-16GB, and less power usage. DisplayPort 2.0 in addition to HDMI 2.1 would almost seal the deal by itself, since I keep my cards a rather long time and future monitor options, VR headsets, etc. come into play.
AMD and software: they did some cool stuff in the drivers. Performance Tuning and Profiles are awesome, as is the monitoring. I would like to see AMD Link Server be able to use the phone's camera and mic, plus options in the streaming editor for multiple cameras. Add links in each section of Radeon Settings to professionally made videos explaining the choices, and get the YouTube pros to help out (pay them for their service): JayzTwoCents running down Performance Tuning, Hardware Unboxed on game profiles, Gamers Nexus on the web and Link software, and so on. An Nvidia Broadcast-style program using AI in the drivers would be very useful.
AMD needs something like DLSS; call it MLSS (Machine Learning Super Sampling). DLSS is a very effective reconstruction technique, increasing not only image quality but performance as well.
I think I will see what AMD's options are, and hopefully I will not have to wait that long. Also, if AMD has better supply than Nvidia, they could be the one selling more this year.
The real question, in my opinion, is to what extent AMD is betting on consoles versus PC gaming.
Because Nvidia is pulling out nice stuff on both the hardware and software side.
And I'm not sure AMD has the capacity to go much further than what Nvidia has proposed.
Especially because, for some time now, AMD has been powering both console and PC gaming.
In my opinion, they are slowly drifting more and more toward console development.
Also note that Nvidia has been a pretty clever girl, releasing its GPUs before the console launch!
Fortunately for me, I skipped AMD GPUs for the last two generations, and I'm actually running a 1080 under water.
But before that I had only AMD/ATI GPUs, the first being an ATI Rage 128 Pro, then a 9600 Pro!
And as things stand now, I no longer recommend AMD GPUs in any shape or form.
The Nvidia RTX 3090, 3080, and 3070 just killed the Radeon GPU lineup.
Even without this launch, Radeon has been failing because of bad AIB card quality and bad drivers.
They have lost 9% market share in discrete GPUs since the RX 5700 XT launch.
I think Big Navi is dead in the water before it releases based on what I have been seeing reported about it.
AMD was projecting a 1.5x performance-per-watt improvement.
Nvidia claims 1.9x.
I think the Navi 10 RX 5700 XT needs a price drop to $250 to stand a chance of selling, as it has now dropped way down the GPU performance hierarchy.
I expect a delay in the launch of Big Navi until 2021 now, and I think AMD will have to jump to HBM2e to reduce the power consumption of the memory and memory controller and increase performance. They can always blame "COVID-19".
I hope AMD does something about the Adrenalin 2020 GUI/UI, the bug reporting tool, and the installer.
Big changes are needed at Radeon.
I have been using and supporting AMD GPUs for a long time but things are beyond a joke now.
Maybe I should not have responded as I am not a "Red Teamer".
Thank you all for your comments! I really enjoy this sort of discussion.
I don't have Twitter or any other Social Sickness ; )
I have to say that I really enjoy Ryzen! And I hope that AMD can do the same with Radeon.
The first step to a better future was separating the gaming and professional GPUs.
RDNA2 will be on a more mature 7nm node.
AMD hired more personnel, especially for ray tracing and GPU development.
They will bring a whole new lineup from the bottom to the high end.
I think AMD will also bring something like DLSS.
It's a good thing to have consoles running your hardware, and I hope they will benefit from it.
With that ecosystem, they could really do things that neither Nvidia nor Intel can.
Zen 2 Ryzen 3000 series is excellent.
Zen+ Ryzen 2000 series is very good, provided you become good at tweaking the BIOS, especially RAM timings.
I am primarily looking forward to the Zen 3 launch; there is lots of interest in new AMD PC builds.
I will watch what RDNA2 brings with interest.
The real first step was AMD's comeback in the OEM market.
Which means finally having partners that help you develop the products they need.
Thus selling more professional cards and CPUs, enhancing product development.
RDNA2 will come on the N7+ node; until now, all the CPUs and GPUs were built on the N7 or N7P nodes, without the use of EUV.
N7+ is the first node to really use EUV and could have the same yield issues we saw with the 3000 series launch.
N7+ is not N7P compatible and requires a re-implementation of the IP, bringing 10% higher clock speed at the same power or 15% less power at the same clock speed.
The Ryzen XT CPUs are a good example of the N7 to N7P process evolution, with yield and binning changing over time!
I'm not happy that AMD is now powering both worlds, because I fear AMD doing a Star Citizen:
taking money from one side to fund the development of the other side's project because it is more lucrative.
Side note: I started to write an article about photolithography nodes, but stopped because it is too long and delicate to explain.
I think it's too expensive for the majority of buyers, and thus the market share of these products will be small, ~11%, judging by the Steam hardware survey share of GPUs exceeding the 3070's launch price.
AMD could use their chiplet expertise to focus on volume production of small, cost-effective "shader" dies and on-chip integration of multiple shader chiplets to scale through mid- and high-end GPUs, as with Ryzen. With their console order book, they could significantly undercut Nvidia's production costs. Then they just need to execute on software, drivers, and brand appeal to increase market share on the PC.
RTX IO was interesting in that it introduced a problem I wasn't aware existed. Given that this technology is on board the Xbox Series X and PlayStation 5, I'm sure we will be hearing from AMD about their implementation for PC gaming.
You do have to take what Nvidia publishes with a grain of salt. The performance improvement of the RTX 3000 series rests heavily on the improved FP32 throughput of the CUDA cores. The 8704 CUDA core figure only applies to the RTX 3080 when the SMs run in their 128-FP32 mode. If there are any INT32 instructions to execute, an SM has to run in 64-FP32 + 64-INT32 mode. That leaves 4352 CUDA cores functioning for FP32 alongside INT32, exactly the same count as the RTX 2080 Ti.
So in a lot of workloads, the performance increase will come down to clock speed increases and efficiency gains from the new process. Memory bandwidth is also only about 23% higher than the RTX 2080 Ti's (760 GB/s vs. 616 GB/s).
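The shared-datapath effect can be sketched with a toy model (the linear utilization assumption and the function below are my own illustration; only the core counts come from the published specs):

```python
def effective_fp32_cores(total_cores: int, int32_share: float) -> float:
    """Effective FP32 core count on Ampere, where half the 'CUDA cores'
    sit on datapaths shared between FP32 and INT32.

    total_cores: the marketing count in all-FP32 mode (8704 for the 3080).
    int32_share: fraction of the shared datapaths' time spent on INT32 (0..1).
    Toy linear model, not a scheduler simulation.
    """
    dedicated = total_cores // 2  # FP32-only datapaths
    shared = total_cores // 2     # FP32/INT32 shared datapaths
    return dedicated + shared * (1.0 - int32_share)

print(effective_fp32_cores(8704, 0.0))  # 8704.0 - pure FP32, the headline number
print(effective_fp32_cores(8704, 1.0))  # 4352.0 - INT32-saturated, 2080 Ti territory
```

Real shaders fall somewhere between the two extremes, which is why the headline "2x the cores" rarely translates into 2x the frame rate.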
Having said that, Nvidia is, at worst, releasing a 2080 Ti with higher clocks, more memory bandwidth, and more efficient cores for $699 vs. the $1,199 of the original. And in some FP32-bound workloads, that cheaper GPU can see an even greater performance increase. So if nothing else, it is a win on price alone. Although this effectively moves the flagship card price point back to the $700 mark that the GTX 1080 Ti occupied, so I guess it is less a win and more undoing Turing's fail.
Beyond that, these are still big, power-hungry GPUs on a new process, so volume will likely be extremely limited at launch. AMD could have a chance to take market share if it can deliver more cards to the channel on the more mature TSMC 7nm process, as long as RDNA2 is priced competitively. However, the recent Ethereum boom really favors RDNA GPUs, and if RDNA2 is also an excellent mining GPU, AMD may lose their cards to miners instead of gamers.
You make some extremely compelling arguments. Thank you for such a thorough breakdown; I agree with you. I am now running my Vega 56, as my 1080 Ti degraded and does not perform well. It is now in my streaming machine as an encoding processor for recordings, while I use a 1950X for the stream encoding. I hope AMD video cards get better onboard encoding, which would help me ditch team green completely.