or trading for NVIDIA cards.
I need drivers that work, and AMD doesn't have them.
Might want to check the nVidia forums before you do, it's not all smiles and rainbows over there.
$1 bid to start you off.
Good luck, I can give you 5 dollars.
darn you are cheap! I will offer $5.01! Seriously though what does he have? He didn't say! Gonna be hard to sell or trade it.
He's just a troll, with a computer full of bloatware, crying about the drivers.
Just the usual stuff.
I want to do the same thing.
What are you thinking of selling?
You can try and sell your AMD Card on Ebay or Amazon Marketplace.
Example: I had an AMD HD 6850 GPU card and upgraded to an AMD HD 7850 GPU card. I didn't need two cards, so I went to Amazon.com and opened an account to sell my HD 6850. Within two days I had a buyer, to whom I sold the card.
I'm trying to sell my cards but nobody wants AMD. These guys prefer a 1050 Ti over an RX 570.
Advertise your AMD GPU card at the price you want for it on Amazon Marketplace. I am sure there are as many buyers looking for AMD products as there are for Nvidia.
Here is Amazon FAQ about selling products on their website: Sell Your Stuff at Amazon.com
I guess you are excited by the RTX 2080Ti, 2080, 2070 etc.
I don't blame you.
It looks like it will be the most interesting GPU launch since the GTX 1080 ~ 2 years ago.
Those cards are still beating RX Vega 64 AIB cards, run at lower power, overclock better, and have smaller form factors.
I am sure I will get flak for saying that and people will say "DX12", or "Vulkan" - well, there are not many of those games out there.
Nvidia have caught up in Vulkan performance, based on the last set of benchmarks I saw and ran.
Nvidia are also catching up in OpenCL 2.0.
I will be watching the Nvidia Event on Twitch in 17 hours: Twitch
I really do not want to spend my money on Nvidia GPU's but they have better cards even with their existing 1000 series and the Nvidia cards I run and test have better / more stable drivers on Windows and Linux.
I think AMD were very lucky that the existing discrete RX Vega cards are very good at Ethereum mining. Right now it looks like some retailers can't give the RX Vega 64's away ... PowerColor RX Vega 64 Red Devils are now selling for 400-450 new with over 100 worth of free games. I was tempted to buy one at that price, but then where would I fit that thing? It is 2.75 slots high. It blocks other PCIe ports.
I am better off sticking with a pair of R9 Nanos.
It is also too big, and takes too much power, to fit in PowerColor's own Thunderbolt 3 eGPU box.
Maybe the price is tanking because of the Nvidia RTX card launch; maybe people are hanging back to see if GTX 1080Ti prices start to drop. Really, AMD need to do something to get back to being competitive in high-end gaming.
I think the last time they were, was with the R9 Fury X / Fury / Nano cards which is where I have stayed.
The GTX 1080Ti is already 30% faster than an RX Vega 64 AIB card. I have not seen any GTX2080Ti or 2080 benchmarks leaked yet, however just based on the reported additional CUDA cores and clock speeds alone, I think they might push RX Vega 64 down into mid range performance in comparison.
Nothing like giving a glowing positive report on AMD's competitor's GPU cards in an AMD forum
just teasing you
I am only giving honest feedback based on what I see and I experience.
I really do not want to spend my money with Nvidia. Not because of problems I have had with their GPU's.
It has got to the point now that staying with AMD Cards has hurt the project I am working on.
Even if I had waited for the late RX Vega launch, I think it took until March 2018 before AIB Vega 64 cards had properly working drivers and BIOS. I am really hoping that AMD can launch a Vega 64 on 7nm with better HBM2 soon.
I do need some more modern AMD GPU than R9 FuryX/Fury/Nanos, so I am looking to purchase one Powercolor RX Vega 56 Red Dragon if the price is low enough. Looking at benchmarks they are not much faster than an R9 Fury X. However they do have 8GB of Vram and High Bandwidth Cache is of interest to me. Also some new architecture features are of interest to me for programming reasons. If I could still buy an RX Vega 64 Liquid card new at a reasonable price I might.
I am not purchasing any more second-hand GPUs, even though R9 Nanos are now selling for £/$/Euro 200 - you might want to take a look at those yourself.
I am keeping my AMD cards FYI.
"PowerColor RX Vega 64 Red Devils are now selling for 400-450 new with over 100 worth of free games."
Where is this?
This may be of interest - if it is true: PNY accidentally publishes RTX 2080Ti XLR8 Landing Page - all for $1000 USD | ProClockers. So based on that, let's see if I can predict how much faster that card could be versus a GTX 1080Ti; then we could work out where that will likely stand versus an RX Vega 64 Liquid, which is faster than any RX Vega 64 AIB card but ~30% slower than a GTX 1080Ti.
OK so getting back to this PNY "Leak".
4,352 CUDA Cores,
11GB of GDDR6 on a 352bit bus,
Real Time Ray Tracing.
$1000 US Dollars.
Memory Bandwidth: 616 GB/s
It is a 2 slot high card.
OK, I think this new GTX 2080Ti will be up to 21.4% faster based on CUDA core count (4352 / 3584 = 1.214), assuming perfect scaling.
The GTX 1080Ti TDP is 250 Watts; this 2080Ti is slightly higher at 285W, so depending on the cooling solution, the core clock may well run lower than the GTX 1080Ti's.
The PNY card shows a core clock of 1350 MHz and a boost clock of 1545 MHz.
Many GTX 1080Ti aftermarket cards run higher than that.
The GTX 1080 Ti reference clock is 1480 MHz and boost clock is 1582 MHz
So we downscale due to the lower core and boost clocks.
Downscale on the pessimistic side, based on the core clock: 1350 / 1480 = 0.912
Downscale on the optimistic side, based on the boost clock: 1545 / 1582 = 0.977
That makes the GTX 2080Ti 10.8% faster than a GTX 1080Ti based on the pessimistic core clock value (that low clock may only be needed if the RTX/Tensor cores are being run/powered up). But let's stick with 10.8% faster than the GTX 1080Ti.
I will think about the likely overall performance improvement based on the better GTX 2080Ti memory bandwidth, a factor of 616/484 = 1.27x.
You are so smart.....
Here is my guestimate on how much faster the RTX2080Ti Will be.
Cuda Cores Factor = 1.214963707.
Pessimistic estimate using lowest GTX2080Ti core clock = 0.912162162
Pessimistic estimate for performance increase due to higher memory bandwidth = 1.05072727273
Performance improvement versus a GTX 1080Ti = 1.164462113
Performance versus an RX Vega 64 Liquid = 1.513800747 times faster.
So I reckon that the GTX 2080Ti will be at worst, 51.4% faster than an RX Vega 64 Liquid.
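For what it's worth, the whole guesstimate chain above can be sketched in a few lines. These factors are the assumptions stated in the post (not measurements); recomputing 4352/3584 directly lands within rounding of the quoted 1.164x:

```python
# Back-of-envelope scaling estimate for the 2080Ti, using the spec
# figures quoted above. All factors are assumptions, not benchmarks.
cuda_factor = 4352 / 3584              # core-count scaling, perfect scaling assumed
clock_factor = 1350 / 1480             # pessimistic: 2080Ti base vs 1080Ti reference clock
mem_bw_factor = 1.0507                 # assumed realised share of the 616/484 bandwidth gain

vs_1080ti = cuda_factor * clock_factor * mem_bw_factor
vs_vega64_liquid = vs_1080ti * 1.30    # 1080Ti taken as ~30% faster than Vega 64 Liquid

print(f"vs GTX 1080Ti:        {vs_1080ti:.3f}x")         # ~1.164x
print(f"vs RX Vega 64 Liquid: {vs_vega64_liquid:.3f}x")  # ~1.513x
```

The whole chain stands or falls on the perfect-scaling and bandwidth assumptions, so treat the output as an upper-ish bound, not a prediction.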
OH yah, well what's the air speed velocity of an unladen swallow?
Don't know ...
Are you watching the Nvidia event? --- They are just talking about the 2080Ti now. Still don't know how fast it is yet ...
No. Got a link?
Erm ... not sure I should do that.
Last time I pointed to it above, this thread got locked and I could not see it or access it. I thought I was on the naughty step.
The event is on Youtube and Twitch.
There were about 12.5 million people viewing it on Twitch, which is pretty amazing. It just finished. I did miss the start of the presentation, but it heavily discussed RTX (ray tracing) and the AI blocks on the GPU. They have come up with a new performance metric based on how many ray-tracing operations can be done on the new 2080Ti versus the previous 1080Ti and below generations. No mention of AMD cards from what I saw; however, it is strange for Nvidia to launch the 2080Ti along with the 2080, so maybe they are worried about new AMD cards on the way.
I guess we will have to wait for benchmark measurements to see how fast these cards really are.
African or European?
Anandtech just posted this: https://www.anandtech.com/show/13249/nvidia-announces-geforce-rtx-20-series-rtx-2080-ti-2080-2070 … It has some Single Precision Perf data. The RTX 2080Ti = 13.4 TFLOPs. The GTX 1080Ti = 11.3 TFLOPs. That is an 18.58% performance improvement, which is not far off my pessimistic 16.4% estimate, which assumes the card runs at a 1350 MHz core clock.
GTX 1080Ti in Comparison.
3584 CUDA® Cores
11 GB GDDR5X on a 352-bit bus.
Boost Clock: 1582 MHz
Memory Speed: 11 Gbps
Memory Bandwidth: 484 GB/sec
Don't take this the wrong way, but maybe you should be posting all this data on the RTX 2080 at Nvidia Forum. Since most or all the Users there have Nvidia GPU cards and might be interested in upgrading later on.
By you posting there concerning your research on the RTX 2080 it will give those Users an idea of the Specs of the new GPU card.
By posting here at the AMD Forums, you might convince some upset AMD users to switch to Nvidia GPU cards, which is not a good thing for AMD, based on your positive review.
Just my own opinion.
I realize that you are just giving your opinion of the new Nvidia GPU card, but it sounds more like an advertisement for Nvidia at the same time.
I have not been doing a review, I am just interested in seeing where these new Nvidia cards are at versus AMD Vega 64.
NVIDIA's architecture doesn't scale linearly due to how it works; the same is true for Intel's architecture, but they're very close in basic design.
Now as a key point-of-note:
Frequency x CoreCount x 2 (Ops/Cycle) = Floating Point Operations / Second
1582 x 3584 x 2 = 11,339,776 MFLOPs or 11.34 TFLOPs [GTX 1080Ti]
1542 x 4352 x 2 = 13,421,568 MFLOPs or 13.42 TFLOPs [RTX 2080Ti]
This is a +15.5% Peak Performance Difference.
Of course, if we compare Base Clock, we get 10.61 TF vs. 11.75 TF... which is only a +9.7% baseline performance difference.
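The peak-FLOPs arithmetic above is easy to check; here is a minimal sketch using the clock and core figures quoted:

```python
# Peak single-precision throughput: clock (MHz) x cores x 2 ops/cycle (FMA).
def peak_mflops(clock_mhz: int, cores: int, ops_per_cycle: int = 2) -> int:
    return clock_mhz * cores * ops_per_cycle

gtx_1080ti = peak_mflops(1582, 3584)  # boost clock -> 11,339,776 MFLOPs (11.34 TF)
rtx_2080ti = peak_mflops(1542, 4352)  # boost clock -> 13,421,568 MFLOPs (13.42 TF)
```

Note this is theoretical peak, not delivered performance; real games never hit it.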
Keep in mind that GTX 980Ti to GTX 1080Ti was a 50-55% Performance Difference.
We can only assume how well this will actually Clock... remember that it is possible to coax 2050MHz Boost Frequency out of a GTX 1080Ti AIB.
Yet remember we're talking about a balanced heat dissipation profile; so while 12nm (14nm++) does provide the potential for frequency improvement (as showcased by Ryzen 2nd Gen), to achieve this AMD didn't change the die layout or dissipation profile.
In fact, I'd wager that the higher-clocked (OC) "Founders Editions", which also cost more, are likely to be very close to the maximum overclock.
Keep in mind that the AIB "stock" clocks are 1350MHz base / 1545MHz boost, while the FE boost is 1635MHz.
Of course there is an argument to be made that this doesn't mean anything, given the GTX 1080Ti is capable of up to 2000MHz over its (effectively 1600MHz) stock boost ... but in this case I'm not so sure, as the Titan V tended to tap out at 1780MHz, and bear in mind that those were heavily binned chips; there was very little variation between review frequencies or those in the Futuremark database.
Remember that the Titan V, has a very similar Boost Clock.
As such I'd be surprised if you can get more than 100-150MHz Stable OC on the RTX Series.
The result is that we have the £1,100 RTX 2080Ti [14.73TF] vs. the £670 GTX 1080Ti [15.62TF] ... and here's the thing: if we actually compare the potential performance of the £750 RTX 2080 [10.60TF] and £470 GTX 1080 [8.87TF], or the £570 RTX 2070 [7.88TF] vs. the £380 (£420) GTX 1070(Ti) [6.46TF (8.19TF)] ...
Well it just ends up looking worse and worse.
Especially given *all* of the above 10-Series are capable of hitting 2000MHz, very noticeably improving their performance.
There is absolutely zero guarantee (as with Volta) that Turing will actually clock far beyond its stock boost.
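Taking the prices and (overclock-adjusted) TF figures above at face value, the price-per-peak-TFLOP comparison works out roughly as follows. The prices and throughput numbers are the ones quoted in this post, not independently verified:

```python
# GBP per peak TFLOP, using the prices and TF figures quoted above.
cards = {
    "RTX 2080Ti": (1100, 14.73),  # (price GBP, peak TFLOPs with assumed OC)
    "GTX 1080Ti": (670, 15.62),
    "RTX 2080":   (750, 10.60),
    "GTX 1080":   (470, 8.87),
}
for name, (price, tflops) in cards.items():
    print(f"{name}: £{price / tflops:.0f} per TFLOP")
```

On these figures, the outgoing 10-Series cards come out noticeably cheaper per theoretical TFLOP than their 20-Series replacements, which is the "worse and worse" point above in one number.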
How does that stack up against RX Polaris or Vega (GCN architecture)? Well, the RX VEGA 56 is effectively on par with the GTX 1070Ti in *most* titles, and the RX VEGA 64 is effectively on par with a 1080, with overclocking, again in *most* titles. There will be select titles, such as those using Epic's Unreal Engine 4 (which is heavily co-developed/produced with NVIDIA today), where, sure... AMD graphics just gets thrashed.
Still, in most other scenarios it holds its own.
And with these cards *finally* back at MSRP in *most* regions, price-to-performance they're really a coin-flip decision against the NVIDIA equivalents. This means performance-wise they're also going to be relatively competitive with the RTX 20-Series. What's more important to keep in mind: while the GTX 10-Series struggles with (but is at least capable of using) Radeon Rays (unlike RTX, which requires specialist hardware), AMD hardware doesn't struggle at all... *ANY* GCN 3rd Gen or better can run it, which is what most people have.
Navi almost certainly (going by the patents filed) is also going to have notable acceleration not just for Radeon Ray workloads, but Machine Intelligence as well as Brute Force Traditional Pipelines... i.e. we could actually see Navi return Terascale 3 DirectX 11 performance to the GCN Architecture., without compromising the Asynchronous Compute.
NVIDIA might be betting heavily on real-time raytracing; AMD isn't... and frankly this makes more sense. When you bet on "new" methods, you then have to convince developers to actually adopt them, which history shows they don't. Not unless you pay them... A LOT.
I might see about putting my nose to the grindstone with Radeon Rays over the next month or so, see if I can't coax my RX 480 to achieve some comparable results to what NVIDIA RTX is showcasing because frankly I actually believe it's more than possible to achieve some similar results... and would be fun to achieve what NVIDIA will no doubt try to claim is "Impossible" without their Revolutionary new Hardware.
That's not whom I was replying to, and frankly the base topic of "I want to trade my AMD card for an NVIDIA card" on the AMD Support Forums is a pointless one, as there will simply not be any takers... even if they're performance cards. Unless he's willing to trade down, he's simply not going to have any takers.
Whereas a comparison against the "next gen" NVIDIA cards is actually somewhat pertinent, and likely more interesting to those on these forums.
"and would be fun to achieve what NVIDIA will no doubt try to claim is "Impossible" without their Revolutionary new Hardware."
I think it is already pretty widely known that it is possible, using more standard compute approaches. Interestingly, NVidia themselves had talked about Ray-tracing on Kepler using compute. Here is a slide from their SIGGRAPH 2012 slide deck.
So what happened? I think I have touched on it on this forum before, but most of what NVidia does with gaming hardware is through the lens of maintaining different product stacks between professional (scientific and modeling) hardware and gaming. They learned that lesson the hard way during the Fermi/Kepler days, when the original Titan Black was purchased as an entry-level professional card, effectively cannibalizing the much higher-margin Quadro segment. After that, NVidia effectively scaled back the compute functionality in Maxwell/Maxwell 2/Pascal etc. to maintain the distinction between professional and gaming cards, while maintaining their bottom line in both market segments. Since the compute functionality was dropped, so was ray tracing, and we never heard about it for six years.
As DX12 and Vulkan appeared, NVidia developed Gameworks. DX12 and Vulkan can both utilize the higher compute functionality to generate spectacular graphics effects. Gameworks is designed to achieve those same effects using more standard gaming hardware. So rather than change their hardware and again make their gaming cards more attractive to professional buyers, they try to steer developers into other approaches to enhance graphics, which allows NVidia to maintain a differentiated product stack.
With the launch of DXR, Microsoft is bringing ray tracing to the DX12 API. This can be done, by NVidia's own admission, via compute. The RTX cards, then, are just more of the NVidia modus operandi: NVidia developing a specific thing to do something that can be done with more traditional compute hardware. There is already an AIDA64 ray-trace benchmark that utilizes an FP64 engine for ray tracing, so why not go that route? Because an RTX core isn't going to work with professional software, so again the product stack is maintained. And if NVidia didn't put out an alternate solution for ray tracing, then developers might start to do it via compute (as NVidia suggested in 2012).
As you noticed, my blurb above is framed in such a way as to portray NVidia's interest as protecting their professional card profits. That may be the case, but you could also argue, effectively I might add, that creating separate product stacks makes the cards unattractive to professional users and keeps the gaming hardware in the hands of the intended audience
Regardless of specifically why NVidia chose the path they did, the outcome is that games tend not to run as well on AMD hardware. Since AMD is a smaller company, they can't afford to make a different die for each market space. Vega, ultimately, is a single die designed to serve all the different market segments, which it does surprisingly well. But with all the extra compute cores on those dies utilized for professional software, they generate far more heat in traditional gaming workloads. AMD-based cards would actually fare better if game developers would utilize that compute hardware just hanging out on the die. But that is exactly what NVidia doesn't want; if developers start to use it, NVidia would need to add it back in to stay competitive, which in turn would once again blur the lines between professional and gaming market segments.
And so the fight has been for the developers. AMD has opted to put their hardware in consoles and NVidia uses Gameworks. AMD hoped that developers would develop a game for consoles first (and they do) and then port directly to PC. But usually, developers care so little about the PC port that they farm that out to a separate studio who operate on a compressed time table for release (see Mortal Kombat X, Arkham Knight PC releases). Those secondary studios then have to utilize tools like Gameworks to make the PC title launch on schedule, leaving AMD discrete graphics users with a massively deoptimized title. All that does is damage AMDs mindshare in the PC gaming community, as console users rarely actually know what is inside their machine.
But regardless of intent, the end result is undeniable. NVidia will actually leverage their market position to hold back the advancement of gaming. We could have had ray-tracing via compute six years ago, but NVidia's desire to maintain product differentiation hamstrung that effort. Only now that ray-tracing could advance down the compute avenue without NVidia do they throw their weight behind a different solution. And of course, the RTX solution only works on their new lineup of cards, whereas a compute based solution would work on their older hardware and AMD hardware as well.
Nvidia has so much clout that everyone listens. Well, they worked hard for it (Riva TNT vs. S3). But AMD also did their homework and worked hard: https://gpuopen.com/announcing-real-time-ray-tracing/, https://www.dsogaming.com/videotrailer-news/amd-shows-how-you-can-add-real-time-ray-tracing-effects-with-radeon-prorende… . The Vulkan and OpenGPU push was not appreciated, even though it was supposed to simplify the process.
RE: This is a +15.5% Peak Performance Difference.
I do not understand where that % figure comes from.
Here is how I calculated the percentage difference:
((13.4 - 11.3) / 11.3) * 100 = (2.1 / 11.3) * 100 = 0.185840708 * 100
= 18.584% performance increase of the GTX 2080Ti versus the GTX 1080Ti, according to the figures reported by Anandtech.
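One guess at where the two figures diverge: they may simply use different denominators. A quick check with the boost-clock TFLOP figures from the earlier post:

```python
# The 15.5% vs 18.6% disagreement looks like a denominator choice.
old, new = 11.34, 13.42  # TFLOPs: GTX 1080Ti vs RTX 2080Ti (boost-clock figures)

increase_vs_old = (new - old) / old * 100   # conventional "% faster": ~18.3%
gap_share_of_new = (new - old) / new * 100  # gap as a share of the new card: ~15.5%
```

Dividing the gap by the old card's figure gives the usual "X% faster" number; dividing by the new card's figure gives ~15.5%, which matches the earlier post's value.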
I know how to calculate FLOPS.
I based my "guestimate" of gaming performance improvement on the increased memory bandwidth, and factored in how much a memory overclock helps benchmark performance on previous Nvidia cards I have looked at. I did not talk about overclocking anything.
My guestimate may be way off. I think GTX2080Ti will be capable of 16.4 % Graphics Benchmark performance improvement versus GTX1080Ti. How that translates into actual game performance may be lower. It often is.
We will only find out when the benchmark numbers are out.
RE: Well, given that the RX VEGA 56 is effectively on-par with the GTX 1070Ti in *most* titles and the RX VEGA 64 is effectively on-par with a 1080 with Overclocking again in *most* titles
I consider *most* titles to be DX11. I trust reviewers like Gamers Nexus, Hardware Unboxed, etc. They look at DX12 and Vulkan titles, which should be in AMD's favor. They seem to think that the GTX 1080 still beats Vega 64 and the GTX 1070Ti beats Vega 56.
I do not know where you get your pricing from, but Vega 56 is still priced against the GTX 1080 here, and Vega 64 has generally been up against the GTX 1080Ti. Prices have tanked this week though, for a few AMD cards, plus AMD has a free game promotion. Nvidia are also doing game promotions. The Vega 56 AIB cards I am interested in, 2-slot-high cards, are very difficult to get. They are on pre-order even now.
I suggest you look at this review, and then tell me things are great.
Vega 56 - One Year Later vs GTX 1070 & 1070 Ti - YouTube
I am in process of purchasing a Vega 56. I saw the above. I stopped. I have asked the reviewer some questions.
As for Navi, when is it turning up? 2020? I thought it was a mid-range RX 580 replacement?
RE: I might see about putting my nose to the grindstone with Radeon Rays over the next month or so, see if I can't coax my RX 480 to achieve some comparable results to what NVIDIA RTX is showcasing because frankly I actually believe it's more than possible to achieve some similar results... and would be fun to achieve what NVIDIA will no doubt try to claim is "Impossible" without their Revolutionary new Hardware.
Go for it.
Please create a separate post, and report the results.
It will be very interesting to see how you will be able to run Shadow of the Tomb Raider with Radeon Rays.
Here is a very recent look at Vega 64 AIB cards versus GTX1080. Can Custom Vega 64 Beat The GTX 1080? 2018 Update [27 Game Benchmark] - YouTube
I just saw this article.
Nvidia GeForce RTX 2080 Ti hands on review | TechRadar
It is looking rather positive w.r.t performance if you ask me.
I can post links too!
Don't Buy the Ray-Traced Hype Around the Nvidia RTX 2080 - ExtremeTech
Thank you for the link. I have not read it yet, I will, but let me guess.
Not many games use RTX.
It is still early days.
It's just another PhysX.
The cards are way too expensive.
You should not pre-order until you see benchmarks.
These cards are probably a placeholder until Nvidia can move to 7nm. Nvidia are using the RTX and AI cores as an excuse to add value for gamers that isn't there.
Claims that Nvidia are hiding the fact that these new 2080Ti cards do not give much of a performance uplift (15%, maybe 10% in games) versus the GTX 1080Ti.
Yet the fact that AI and RTX cores have been added to a consumer GPU seems very important to me indeed. Not just for gaming, but also for other compute applications such as Blender rendering, and AI for pattern recognition, etc.
People should very definitely wait for benchmarks.
Probably better for most to stay with the GTX 1080 or 1080Ti as prices continue to drop, rather than pre-order these 2080Ti's yet. I have seen articles claiming that Nvidia got hit hard by the cryptocurrency crash and have lots of GPUs returned to inventory. This 2080Ti etc. 'launch' - actually a pre-order, until 20 September, at these high prices - could be a way to get many people to throw in the towel and buy the older Nvidia cards at lower prices, therefore reducing their inventory.
I do not like to see the current situation. I own many AMD cards, far more than the number of Nvidia GPU's I own. I bought into the RX Vega message, and it overpromised and underdelivered, the cards were late, and I still stayed with AMD cards. I have an interest in AMD cards being a success. But last time I looked they dropped as low as 8% of Steam Users just after Christmas in January 2018.
I want to see AMD do better than this. Look at what they did with Ryzen and Threadripper.
OK, thank you, very interesting article.
Based on what they seem to say, my estimate of performance improvement does not seem unreasonable then, and that would make these GTX2080Ti cards over 50% faster than an RX Vega 64 Liquid card. Vega 64 on 7nm is reportedly 1.35x (35%) faster than Vega 64 today. That means AMD would still be behind if they released Vega 64 on 7nm for gamers today.
I do disagree with some things said in the Article. This in particular.
"True, DX12 has been kinder to Team Red than Team Green, but if you bought a 2013 Radeon thinking Mantle was going to take over the gaming industry, you didn’t get a lot of shipping titles before it was retired in favor of other APIs. If you bought a Radeon in 2013 thinking you were getting in on the dawn of a new age of gaming, well, you were wrong."
Regarding the DX12 versus DX11 story. Why would Nvidia want to move their cards and games from DX11 to DX12 when their cards run so much better on DX11 than AMD cards do? Isn't it clear that Nvidia would do everything possible to prevent encouraging a move to DX12? Is there no way AMD can go back and do something to improve their newer card DX11 performance?
I still seem to be able to run Battlefield 4 in Mantle. I was running it just the other day. Battlefield Hardline does crash - I recently posted about that. The author seems to forget that AMD Mantle was a starting point for Vulkan. My radeon card I bought in 2013 runs Doom really well on Vulkan.
Another point they don't seem to make: announced just recently is the new Steam Play Beta for Linux, which would be nowhere without Vulkan & AMD. Here is the information: Steam for Linux :: Introducing a new version of Steam Play
To cut a long story short, you can now run Doom 2016 on Linux and other Windows based games will be easily ported to Steam on Linux because of Vulkan. This is a major change for Linux Users. This will also be a major change for people who have had enough of Windows.
I just tested it today. Steam Plays Beta for Linux today.
I will add another post about it next, with links to videos showing the performance.
FYI, Doom at least, worked first time and it runs really well.
"Regarding the DX12 versus DX11 story. Why would Nvidia want to move their cards and games from DX11 to DX12 when their cards run so much better on DX11 than AMD cards do? Isn't it clear that Nvidia would do everything possible to prevent encouraging a move to DX12?"
Exactly. DX12, being a low-level API, has the ability to run far more efficiently than DX11 ever could. But due to the low levels of compute hardware on NVidia gaming GPUs, they instead build Gameworks solutions to try to achieve those effects. You can see my lengthy blurb about it above. The crux of the issue is that NVidia would rather maintain the differentiation of their product stack than really push gaming forward. That is the primary reason I don't support them.
"The author seems to forget that AMD Mantle was a starting point for Vulkan. My radeon card I bought in 2013 runs Doom really well on Vulkan.
Another point they don't seem to make, announced just recently is the new Steam Play Beta for Linux , which would be nowhere without Vulkan & AMD."
Uh-huh. How many shipping titles are there with Vulkan support? The author is making the point that when you buy a graphics card with a new feature (Mantle, Vulkan, ray tracing, whatever it is), by the time that feature is widely implemented, the original graphics generation that supported it is obsolete. I can't really argue with that.
"Based on what they seem to say, my estimate of performance improvement does not seem unreasonable then, and that would make these GTX2080Ti cards over 50% faster than an RX Vega 64 Liquid card"
Tempted to put a check minus in the reading for comprehension section, but no! I will distill the essence of the extremetech article here.
The author indicates that NVidia cards gains in previous generations were tied very closely to the fill rate gains made. They even supply this handy dandy chart.
As you can see, the author points out that despite other gains, the performance in existing games seems to be tied very closely to the fill rate. The GTX 980 gained 1.50X over the GTX 780 in fill rate and saw an improvement of 46% across 14 titles, despite actually losing memory bandwidth. The GTX 1080 then saw a 1.60X fill rate increase over the GTX 980 and a 65% increase across the same 14 titles. The RTX 2080 actually has a slightly slower fill rate than the GTX 1080, which means the gains will likely be limited.
Similarly, the GTX 1080 Ti has a fill rate of 130.2 Gpix/s, while the RTX 2080 Ti has a fill rate of 135.96 Gpix/s, an increase of 1.044X. So if the trend holds, it would be roughly 4-5% faster in those same 14 games.
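A minimal sketch of that fill-rate projection (fill rates as quoted; the assumption that game performance tracks fill rate is the article's observed trend, not a guarantee):

```python
# Project existing-game gains from pixel fill rate alone.
fill_1080ti = 130.2   # Gpix/s, GTX 1080 Ti
fill_2080ti = 135.96  # Gpix/s, RTX 2080 Ti
gain = fill_2080ti / fill_1080ti  # ~1.044x
print(f"~{(gain - 1) * 100:.1f}% projected gain in current titles")
```

That projection deliberately ignores IPC changes, memory bandwidth, and the new RTX/Tensor hardware, which is exactly the author's caveat quoted below it.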
The author does qualify that with the following.
"And those facts alone suggest that unless Nvidia managed to deliver the mother of all IPC improvements via rearchitecting its GPU core, the RTX 2080 family is unlikely to deliver a huge improvement in current games."
So you may see some additional gains, but I think even expecting a 20% increase is generous.
RE: Tempted to put a check minus in the reading for comprehension section, but no!
This forum is a laugh, it really is.
I made some comments about the RTX 2080Ti series, based on some GTX 1080Ti data I have and a few simple assumptions - pretty much that Nvidia know what they are doing and won't mess up the GPU core.
I get trolled and report it.
The thread disappears.
The guy in 11th place on the forum (I have no problem with him at all; he is a really nice person) gets a 1000-point increase in one day from the person I reported, so I vanish from your Drivers page.
Keep sticking your heads in the sand. Herd mentality in action. A herd of ostriches
If you're talking about kingfish giving me a badge, that was his way of rewarding a user for helping another user solve their problem. He has done that with several other users in the past. I really don't expect anything from this forum. I am not in competition with anyone. I just enjoy troubleshooting.
I am sure he didn't do it just to get back at you. He is more professional than that.
EDIT: By the way, any user can give a badge to another user, from what I gathered. Just click on the user's profile and click "Give a Badge". If you want, I'll give you a badge if that will make you feel better. I have no idea how many points it is worth.
If my comment offended you, then I apologize. I was merely pointing out that the author indicated the 2080 Ti only has a modest (5%) fill rate increase over the previous model (1080 Ti), and historically speaking, the fill rate seems to correlate almost directly with the gains seen in games.
You had said "Based on what they seem to say, my estimate of performance improvement does not seem unreasonable then, and that would make these GTX2080Ti cards over 50% faster than an RX Vega 64 Liquid card."
That really doesn't seem to be the case, the author is indicating based on fill rates (which historically have been a good predictor of gaming performance) we really shouldn't expect much in the way of gains in current games. So I'm not exactly sure how you read the author's data as supporting your conclusion.
I'm not sure what other situation you are referring to, so I will leave that alone. The truth is we won't know how fast the GTX 2000 series is until benchmarks launch in September. But it may be a worst-case scenario in which the cards are only 10-15% faster in existing titles but significantly more expensive.
Been busy testing Steam Play Beta.
1. I do not want to fight with anyone. I am interested in what AMD response will be to these Nvidia cards.
2. Can you please tell me, how did the author calculate that fill rate of "109?" for the RTX2080? What assumptions did he make?
3. Here is some idea of how many Game titles now run using Vulkan, on Linux admittedly: Steam for Linux :: Introducing a new version of Steam Play. If you run Linux, can you please test it out?
Like I said earlier, we will see what the benchmark numbers come out like.
I think the pixel fill rates are reported by the card manufacturers in their own specification sheets; I don't think they are being calculated.
The Steam Linux stuff is pretty interesting. Nice to see the gaming community outside of Windows continues to grow.
Windows gaming is still more than double all consoles etc. combined.
Generally, in big budget games that are simultaneously released across platforms PC makes up a small percentage of overall sales. PC was a whopping 9% of the Wolfenstein II: A New Colossus sales. http://www.vgchartz.com/article/270793/wolfenstein-ii-the-new-colossus-sells-an-estimated-319000-units-first-week-at-ret…
These are the types of games that generally require discrete graphics hardware, or a dedicated console to play. I'm sure more people play games on mobile devices than Windows, but no developer is going to be adding ray tracing to your phone game.
<a href="https://www.wepc.com/news/video-game-statistics/"><img src="https://www.wepc.com/wp-content/uploads/2018/06/worldwide-distribution-of-games-market-revenue-from-2015-to-2019-by-segment-and-screen.jpg"/></a>
Global Video Game Market, Sales and Value
The above link is one of the few Statistics that includes Official Steam and Blizzard Data, as opposed to estimated Data.
I don't expect you to drop the $6,000 for the full market report and breakdown, but the Overview PDF provides detailed enough information for the purpose.
Now on a personal note, my Studio uses the Data from Olsberg-SPI and UKIE (UK Interactive Entertainment) Research and Statistics., which provide an excellent picture of the Market as a whole, given again they source Digital as well as Physical Distribution Figures.
VGChartz for example just sources NPD Retail Association Figures (so Physical Only., which as you can see from the UKIE Datasheets; accounts for only ~30% of the Total Sales Market, and is a mirrored trend between Movie/TV and Video Games; while Music remains ~50%)
Now as you'll also notice is that Physical Retail has remained fairly stagnant, having peaked in 2010., while Digital Sales continues a roughly 10% growth year-on-year.
Why is this important? Because we have a roughly 25% to 21% disparity between the Overall Market Value.
What this does is showcase that the vast majority of PC Sales are via Digital Distribution (i.e. Steam / Origin / Marketplace / Battle.net) however this doesn't result in a substantially different skew. We also must consider the Console : PC Ratio.
If we use the UK as a template (as it will be typical of Western, i.e. EU / NA, Gaming), we can derive the estimated Market Sizes in "Real-Terms" Figures.
That is to say that in 2017 there are 32.4M Gamers., of those 12.7M Console (Xbox One / PlayStation 4) and 11.4M (Windows PC).
As we know the UK Retail Figures for PlayStation 4 and Xbox One, we can actually further break this down.
10.26M (Europe) / 3.96M (UK) Xbox One
32.79M (Europe) / 4.93M (UK) PlayStation 4
I included the EU Total with the UK Total (2017) as you'll notice there is a big difference in the Ratio Distribution., and this is because Germany, Spain and France (the key markets in Europe) which have only marginally larger markets., have a massive disparity between the Platforms being 4:1 in terms of Sales Ratio.
As an Xbox One / Windows 10 Development Studio, this essentially has remained good news., as we really don't have to put much cost or effort into European Translation, Certification and Distribution... as the market just doesn't exist to justify supporting such, nor do we have to pander to said regions in terms of Game, Story or Sensibilities Design. Instead we can focus exclusively on the Anglo / Western Market much more exclusively.
Something you'll notice though is this only accounts for 8.89M Consoles; even if we include the Nintendo Switch, this only rises to 9.56M … as such we can assume that even in 2018 there is still a number of Xbox 360 / PlayStation 3 / Nintendo Wii U Gamers (15.96M)., and as a note the Handheld figure does perfectly align with the NDS and PSV sales (Stated 4.5M Vs. NPD 4.54M).
Now if we look over the Yearly (2017) Software Sales (UK) we get the following breakdown:
PlayStation 4 • 10.72M
Xbox One • 6.87M
Windows PC • 0.48M (1.60M)
Within the Brackets: as the NPD Figures are Retail only, and we know that Digital accounts for roughly 69-70% of all Sales (which are factored in for Console Sales, as these are closed Platforms that supply NPD Figures, unlike Steam), this means we should be looking at 1.60M (est.) Total PC Sales.
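The estimate above can be reproduced with a quick back-of-the-envelope calculation. All figures come from the post itself (0.48M retail units, retail being ~30% of total sales); the snippet is just a sketch of the arithmetic, not an official methodology:

```python
# Estimate total UK PC software sales from the retail-only NPD figure,
# assuming digital accounts for ~70% of all sales (so retail ~30%).
retail_sales_m = 0.48      # NPD retail figure, in millions of units
retail_share = 0.30        # retail's assumed share of total sales

total_sales_m = retail_sales_m / retail_share
print(f"Estimated total PC sales: {total_sales_m:.2f}M")  # ~1.60M
```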
Of course we are seeing the PC Market Value (even in the UK) relatively even with the Consoles Combined, just as we're seeing a similar 45:55 Split in terms of Market Share... and sure, arguably we can say PC has an "Even Share compared to Combined Consoles" … but this is ONLY true if we're looking at the POTENTIAL Market, not the actual Unit Sales Figures.
So, how is it that we can be seeing a similar income when clearly PC Gamers aren't purchasing much Software?
Well that's simple., keep in mind the MASSIVE differential in terms of Games on PC that are either Subscription, Freemium or Heavily Micro-Transaction Focused... and what's perhaps more notable is such Freemium (Free-to-Play) Games do not have Retail, thus no "Sales" Values attached., while also supporting a larger percentage of the Market as a whole.
Remember that according to the Steam Hardware Survey, only 36% of the Total Market ACTUALLY has Minimum Requirement Hardware.
That being 4 Thread 3.2GHz CPU / 4GB Memory / R7 260 or GTX 650 Graphics.
i.e. let's assume this is universal, then this means the "Total UK Market" drops from 11.4M to 4.10M capable of actually playing the latest PC Releases.
And if we go further than that., if we look at the percentage of that market that owns a "Beyond Mainstream" Graphics Card., i.e. £400+ MSRP Graphics Card; then this drops to 8.6% or 0.98M.
Keep in mind here, that our Potential Market is essentially half the Total Software Sales for the Year., and we can't expect 100% Attachment Rate...
Avg. is 5-8%, while Good. is 8-12%... and even most AAA Titles will typically sit between Avg. to Good.
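The funnel in the preceding posts can be sketched as a short calculation. Every number (11.4M UK PC gamers, 36% meeting minimum spec, 8.6% owning enthusiast-class cards, 5-12% attachment rates) is taken from the posts above; the code only chains the arithmetic together:

```python
# Rough UK PC addressable-market funnel, using the figures quoted above.
uk_pc_gamers_m = 11.4                      # total UK PC gamers, millions

min_spec_m = uk_pc_gamers_m * 0.36         # ~4.10M can run current releases
enthusiast_m = uk_pc_gamers_m * 0.086      # ~0.98M own a £400+ MSRP card

print(f"Min-spec market:   {min_spec_m:.2f}M")
print(f"Enthusiast market: {enthusiast_m:.2f}M")

# Typical attachment rates: 5-8% is average, 8-12% is good
for label, rate in [("avg (5%)", 0.05), ("good (12%)", 0.12)]:
    print(f"{label}: {min_spec_m * rate:.2f}M potential unit sales")
```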
And this is important to keep in mind., because while sure Globally Speaking we would be talking about a much larger market than just the UK we're not talking about the figures being any more "Encouraging" or "Appealing" for Developers.
The only way that my Studio would implement NVIDIA Specific Features, such-as RTX, DLAA, DLAI, etc. would be if NVIDIA essentially sent us the Hardware for free under the provision that we supported Gameworks / RTX Features. Even then., it would only be for said Single Release if we weren't seeing the attachment rate figures to warrant said time investment going forward.
The PC game world is much larger as there are multiple game sources and many classic games are not on steam etc.
PC gaming has been the primary platform and it will continue to be. Consoles have seen their sales fall over time as PC sales have managed to gain some ground, mostly because backwards compatibility is so much better.
Try playing an Xbox 360 disc on a PS4: no joy. PC games from DX5 onwards work almost universally. DRM is a problem for a few titles, but patches for those problems are available.
Do you have any view on the % of PC Gamers running Nvidia versus AMD GPU these days? The reported %of AMD GPU on Steam was as low as 8% in Jan 2018. It is now back up to ~ 15%
The other 10% ~= Intel iGPU. If the Steam Hardware Survey is correct then Nvidia dominate PC gaming with discrete GPUs at ~75% market share. Is it not likely developers will at least try to implement whatever features Nvidia try to promote in this case?
"The PC game world is much larger as there are multiple game sources and many classic games are not on steam etc."
I mention that the statistics above include Steam because Valve typically do not provide such data to 3rd Parties., in-fact it's actually quite difficult to even get a full suite of statistics as developer / publisher of a title.
Good Old Games (GOG), Green Man Gaming, Origin, UPlay, Google Play, iTunes and Microsoft Store are already included, and their statistics are actually publicly available upon request or via their Quarterly Reports.
It's only Valve who normally lack transparency in their Figures., Blizzard-Activision are also quite cagey about said Statistics.
Quite odd that the two "Developers" are the ones who while being the most dominant Digital Retail Groups, are also the least Transparent in terms of Detailed Information as opposed to High-Level Overview Fiscal Reports.
We see a similar situation with Sony (PlayStation)., whom likes to obscure their data listing "Shipped" instead of "Sold" Units.
And yes, there can be quite a big difference between the two sometimes., especially when we're talking about Global Figures.
"PC gaming has been the primary platform and it will continue to be and consoles have seen their sales fall over time as PC sales have managed to gain some grown mostly as backwards compatibility is so much better"
This is simply false, and had you bothered to check the links provided you would have seen that, despite all markets growing by (Avg.) 8.5% year-on-year in terms of Market Totals, overall PC Software Sales have remained stagnant while Console Software Sales have continued to grow proportional to the Overall Market.
As such, despite there being more PC Gamers than ever before, the actual Market Share has continued to drop, and the viability (from a Developer standpoint) has been no different in terms of Potential Market since 2009.
Yet actually this gets even worse (as a PC Developer)., because the number of Competing Software Studios has risen at an exponential rate., along with the production of Software... meaning there are 2x Studios Vs. 2009 and 3x Released Software Vs. 2009.
That means that the actual potential for a Successful Release has dropped from 8:1 down to 20:1., while for Console this remains at 5:1.
(i.e. the number of Unsuccessful Titles incapable of breaking even Vs. Successful).
This as a keynote is why Mainstream Developers no longer develop PC Exclusive or even as a Primary Platform., because the return simply doesn't exist.
As a keynote it can be easily argued that this situation is ENTIRELY the fault of Valve (Steam)., due to no Platform Curation or Quality Control.
"try playing an Xbox 360 disk on a PS4, no joy. PC games from DX5 onwards work almost universally.DRM is a problem for a few titles but patches for those problems are available"
Try running Linux or Apple Software on Windows, no joy. See, I can make vacuous and stupid statements too.
As for DirectX 5 (on-ward) working "Almost Universally"., oh come the f**k on.
DirectX 5 Games were barely compatible BETWEEN Windows 95 and 98... let alone on Modern Windows., and there is a reason for that too.
Until DirectX 8.1., Core Feature Support was "Optional" … and until DirectX 9.0c., Advanced Feature Support was "Hardware Dependant" (similar to OpenGL ARB) although at least by DirectX 9.0, Microsoft had the sense to make the Fallback for said Functionality a "Core Feature" … then by DirectX 10 it was simple... you either fully supported the DirectX Specification or you weren't allowed to support the Specification at all.
In fact most Modern Hardware actually no longer even has half of the Functions or Features that the majority of Classic Games *REQUIRE* to run.
Sure, there *might* be solutions available to emulate such behaviour but you're not going to have a flawless experience.
I'm sorry, but you're being naïve and delusional about the PC Market. As I said above, from my perspective as a Developer, one that can't simply throw money away to appease a small subset of the potential audience, it isn't justifiable. It's NOT going to make a return unless we happen to get extremely lucky, which you frankly can't bank on.
Big Publishers can do this, as the success of their Core Franchises essentially covers the costs and risks … plus it's good PR for them.
Steam Hardware Surveys are unfortunately not an accurate representation of the Hardware Market, because they're using Global Figures.
As such the Oceania (APAC) region is approx. 95% NVIDIA Graphics and 98% Intel Processors... which, given they're the largest PC Audience (1.2 Billion), can give a skewed view on the Market as a whole.
Moreover we need to keep in mind that very few actually own their own PCs, but instead currently game at dedicated Internet Cafés … and I'd wager that Services such-as GeForce Now! have the potential to become popularised., but we'll see, given local publishers can make more from individual Café licenses and said Cafés are not going to want to compete with a Game Streaming Service like GeForce Now!
So we'll have to see what happens there... plus it gets difficult to really predict how such will affect Developer / Publisher Revenues.
I think I'd rather tackle that in its own Thread. While I do see Subscription Services (EA Access, Xbox Game Pass, etc.) as the future of Digital Retail, what this means from the business standpoint is … well, it's something I think needs to really be hammered out over the next few years to avoid the same mistakes of IAP / MTA that occurred in the Mid to Late 2000s, or in the modern incarnation, "Loot Boxes".
In any case we MUST ignore APAC as it's a "Special Case" Market., instead if we focus exclusively on North America and Europe.
Well we see a very different breakdown...
As it stands, Intel Graphics (and for argument's sake let's actually include all APUs, as most are below the "Minimum" Specification) accounts for approx. 64% of the Total Market, which is the figure I used in my previous post.
This leaves 36% of the Total Market split between NVIDIA and AMD Graphics.
What you might find interesting however is the Total Number of DirectX 12 Vs. DirectX 11 Hardware., remember that in essence we're talking about AMD Graphics from 2012 to 2018 and NVIDIA Graphics 2014 - 2018.
The Split here is 54% DirectX 12 Vs. 46% DirectX 11... and this is important, because we're talking about the 2018 Hardware Breakdown.
So we're down to 20% of the Total Graphics Market having a "Modern" Graphics Card.
If we further break this down into the AMD Vs. NVIDIA Split, well, we have 28% Vs. 72%... sure it's heavily skewed toward NVIDIA here, but keep in mind we're not finished yet; and bear in mind that we're down to 14% of the Total Market (for NVIDIA).
The last thing we need to do is break this down into Mainstream Vs. Enthusiast... (i.e. GTX 1030/1050/1060 Vs. 1070/1080/Titan) where we see a complete flip of 83% Vs. 18% … meaning that our Total Market for those who "Might" Upgrade to RTX 2070 or above., is 2.5%.
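The chain of splits above is easier to follow as a single multiplication: each step is a share of the previous one. The percentages (36%, 54%, 72%, 18%) are the ones quoted in the posts above; this is only a sketch of the arithmetic behind the claimed ~2.5% figure:

```python
# Reproducing the funnel above: each step multiplies the previous share.
share = 0.36                 # discrete GPUs at or above minimum spec
share *= 0.54                # DirectX 12-capable split -> ~20% of total
share *= 0.72                # NVIDIA's portion of DX12 cards -> ~14%
share *= 0.18                # enthusiast tier (GTX 1070 and up) -> ~2.5%

print(f"Potential RTX 2070+ upgrade market: {share:.1%}")
```

(Note that the quoted 83% Vs. 18% mainstream/enthusiast split sums to 101%, so these are clearly rounded figures; the final ~2.5% should be read as an estimate, not a precise value.)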
In blunt terms here... IF a Developer was already supporting Gameworks, then Supporting RTX Features might be worthwhile.
After all, you're already supporting said Ecosystem., but even then we don't know yet just how difficult it is to implement said Features and Functions.
And this is a major factor. Remember how DirectX 12 Capable Systems account for 54% of the Target Audience., but how many Developers are actually supporting DirectX 12? We have less than 2 Dozen Titles in the past 24 Months.
What's more, of those titles only 3 support Native DirectX 12 as opposed to DirectX 11-on-12... which is merely using the DirectX 12 API as a Foundation while, for all intents and purposes, the Engine itself is still DirectX 11. Few (if any) DirectX 12 Enhancements, Features or Performance Gains are actually being used; instead, support in said case is little more than a Checkbox Feature, like say supporting Anti-Aliasing via FXAA or SMAA as opposed to actually supporting the various Optimised, High-Performance and High-Quality Anti-Aliasing Methods, such-as TXAA, CMAA, etc.
I think this is why NVIDIA are focusing so heavily on DLSS (which is a Plug-and-Play Anti-Aliasing), because it's a way of claiming support, even if essentially it's just as Costless / Lightweight as FXAA. Careful manipulation of Performance Tables is a specialty of NVIDIA: they're almost certainly comparing to MSAA 4x or SSAA 2x, which are quite performance intensive, as opposed to FXAA, TXAA or CMAA, which are each quite lightweight with decent Image Quality. Whether DLSS produces a noticeable Image Quality improvement will be subjective (but right now we simply have no idea, as they're actually comparing it in terms of DLSS / AA-On / AA-Off, even though the performance charts compare the overhead of AA Vs. AA).
Still, if you're not using Gameworks... well adding it just for DLSS, which regardless if you're using anything else DOES come with a performance hit., not to mention the integration time; well where's the benefit for Developers for such a small Audience? While at the same time pissing off their existing potential audience?
It's a hard sell, and almost certainly why NVIDIA is "Partnering" (likely bribing via underwriting development costs, or free hardware) in order to get some game to even support DLSS at launch. God knows what they've given to the Studios that are supporting RTX Features like Ray-Tracing., because that almost certainly requires a substantial alteration of the Engine to support.
When Gamespy folded it hurt multiplayer for over 100 titles. Punkbuster has faded in favour of VAC and other security choices.
There are a lot of upcoming games for the PC so it might be an idea to get a better video card as some of the new shooters are likely to be demanding
Gamespy Services being shutdown was hardly sudden., and there were already a handful of emulation services available when it happened.
Punkbuster has NOT been replaced by VAC... primarily because VAC doesn't even perform the same task., it's merely an Automated Report and Reputation System; it doesn't prevent any Code Injectors / Hacks / Services.
Punkbuster is still heavily used by most FPS Games.
With this said... I don't see what that has to do with the Graphics Card.
Nor do I see how the number of PC Releases has anything to do with the Graphics Card either... as very few will actually require anything more than a Mainstream GPU to play at fairly high (if not maximum) Graphical Settings.
If you'd like to actually read (and pay attention) to the information presented within this thread instead of going off on non-sequitur arguments., then you might actually learn something.
As it stands you're just showcasing how misinformed you are.
Just curious as I own over 700 games and literally only a few use punkbuster that I know of. Even several that did have released patches removing it. When I look at PB setup it has less than 10 games in the pull down menu now. Is punkbuster using a different setup in the new titles you say use it? Can you name some modern AAA titles that use it? Just curious as I don't usually pay much attention to these things and kinda thought it's usage had almost dropped off to non-existence myself. I don't claim to be any expert at this and found your conversation interesting.
pokester wrote: Just curious as I own over 700 games and literally only a few use punkbuster that I know of. Even several that did have released patches removing it. When I look at PB setup it has less than 10 games in the pull down menu now. Is punkbuster using a different setup in the new titles you say use it? Can you name some modern AAA titles that use it? Just curious as I don't usually pay much attention to these things and kinda thought it's usage had almost dropped off to non-existence myself. I don't claim to be any expert at this and found your conversation interesting.
Punkbuster seems to have faded with the move to downloaded games. Steam and Origin have various checks to prevent tampering with games.
I have lots of CD and DVD games and a few bundle Punkbuster, but since then it has been replaced with Denuvo and other schemes.
Steam for Linux :: Introducing a new version of Steam Play. If you run Linux, can you please test it out?
Just out. Nvidia Turing RTX - Analysis and Performance Predictions - YouTube
Reckons 20-25% faster.
with mining down the crapper, prices should be softer for used video cards by now
and i need specific make/model before i start peeling off a bunch of franklins