Hello AMD community,
As we continue to see advancements in gaming technology, it is natural to question what future gaming requirements might look like. One topic that comes up frequently is the necessity of having 20GB VRAM or more for gaming purposes. While some high-end graphics cards currently offer this amount of VRAM, is it really necessary for future gaming?
I am curious to hear your thoughts on this matter. Do you think that 20GB VRAM will become a necessary requirement for gaming in the future, or do you believe that 20/24GB VRAM is overkill? Are there any particular factors or developments in the gaming industry that might make 20GB VRAM more necessary in the future?
Personally, I am on the fence about this topic and would love to hear some insights from others in the AMD community. Let's start a discussion and see where it takes us!
Oh yeah, I remember Hawaii and GCN 2, I think. That's the year I joined AMD. Do you remember the "Nano"? By the way, I wanna share a quick (funny) story with you. At the time, one of our senior managers had an idea for a launch event/tech day. The idea was to have said event inside a dormant volcano in Hawaii... seriously! Well, he didn't like me much because I made three points to him that killed his idea right away: 1) Safety 2) Logistics/Tech 3) Ventilation/Heat!
And you know, I'm confident in our R&D, Engineering, and Product Mktg teams; over time, they have proven to me (your everyday gamer) that they do listen to us, our customers' "demand".
Hey @Sam_AMD!
I'm looking at it like this: if AM4 runs like a champ at 2x8GB and AM5 right now seems to favor 2x16GB, I can see that going up to 2x24GB or 2x32GB by the end of its life cycle. AM6 will probably start there, and we'll be around 2x64GB by the end of its life cycle. I can also understand if install rates for Zen4 are slow right now, given the economy and a lot of people opting to buy groceries over CPUs. No slam against AMD, but they might have shot themselves in the foot a bit with the 5800X3D. That thing is going to be a gaming stalwart for years, and guys like me who are late adopters are going to gun for that over the cost of an AM5 platform upgrade, even if the AM5 X3D chips are that much better. I know that's going to be my plan. Psst: maybe tell your guy in Marketing that there is some interest in a 6-core X3D variant if it's priced right. We know it can be done (the 7900X3D only has the 3D V-Cache on 6 cores!)
On the GPU side... We're already at the 8GB VRAM floor. Look at the flak the Evil Green Monster is taking over their $600 "budget" card only having 8GB! I also think that we're going to see budget discrete GPUs go the way of the dodo in the next product cycle or two, with the way the new laptop APUs are being shown to perform. I saw some news from one of the TechTubers showing off one of the new Zen4 gaming laptops (I think it was the 7840U), and I reckon that if they can nail the power delivery for a bigger desktop application, they'll have a winner in the budget gaming market.
At the same time, let's not forget that some of us still like having that big GPU sitting in the case. There's always going to be a place for a GPU in a PCIe slot. And don't let anyone over there forget about Moore's Law. Some of us still hold that to be the Holy Gospel of PC gaming tech.
I have a 5800X3D, and I'm loving it. And, yes, I agree, this CPU will be "the" gaming CPU for a while. I remember when my friend Robert Hallock (former director of the Ryzen team) gave me a sneak peek at the 3D V-Cache "intro" video and how our engineers figured out how to add more memory onto the CPU. It all made sense to me.
I think the future of the consumer segment is going to stay as it is for a little while longer. Macroeconomics, inflation, COVID-19, employment, the "Fed," and the upcoming elections are all going to impact how we "play" games. Know what I mean?
@Sam_AMD wrote: I have a 5800X3D, and I'm loving it. And, yes, I agree, this CPU will be "the" gaming CPU for a while. I remember when my friend Robert Hallock (former director of the Ryzen team) gave me a sneak peek at the 3D V-Cache "intro" video and how our engineers figured out how to add more memory onto the CPU. It all made sense to me.
Had I gotten my hands on a lower-end GPU to replace my RX570, I would have been looking at a Zen4 X3D as my next upgrade, complete with a lower-midrange GPU. With that 6800XT you and the team so graciously hooked me up with, I might actually sit tight on Zen3 for a while longer and get a 5800X3D for myself. For the games I play, it'll be OPAF (I'm sure you can figure out that acronym) for a couple of years yet. The AAA titles I'm playing are generally multiplayer shooters, so they're set up for people to run them on a Potato, i.e., Zen1 or Intel 6th Gen in either a 4- or 6-core setup and a GPU with 6GB of VRAM for 1080p60. Heck, I ran Halo Infinite multiplayer for the first 8 months I played it on an FX-8120 (not a supported CPU according to Microsoft) and the venerable RX570 4GB. It absolutely struggled, but it ran.
@Sam_AMD wrote: I think the future of the consumer segment is going to stay as it is for a little while longer. Macroeconomics, inflation, COVID-19, employment, the "Fed," and the upcoming elections are all going to impact how we "play" games. Know what I mean?
You're not wrong at all. I'm chomping at the bit for our elections in 2025 here in Canada. The way things are currently playing out, we're all going to be broke before then. My Opa said it best: "When it comes to politics, they're all trying to screw you. Vote for the one that will screw you the least."
Could there be some kind of bottleneck in PC gaming rendering with the upcoming Unreal Engine 5 games?
Nah, not unless you use your rig for productivity purposes, like video editing, Photoshop, and other production-related usage; then, yeah, 64GB should be the minimum. I've played a good variety of games over the years, and recently played the COD MWII campaign, DSR... I recall seeing my system RAM going up to about 15GB, which is the highest I've seen thus far.
So, my conclusion is: while 16GB is still good, you're gonna need more sooner or later. Hence my decision to bite the bullet and go for 32GB for my two gaming rigs. Might as well get it over and done with; that's my motto.
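For anyone who wants to watch that number themselves rather than eyeball Task Manager, here's a minimal Python sketch that logs peak system RAM while you play. It assumes the psutil package is installed (pip install psutil), and the five-second interval is just my arbitrary choice:

```python
# Minimal RAM logger: prints current and peak system memory use
# until you stop it with Ctrl+C. Assumes psutil is installed.
import time
import psutil

peak = 0
try:
    while True:
        used = psutil.virtual_memory().used  # bytes currently in use
        peak = max(peak, used)
        print(f"RAM in use: {used / 2**30:.1f} GiB (peak: {peak / 2**30:.1f} GiB)")
        time.sleep(5)  # sample every 5 seconds
except KeyboardInterrupt:
    print(f"Peak RAM observed: {peak / 2**30:.1f} GiB")
```

Start it in a terminal before launching the game, then check the peak when you're done.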
VRAM, unfortunately, is NOT something that can be easily augmented (unless you've got mad skills like the guy who modded his RTX 3070 from 8GB to 16GB); unlike system RAM, you're stuck with whatever the card has. Unfortunately for our nVidia brethren, nVidia chose to ignore game devs' pleas for more VRAM and stuck with 8GB even on their relatively powerful cards, like the RTX 3070 series.
I was surprised that in the MLID vid, the game generalist said they had wanted more VRAM, which would make their work easier, but nVidia, being the industry leader, chose the route of less VRAM to ensure planned early obsolescence. Hogwarts Legacy and RE4R look especially bad after their patches: yes, those games do NOT stutter heavily any more due to insufficient VRAM and spillage, but the textures on objects popping in and out, as the 8GB of VRAM is filled, swapped, and filled again, are just bad. And it even happens when your character is standing still. Like I said, I feel for those who paid top dollar for their 8GB cards, only to see performance fall off like that.
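To put some rough numbers on why 8GB fills up so quickly (my own back-of-the-envelope illustration, not figures from the podcast), here's the basic texture-memory arithmetic in Python; the material count at the end is an assumption for the sake of the example:

```python
# Approximate VRAM footprint of one texture: width x height x bytes per
# texel, plus roughly 1/3 extra for the full mip chain.
def texture_mib(width, height, bytes_per_texel, with_mips=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if with_mips else base
    return total / 2**20  # bytes -> MiB

print(texture_mib(4096, 4096, 4))  # uncompressed RGBA8 4K texture: ~85 MiB
print(texture_mib(4096, 4096, 1))  # BC7-compressed (1 byte/texel): ~21 MiB

# Even compressed, a few hundred unique 4K materials adds up fast:
print(300 * texture_mib(4096, 4096, 1) / 1024)  # ~6.2 GiB, before render
# targets, geometry, and the rest of the working set are even counted.
```

Once that working set no longer fits, the driver starts shuffling textures over PCIe mid-frame, and that's exactly the pop-in people are describing.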
I had an amusing debate with an nVidia fanboy (in another forum, obviously). He bragged that nVidia's 8GB cards have the more expensive GDDR6X, while "cheap" AMD uses GDDR6 RAM modules. He said that GDDR6X was faster, but I pointed out that having less of a faster RAM is useless when it's insufficient, and that having more of a slower RAM is way better when games require more VRAM, as exemplified by Hogwarts Legacy, TLOU Pt1, RE4R, and other newer games.
I told him to look at the RTX 3070 vs RX 6800 comparisons at HUB, or from Daniel Owen. I likened it to the good ole i5 2500K vs i7 2600K days: the former (being just 4C/4T) overclocked better than its i7 (4C/8T) sibling, that's true. But when games required more threads, the i5 would lose out because, no matter how high the OC, it simply lacked the thread count, handing the win to its lower-clocked i7 sibling.
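To put rough numbers on the bandwidth side of that argument, here's the standard peak-bandwidth arithmetic; the per-pin rates and bus widths below are the published specs, and I picked the RTX 3070 Ti as the 8GB GDDR6X example:

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gbs(19, 256))  # RTX 3070 Ti, 8GB GDDR6X: 608 GB/s
print(bandwidth_gbs(16, 256))  # RX 6800, 16GB GDDR6:     512 GB/s
print(bandwidth_gbs(14, 256))  # RTX 3070, 8GB GDDR6:     448 GB/s
```

So the GDDR6X card does have roughly 19% more bandwidth than the RX 6800, but the moment the working set spills past 8GB, textures come across PCIe 4.0 x16 at about 32 GB/s, more than an order of magnitude slower than any of those figures. Capacity beats a faster but insufficient pool.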
I don't recall him rebutting my argument on this; obviously he can't, because there's empirical evidence behind it. It's weird how mere, obvious facts can stun fanboys into complete avoidance and silence on the subject.
I have a 7900 XTX, and I am using 3440x1440.
I am already hitting 16GB of VRAM usage pretty often, so 20GB is definitely great; the solid performance and specs of the 7900 XTX (24GB/384-bit) are what instantly brought me here to Team Red from Team Green. Without any question, the 4080 is just a completely broken product at the high-end tier; in my opinion, even at the same price, its memory capacity and bus are nothing more than a greedy middle finger shown to its buyer. 20+ gigs of VRAM is yum, yum.
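If anyone wants to spot-check their own VRAM usage the same way, here's a small sketch for Linux with the amdgpu driver, which exposes the counters in sysfs. The card0 path is an assumption (your card may enumerate differently), and on Windows the Adrenalin overlay or Task Manager shows the same thing:

```python
# Read current and total VRAM from the amdgpu sysfs counters (values in bytes).
# Assumes Linux with the amdgpu driver and that your GPU is card0.
VRAM_USED = "/sys/class/drm/card0/device/mem_info_vram_used"
VRAM_TOTAL = "/sys/class/drm/card0/device/mem_info_vram_total"

def read_bytes(path: str) -> int:
    with open(path) as f:
        return int(f.read().strip())

used = read_bytes(VRAM_USED) / 2**30
total = read_bytes(VRAM_TOTAL) / 2**30
print(f"VRAM: {used:.1f} / {total:.1f} GiB")
```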
Sadly, when someone checks out new productions or ports, they could ask themselves whether 50 gigs will be enough before 2030... A sad reality to live in.
This has its positive sides too: these "monster truck" races will speed up technologies like VR.
The YouTube channel Moore's Law Is Dead did an interesting podcast that covered VRAM, PC vs console game development, and whatnot. Look up Broken Silicon episode 201 if you have a couple of hours to kill.
I actually just watched that podcast with the Call of Duty graphics designer and Tom. It was almost like listening to myself, pretty funny :). However, many people thought 8 or 12GB was more than enough, like, forever, for some weird reason. I really don't know who spread this "VRAM doesn't matter" opinion, but it obviously worked pretty well. We actually need more developers talking not only about "optimizations" of their games, but also about what they really need from gaming hardware; otherwise it's very misleading.