Triple-A games are often poorly optimized. Paying more than $200 US for an 8GB graphics card is also far too much. Both things can be true at the same time, so why defend either one? Moreover, pushing for more VRAM doesn't equal defending poorly optimized games.
Comments
Also, while many of the games in question were poorly optimized at release, their VRAM use wasn't the issue. For example, S.T.A.L.K.E.R. 2: Heart of Chornobyl using 9GB of VRAM at the Epic settings isn't an optimization issue.
Rather, the issue is the amount of time game developers have to waste trying to get their games to work at high/ultra/epic-type settings on 8GB cards that should have been phased out of the mid-range five years ago.
He is right: if consoles, as the lowest common denominator, have 13+ GB of VRAM, then PCs should start at 13+ GB, not half that or whatever else Nvidia and AMD are pushing.
I agree. The entry point of the "mid tier" (i.e., the Nvidia xx60 or the AMD x600) should have no less VRAM than the consoles have access to. Anything less means gamers shouldn't bother buying a card and should just buy a console instead.
The crazy part is that while 12GB might be good enough for the 5060 or 8600 in 2025, the next generation of consoles will be competing with the 6060 and 9600. A PS6 with 20GB of memory accessible for graphics isn't crazy, so anyone who buys a 5080 with only 16GB is going to have a bad time.
Speaking as a game dev (although pretty far from AAA), memory optimization is the trickiest type of optimization to get right, and there are often no huge gains to be had.
No hate, but I gotta blame Digital Foundry for dedicating all of last year to defending 8GB GPUs when there was so much anger directed at Nvidia's stingy VRAM policy.
Also, IMO people should stop smearing games as unoptimized for using more than 8GB. There is nothing wrong with that. You can't expect devs to dedicate an inordinate amount of time "optimizing" games for GPUs that are dead and buried.
Also, they now kinda criticize it in their DF Directs, but it's really mild. I'm not saying they're paid or anything, I'm not an idiot, but they clearly have a pro-Nvidia bias. Look at their 4060 Ti review: it's one of their most disliked videos, all caused by going super easy on a really bad product.
Ah, I completely see where you're coming from in that case. I do agree with you, but I think they have to work to keep a balance, since they basically need to stay on good terms with the companies they work with. It's a difficult, delicate balance that they don't always get right!
It's not like they defended it directly, but they were "diplomatic" about it, and Alex often says in his videos that "the texture quality is really good and it doesn't stutter even on 8GB," when some of those games had issues loading textures even if they didn't stutter per se.
I was about to say this. Don't get me wrong, I love the fellas and the work they do over at DF, but I cringe so much at them dancing around the 8GB issue until it became clear it's a big problem. To be fair, at least they now (kinda) acknowledge it in their DF Directs when talking about the 4060.
For me the biggest smoking gun is Dragon Age: The Veilguard. They titled their video "simply brilliant on PC" and praised everything about the game. I'm also playing it, and it is beautifully optimized, but when it came to texture quality? Oh shit, I have to use medium textures on 8GB GPUs? Damn.
Alex doesn't even mention how a game that otherwise runs great on a 7600/4060 (Ti) has to use medium textures to avoid spilling over the VRAM buffer. This is the issue we've been warning about; it's not "the bad ports." DA is a great game made for PC, and if the 4060 had 12/16GB you could play with ultra textures.
Texture quality is one of those settings that, at ultra, makes your game look much better and costs 0% performance if you just have enough VRAM; people defending this is beyond me. AMD is usually better at this than Nvidia, but the 8GB 7600 was also a mistake.
Totally agree. It's more about pointlessly bottlenecking cards. This is what's crazy to me: AMD clearly loads up its cards with VRAM at minimal additional cost, while Nvidia practically holds VRAM hostage as a way to upsell a higher class of card.
It seems like these days implementing DLSS or FSR counts as enough optimization, but I blame publishers, not devs. Publishers keep pushing for releases of unfinished games. On the other hand, you have greedy companies limiting lower-end GPUs to 8GB. It seems like the system is designed to punish consumers.
There's no reason midrange cards should have less VRAM than the PS5/Series X, which have about 12-13GB usable for graphics. The 3070 would still be a decent card today if it had shipped with 12 or 16GB.
It's insane. Even accounting for game requirements and manufacturing advances plateauing a bit, it's still pathetic. The 1070 had 8GB; we've only gotten 50% more on the x70 tier in FOUR generations??
To be honest, I'm with electrical zebra here: why can't devs optimize games for my X1300? Why do I feel the need to upgrade with every new release? I find it so annoying, and I've been thinking about quitting all this lately.