Y'know, this reminds me that there's innovation in limitation: limited hardware forces programmers to come up with creative, sometimes revolutionary, ways to save memory or processing power
One funny thing is that running well in the future depends on accurate predictions. Crysis really does still run like shit even on a lot of modern machines because the devs assumed CPUs would keep getting faster at the same rate
The "Crytek assumed single-threaded performance would keep improving" thing is a myth. Consumer-grade dual-core processors started showing up a year before Crysis launched, and modern CPUs are miles ahead even in single-threaded performance.
The problem with Crysis is everything was done in Lua.
Nope! Just plain Lua for core systems like AI. Crytek didn't build for future hardware, they built for trailers. The graphics were cutting-edge but poorly optimized. The AI was mechanically impressive but bogged everything down.
It played great for ads and ran like ass. People bought into the PR.
I recall reading the Sonic devs want to support older consoles as long as possible. Not sure how their testing process works, but I wonder if they still often test on original PS4s and Switches to make sure their games are actually optimized for them
Well, this blew up. I have nothing to promote so check out my cool artist friends (heads up: NSFW stuff)
@molochtavros.bsky.social
@berryscrawls.bsky.social
@magewizards.bsky.social
@frogtoa.bsky.social
@txddy3d.bsky.social
@quietscrappy.bsky.social
@usuallygloomy.bsky.social
The saddest thing is that it's 2025 and I'm still the guy with the crappy laptop, 4GB of RAM and an integrated graphics card, and even on Linux, it's really painful
I tried playing the native Linux versions of some games when I tried out Pop!_OS for a bit in 2019, and they ran so much worse than the Windows versions. They were pretty much unplayable. And I have 8GB of RAM in my laptop, and not the best CPU or GPU either
Funny how it goes back and forth. As a kid? Nobody ever had good enough hardware to run games at full. 10 years ago? Half the industry was still targeting old consoles, so only a handful of games would really abuse your hardware.
Now? Even a new midrange card can't run new AAA games at full.
Digital Extremes does something close to this: they have a really old computer on which they test-run the game's new content at the lowest settings possible, and if that old-ass computer can't run it well, they go back and keep working on the new content until it does.
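A workflow like the one described above amounts to a pass/fail performance gate on a min-spec machine. A minimal sketch of that idea (the function name, target frame rates, and the 1%-low tolerance are all invented for illustration, not Digital Extremes' actual process):

```python
# Hypothetical min-spec gate: content only ships if the old test
# machine holds the target frame rate on lowest settings.
MIN_SPEC_TARGET_FPS = 30.0

def passes_min_spec(avg_fps: float, one_percent_low_fps: float) -> bool:
    """Return True if average FPS meets the target and the 1% lows
    stay within 80% of it (so stutter also fails the gate)."""
    return (avg_fps >= MIN_SPEC_TARGET_FPS
            and one_percent_low_fps >= MIN_SPEC_TARGET_FPS * 0.8)

print(passes_min_spec(45.0, 28.0))  # True: smooth enough on old hardware
print(passes_min_spec(32.0, 20.0))  # False: average is fine, lows are not
```

Gating on 1% lows as well as the average is the interesting design choice here: a build can average 60 FPS and still feel terrible if it hitches, which an average-only check would miss.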
I’ll take the job.
I remember the excitement when writing RPG II on an IBM System/36 when it got a main memory upgrade from 512KB to 1MB. Ran MAPICS for 20+ users on that 😀
I'm that guy, please hire me XD. I have a pretty good laptop (for one made in 2016), but most games now I simply can't run because the optimization is SHIT
As someone who has survived on computers people threw out when they got new ones or donated to thrift stores most of my life, and has been using computers since Windows 95 and seen the completely unnecessary software bloat happen, I approve of this proposal.
I'd take this job in a heartbeat, not only because I'd have no problem bitching them the hell out, but because I'm the type of person who has spent hours on lemons doing everything outside the box to make programs run. They'd have no excuse, because I'd have an extremely detailed log book.
The funny thing is that modern games run like shit so much that they have to use AI to generate three extra frames in between each rendered frame.
So, 3 out of 4 frames are kinda fake now.
Someone said "for those memeing about playing ai games, no, we're playing ai games now"
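The "3 out of 4 frames are fake" arithmetic above can be sketched directly (a toy illustration of 3x frame generation in general, not any specific vendor's implementation; function names are invented):

```python
# With frame generation, each rendered frame is followed by N
# AI-generated (interpolated) frames before the next rendered one.
def effective_fps(rendered_fps: int, generated_per_rendered: int = 3) -> int:
    """Total displayed frames per second."""
    return rendered_fps * (1 + generated_per_rendered)

def rendered_fraction(generated_per_rendered: int = 3) -> float:
    """Fraction of displayed frames that were actually rendered."""
    return 1 / (1 + generated_per_rendered)

print(effective_fps(30))    # 30 rendered -> 120 displayed
print(rendered_fraction())  # 0.25: only 1 in 4 frames is rendered
```

So a game rendering 30 real frames per second can advertise 120 FPS while three quarters of what you see was never rendered by the game engine.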
This reminds me of the time I was QA testing at EA and on one game I tested on a PC that was well below the min spec for the final shipped game. It was a painful time to enter bugs into the database.
considering the lifetime of most video game consoles is about 10 years, i think making sure your game works on at least a 10-year-old system should be mandatory
Being the guy with 4GB of RAM and a rural internet connection is literally me, and I can tell that nobody in this industry knows how to optimize anymore (except the SM64CoopDX group, kinda)
If this had started in 1999, we would have been spared SO MUCH PAIN, especially from developers/PMs who jump into thriving projects, take a shit in front of everyone, demand that they smell deeply, and then prance off to the next company when the previous one implodes.
i love Scarlet & Violet but i doubt switch 6 or whatever will run them (or let them run smoothly)
I guess I'm showing my age by remembering when the big selling point of consoles was that they "just work" 🤣
Good.
(If you don't have a QA dept, you are very dumb and deserve everything you get.)
Agree with everything and we should turn it into a law.