I have some big concerns about this DLSS multi-frame generation stuff. As I understand it, it generates multiple frames between two 'traditionally' rendered frames.
User input will only apply to the rendered frames, so while I'm drawing 120fps, my input might only be sampled at 30 or less?!
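The arithmetic behind that worry can be sketched in a few lines. This is a toy model, not anything from NVIDIA's docs: it just assumes every rendered frame is followed by N generated frames (DLSS 4's multi-frame generation advertises up to 3), and that fresh input only lands on rendered frames.

```python
def input_sample_rate(display_fps: float, generated_per_rendered: int) -> float:
    """Effective input sampling rate under interpolation-style frame generation.

    Only 1 out of every (N + 1) displayed frames is a 'real' rendered frame
    that reflects fresh user input; the rest are generated in between.
    """
    return display_fps / (generated_per_rendered + 1)

# 120fps on screen with 3 generated frames per rendered frame:
print(input_sample_rate(120, 3))  # → 30.0
```

So under this (simplified) model, the 120fps counter really does hide a 30Hz input loop, which is exactly the complaint above.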
Comments
Digital Foundry measured 60ms of latency in Cyberpunk, which I assume was running at over 120fps.
Watching the Reflex video annoyed me because they invented some new measure called "PC latency" and somehow claim there's only 1ms of latency now. I'm assuming that means latency added by the engine?
I wouldn't put deceptive marketing past them.
DLSS framegen already has bad input latency issues and now they just want to make that even worse.
I think people forgot games actually need to be played :/
nVidia is kinda forcing it by creating an FPS arms race. The bigger number is an entirely meaningless metric on its own, but consumers like bigger numbers, and then they complain on Reddit that the game feels like shit.
Trying to change that after the fact is an immense amount of work and can break future integrations.
So even lightweight titles are laggy. It is what it is.
My point is that Nvidia cares about AI over anything else.
https://www.theverge.com/2024/8/20/24224270/nvidia-g-sync-module-mediatek-partnership
And yeah, for fast-paced competitive shooters or fighting games the lower latency makes a big difference.
It's not just visual, it's more about the feel