Fake resolution is what it is.
And you know what, it does have one use for me. I do like me my 4K monitors, but some games are simply too much for that. And rendering them at lower resolutions almost NEVER works without completely breaking full screen or something else. DLSS, on the other hand, pretends to be 4K and everything works again.
Fake resolution has its place; the problem is when Nvidia pressures reviewers to put its cards running a fake resolution against other cards running native resolution on benchmark charts.
I’ll take fake resolution for higher framerates as long as it looks good enough! I play at 1440p though, because I don’t think mid- to high-end hardware is really there for 4k120+ yet.
i just wish it wasn’t the general direction the industry had decided to push things.
it’s become the expected norm. it’s the performance metric games are optimized to hit now, and it’s far from perfect.
i was just playing red dead 2 yesterday with dlss and i was legitimately struggling to do some things due to the artifacting. like there are some small missions and challenges that require you to find and shoot specific tiny birds with a bow, but dlss struggles with small things flying across a dynamic background. the birds would literally fade in and out of existence.
same thing with trying to snipe distant heads. the little red fatal zone indicator would ghost like hell and fade in and out.
like, it may be better than needing to drop your resolution, but it still kind of sucks sometimes.
95% of those issues would disappear if there was a rendering hint layer for games to use to mark which details need to be rendered in higher quality, so the game engine would ensure that important details don’t disappear.
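To make that idea a bit more concrete, here is a rough sketch of what such a hint layer could look like: the game flags small gameplay-critical screen regions, and the upscaler trusts the current frame more than accumulated history inside them. Every name in it (HintRegion, gameplay_critical, importance) is made up for illustration; no shipping upscaler exposes exactly this API.

```python
# Hypothetical sketch only: none of these names are a real DLSS/FSR API.
from dataclasses import dataclass

@dataclass
class HintRegion:
    x: int            # top-left corner in render-resolution pixels
    y: int
    width: int
    height: int
    importance: float  # 0.0 = don't care, 1.0 = must stay temporally stable

def collect_hints(visible_objects, render_w, render_h):
    """Game side: flag small but gameplay-critical objects
    (the tiny bird, the headshot marker)."""
    hints = []
    for obj in visible_objects:
        if obj.gameplay_critical:                       # assumed game-engine flag
            x, y, w, h = obj.screen_bounds(render_w, render_h)
            hints.append(HintRegion(x, y, w, h, importance=1.0))
    return hints

def history_weight_at(base_weight, hints, px, py):
    """Upscaler side: inside a hinted region, lean on the freshly rendered
    frame instead of accumulated history, so thin moving details don't fade."""
    for h in hints:
        if h.x <= px < h.x + h.width and h.y <= py < h.y + h.height:
            return base_weight * (1.0 - h.importance)
    return base_weight
```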
My 7900XT works reasonably well for 4K in most games - though admittedly I have to turn graphics down to Medium in a lot of cases to get 100-ish fps with upscaling and frame-gen on quality settings. Except Cyberpunk, which ran really well with high settings.
I’d guess in about 3 years it should be much better
My 3080 has trouble keeping 60 on newer stuff 😑
60 isn’t enough for me—I want 120+! But that’s personal preference hahaha
Rendering anything below native resolution is usually also blurry as hell, at least for me.
Things like FSR are the only thing that saves my 6-year-old 5700 XT from getting obliterated when using my 1440p monitor.
If you pick a resolution that requires fractional scaling (e.g. 1080p on your 1440p monitor) it’ll look real dogshit because it’s trying to represent one game pixel with about one and a third real ones along either direction. A resolution that uses integer scaling (e.g. 720p for your monitor) will just use two pixels in either direction to show one game pixel (so four real pixels all showing the same thing), so it’ll be more pixellated but much less blurry and gross. FSR is the better solution most of the time, but if you did want to go below native again, that’d make it a little less gross.
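For the curious, the math behind that is just the ratio between display and render resolution (assuming standard 16:9 modes): whole-number ratios map each game pixel onto a clean block of real pixels, anything else smears.

```python
# How a few common render resolutions map onto a display, assuming 16:9 modes.
# Integer ratios mean clean pixel doubling; fractional ones cause the blur.
MODES = {"720p": (1280, 720), "1080p": (1920, 1080),
         "1440p": (2560, 1440), "2160p": (3840, 2160)}

def scale_factor(render, display):
    rw, _ = MODES[render]
    dw, _ = MODES[display]
    factor = dw / rw            # same ratio vertically when aspect ratios match
    kind = "integer" if factor.is_integer() else "fractional"
    print(f"{render} on a {display} panel: {factor:.2f}x ({kind} scaling)")

scale_factor("1080p", "1440p")   # 1.33x -> fractional, blurry
scale_factor("720p",  "1440p")   # 2.00x -> integer, just chunky
scale_factor("1080p", "2160p")   # 2.00x -> integer, the 4K laptop case below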
So what should I downscale 4k to? 4k annoys the shit out of me on my laptop because it’s pointless at that size display
4K would go to 1080p for best results (for 3840x2160 screens rather than true 4K, but I’m assuming that’s what you’ve got), and should be much more playable on laptop hardware that way.
Edit: oops didn’t see Beryl already answered this lol
You should go 1080p