• Opisek@lemmy.world · 3 days ago

    Fake resolution is what it is.

    And you know what, it does have one use for me. I do like my 4K monitors, but some games are simply too demanding for them. Rendering at lower resolutions almost NEVER works without completely breaking full screen or something else. DLSS, on the other hand, pretends to be 4K and everything works again.

    • zurohki@aussie.zone · 3 days ago

      Fake resolution has its place; the problem is when Nvidia pressures reviewers to put its cards running a fake resolution against other cards running native resolution on benchmark charts.

    • Rai@lemmy.dbzer0.com · 3 days ago

      I’ll take fake resolution for higher framerates as long as it looks good enough! I play at 1440p though, because I don’t think mid-to-high-end hardware is really there for 4K 120+ yet.

      • thedirtyknapkin@lemmy.world · 3 days ago

        I just wish it wasn’t the general direction the industry had decided to push things.

        It’s become the expected norm. It’s the performance metric games are optimized to hit now, and it’s far from perfect.

        I was just playing Red Dead 2 yesterday with DLSS, and I was legitimately struggling with some things due to the artifacting. There are some small missions and challenges that require you to find and shoot specific tiny birds with a bow, but DLSS struggles with small things flying across a dynamic background. The birds would literally fade in and out of existence.

        Same thing with trying to snipe distant heads. The little red fatal-zone indicator would ghost like hell and fade in and out.

        It may be better than needing to drop your resolution, but it still kind of sucks sometimes.

        • Natanael@infosec.pub · 2 days ago

          95% of those issues would disappear if there were a rendering-hint layer for games to mark which details need to be rendered at higher quality, so the engine could ensure that important details don’t disappear.

      • Zaphod@discuss.tchncs.de · 3 days ago

        My 7900 XT works reasonably well for 4K in most games, though admittedly I have to turn graphics down to Medium in a lot of cases to get 100-ish fps with upscaling and frame-gen on quality settings. The exception is Cyberpunk, which ran really well on high settings.

        I’d guess in about 3 years it should be much better.

    • Sustolic@lemmy.world · 3 days ago

      Rendering anything below native resolution is usually also blurry as hell, at least for me.

      Things like FSR are the only thing saving my 6-year-old 5700 XT from getting obliterated on my 1440p monitor.

      • felsiq@lemmy.zip · 3 days ago

        If you pick a resolution that requires fractional scaling (e.g. 1080p on your 1440p monitor), it’ll look real dogshit because it’s trying to represent one game pixel with about one and a half real ones along each axis. A resolution that uses integer scaling (i.e. 720p for your monitor) will just use two pixels in each direction to show one game pixel (four physical pixels all showing the same thing), so it’ll be more pixelated but much less blurry and gross. FSR is the better solution most of the time, but if you did want to go below native again, that’d make it a little less gross.
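
        The scaling arithmetic above can be sketched in a few lines (a hypothetical helper, not anything from the thread; the function name is made up for illustration):

```python
# Sketch of the fractional- vs integer-scaling arithmetic described above:
# how many physical pixels represent one game pixel along one axis.
# scale_factor is a hypothetical name, not a real API.

def scale_factor(native_px: int, render_px: int) -> float:
    """Physical pixels per game pixel along one axis."""
    return native_px / render_px

# 1080p rendered on a 1440p monitor: fractional (~1.33), so blurry
print(scale_factor(1440, 1080))

# 720p on a 1440p monitor: integer (exactly 2.0), sharp but pixelated
print(scale_factor(1440, 720))

# 1080p on a 2160p ("4K") monitor: integer again (exactly 2.0)
print(scale_factor(2160, 1080))
```

        A whole-number result means each game pixel maps cleanly onto a square block of physical pixels; a fractional result forces the display or GPU to interpolate, which is where the blur comes from.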

          • felsiq@lemmy.zip · edited · 2 days ago

            4K would go to 1080p for the best results (for 3840x2160 screens rather than true 4K, but I’m assuming that’s what you’ve got), and it should be much more playable on laptop hardware that way.
            Edit: oops, didn’t see Beryl already answered this lol