• Rai@lemmy.dbzer0.com · 4 days ago

    I’ll take fake resolution in exchange for better framerates as long as it looks good enough! I play at 1440p though, because I don’t think mid-to-high-end hardware is really there for 4K120+ yet.

    • thedirtyknapkin@lemmy.world · 4 days ago

      i just wish it wasn’t the general direction the industry had decided to push things.

      it’s become the expected norm. it’s the performance metric games are optimized to hit now, and it’s far from perfect.

      i was just playing red dead 2 yesterday with dlss and i was legitimately struggling to do some things due to the artifacting. like there are some small missions and challenges that require you to find and shoot specific tiny birds with a bow, but dlss struggles with small things flying across a dynamic background. the birds would literally fade in and out of existence.

      same thing with trying to snipe distant heads. the little red fatal zone indicator would ghost like hell and fade in and out.

      like, it may be better than needing to drop your resolution, but it still kind of sucks sometimes.

      • Natanael@infosec.pub · 3 days ago

        95% of those issues would disappear if there were a rendering hint layer games could use to mark which details need to be rendered at higher quality, so the engine could ensure that important details don’t disappear.
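
        As a rough illustration of the idea, here’s a minimal C++ sketch of what a hint layer like that could look like. This is purely hypothetical: the names (ImportanceRegion, UpscalerHintLayer, markImportant) are made up for this comment and aren’t part of DLSS, FSR, or any real engine API.

        ```cpp
        // Hypothetical sketch only: not a real DLSS/FSR/engine API.
        // The idea is a per-frame list of screen regions the game marks as
        // gameplay-critical (tiny birds, distant headshot indicators, ...)
        // so the upscaler can bias those regions toward full-quality samples
        // instead of letting them fade out of temporal accumulation.

        #include <vector>

        // One screen-space rectangle the game considers important this frame.
        struct ImportanceRegion {
            float x, y, width, height;  // normalized [0, 1] screen coordinates
            float weight;               // 0 = normal, 1 = always resolve at full quality
        };

        // Hint layer the engine would hand to the upscaler every frame.
        class UpscalerHintLayer {
        public:
            void beginFrame() { regions_.clear(); }

            // Called by gameplay code for each small-but-important object,
            // e.g. the bounding box of a flying bird or a fatal-hit marker.
            void markImportant(float x, float y, float w, float h, float weight = 1.0f) {
                regions_.push_back({x, y, w, h, weight});
            }

            // The upscaler reads this list and tightens sample rejection /
            // history blending inside these regions so the detail can't vanish.
            const std::vector<ImportanceRegion>& regions() const { return regions_; }

        private:
            std::vector<ImportanceRegion> regions_;
        };
        ```

        The closest existing thing I’m aware of is FSR 2’s reactive and transparency masks, but those are per-pixel hints about particles and transparent content, not a way for the game to say “this object matters for gameplay, don’t let it disappear.”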

    • Zaphod@discuss.tchncs.de · 4 days ago

      My 7900 XT works reasonably well for 4K in most games, though admittedly I have to turn graphics down to Medium in a lot of cases to get 100-ish fps with upscaling and frame gen on quality settings. The exception is Cyberpunk, which ran really well on High settings.

      I’d guess that in about 3 years it should be much better.