• IndustryStandard@lemmy.world · ↑19 · 1 day ago

    “I remember 14 years ago when my GPU used to draw almost 400 watts. Crazy right? Anyways, how is GPU power consumption these days?”

    • Lv_InSaNe_vL@lemmy.world · ↑3 · 16 hours ago

      “I budgeted about $500 for my GPU, that should be able to get me a high end card right?”

      (That’s like $750 today, adjusted for inflation, btw)

  • AoxoMoxoA@lemmy.world · ↑72 · 2 days ago

    A buddy of mine was locked up from ’03 to ’17. He was asking me questions like “Do you have a PlayStation 3? What kind of phone do you have?” …

    He said " man I know I missed a lot but people are so rude now. I was talking to my cousin and instead of talking to me he was looking at his phone. That is disrespectful." I said yeah man the world changed a lot. Felt terrible for him trying to integrate back into this bull shit.

    He went away for the craziest shift in society I could imagine.

    • ivanafterall ☑️@lemmy.world · ↑16 · 1 day ago

      I miss 2003. So many bangers from that year. Ignition by R. Kelly. Picture by Kid Rock/Sheryl Crow. P. Diddy’s party anthems Shake Ya Tailfeather and Bump, Bump, Bump. You could tune into The Apprentice to learn about business and enjoy Donald Trump’s timeless one-liners, or The West Wing to learn about the American presidency, maybe a little Chappelle’s Show for some laughs. Apparently it was also the first year we could all go hop on 4chan and Google Adsense for the first time. Anyway, it kinda makes you wonder what all those folks are up to now. I hope they’re well.

      • ZeffSyde@lemmy.world · ↑17 ↓1 · 1 day ago

        You just reminded me of why Y2K era nostalgia makes me ill.

        I was working part-time in a mall and heard all this shit on repeat, and my co-workers were quoting Chappelle’s Show because it was okay to be racist if a black guy said it first.

        • SynopsisTantilize@lemm.ee · ↑11 · 1 day ago

          You’re being downvoted because you’re correct. The culture in 2000s America was trashy at best. The CIA psyop (Project Mockingbird) was in full effect, everyone was dancing to the rhythm of the patriotic drum, and we were all asleep at the wheel.

      • Noxy@pawb.social · ↑2 · 20 hours ago

        I was a roller skating rink DJ when Shake Ya Tailfeather came out. It had the place so hyped up that I had security tell me to cut the song off before it finished. People were jumping up and dancing on tables and shit. It was wild. That song was definitely a banger.

  • Psythik@lemm.ee · ↑25 · 2 days ago

    I couldn’t even imagine what seeing PC games for the first time in 2025 feels like, after not seeing them since 2011.

    Do you think they were blown away? Or maybe disappointed that we still don’t have photorealistic graphics yet? I wish I could speak with this person so I could pick their brain.

    • Blackmist@feddit.uk · ↑15 · 1 day ago

      Arkham City, Crysis 2, Skyrim. It really hasn’t changed much. They’ve spent most of their time wanking over higher resolution and nicer reflections.

      For comparison, there were 14 years between this: [image] and this: [image]

    • uniquethrowagay@feddit.org · ↑14 · 1 day ago

      Honestly, the jump from 2011 to 2025 doesn’t seem nearly as steep as, say, 2000 to 2011. Sure, games look better today, but 2011 games still hold up. In 2000, 3D graphics were still new, and most titles from back then are considered unplayable now in terms of graphics and controls.

      • Dicska@lemmy.world · ↑3 · 23 hours ago

        And 3D was the “AI” of those times. They had to bring it to EVERYTHING. Micro Machines (a top-down toy car racer)? We’ll make it 3D, buy a card. Frogger? Yup, Frogger 3D. They even tried to force 3D on poor Worms in 2003. I still prefer Worms World Party/Armageddon.

    • Noodle07@lemmy.world · ↑31 ↓1 · 2 days ago

      Dude, we’re still playing Classic WoW and RuneScape, that guy hasn’t missed anything.

      • Psythik@lemm.ee · ↑6 · 2 days ago

        Fair, but I’m mostly interested in how they feel about modern AAA games, with their path tracing and HDR support and whatnot.

        • squaresinger@lemmy.world · ↑1 · 24 hours ago

          Tbh, I haven’t done time, but that’s still me.

          I upgraded from an old laptop to a 4070. I tried HDR and I don’t see a difference at all. I turned off all the lights, closed the blinds, and turned the (HDR-compatible, I checked) screen to max brightness. I don’t see a difference with HDR turned on or off.

          Next I tried path tracing. I could see a difference, but honestly, not much at all. Not nearly enough to warrant reduced FPS and certainly not enough to turn down other graphics settings to keep the FPS.

          To me, both are just buzzwords to get people to fork over more money.

          • Psythik@lemm.ee · ↑1 · edited · 17 hours ago

            Seems to me that you got an early or cheaper HDR display, then. To me the difference is night and day.

            FWIW, HDR does its best work if you have a display that can do true blacks. If you don’t have an OLED, mini LED, or full array, you’re going to have a hard time noticing the difference, especially if you don’t know what you’re looking for. HDR works best in either extremely dark or bright scenes, so having a display with a near infinite contrast ratio is important.

            Here’s a hint for any display: Look at some HDR clouds while you toggle HDR on and off. You’ll definitely notice the difference there. Also check the teals. It’s less obvious but SDR displays can’t do a proper teal.

            • squaresinger@lemmy.world · ↑1 · 17 hours ago

              I tried it on a few OLED smartphones too, couldn’t see a difference.

              I tried it with some HDR demo videos, so I expected that these would show off the difference especially well, but I couldn’t see the difference at all.

              I’ll try it again with clouds and teals, but I don’t have a huge affinity for distinguishing minute colour differences in general (I’m not colour blind or anything, but it’s hard for me to differentiate between very similar colours), so that might play into it.

              • Lv_InSaNe_vL@lemmy.world · ↑1 · edited · 16 hours ago

                HDR is more about the dynamic range (the “depth”) of an image than the color gamut (how many colors it can show).

                HDR helps more with things like being inside a building and looking out at a daylight scene: you’ll be able to see detail both inside and outside the building at the same time. Of course it won’t make your monitor better, but assuming you have more than a basic display, you should be able to see a difference.
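
                A rough Python sketch of that indoor/outdoor point, with made-up luminance numbers (nothing here comes from a real game or display pipeline):

                # Illustrative only: scene luminance in nits for a dark corner,
                # an indoor wall, and the sky through a window.
                scene = {"dark corner": 0.5, "indoor wall": 80.0, "window sky": 4000.0}

                SDR_PEAK = 100.0   # assumed peak of a typical SDR display (nits)
                HDR_PEAK = 1000.0  # assumed peak of a decent HDR display (nits)

                def naive_display(nits, peak):
                    # Naive clamp: anything brighter than the display peak becomes peak white.
                    return min(nits, peak)

                for name, nits in scene.items():
                    print(f"{name:12s} scene={nits:7.1f}  SDR={naive_display(nits, SDR_PEAK):6.1f}  HDR={naive_display(nits, HDR_PEAK):6.1f}")

                On the SDR display, the 80-nit wall and the 4000-nit sky land within 20 nits of each other, so the view out the window is basically “white”; the HDR display keeps them an order of magnitude apart, which is the extra depth described above. Real SDR pipelines tone-map rather than clamp, but the squeeze is the same idea.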

  • ZkhqrD5o@lemmy.world · ↑54 ↓5 · 2 days ago

    Reminder: temporal, proprietary upscalers are only made mandatory by devs that actively refuse to make a properly functioning product.

    • lorty@lemmy.ml · ↑7 · 1 day ago

      Let’s not forget Nvidia created DLSS, pushed hardware raytracing, and directly helped devs integrate them into their games to create demand for their newer cards.

    • randomname@sh.itjust.works · ↑5 · 1 day ago

      Not sure why most games can’t/don’t do this, but I’ve seen Minecraft shaders use temporal upscaling exclusively on the clouds, reflections, and shadows, while using FXAA for the rest of the image.
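
      For anyone curious what that split roughly looks like, here is a toy Python sketch (not actual shader code; the layer names and numbers are made up) of temporally accumulating only the noisy layers and compositing them under the full-res pass:

      import numpy as np

      H, W = 4, 4  # tiny stand-in framebuffer

      def temporal_accumulate(history, current, alpha=0.1):
          # Classic TAA-style exponential blend: mostly history, a little new frame.
          return (1 - alpha) * history + alpha * current

      # Only the noisy layers (clouds/reflections/shadows) get the temporal treatment.
      cloud_history = np.zeros((H, W))
      for _ in range(8):
          noisy_clouds = np.random.rand(H, W)        # this frame's noisy cloud layer
          cloud_history = temporal_accumulate(cloud_history, noisy_clouds)

      main_pass = np.random.rand(H, W)               # full-res geometry pass, would get FXAA only
      final_frame = main_pass + 0.2 * cloud_history  # composite the smoothed layer on top
      print(final_frame.shape)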

      • Natanael@infosec.pub · ↑3 · 1 day ago

        Because you need to dig into the rendering engine to do that, and if you didn’t build it yourself, you might not be able to do that easily.

        • Lv_InSaNe_vL@lemmy.world · ↑1 · 16 hours ago

          Which would be easier if you were a dev making your own game than if you were making a mod for an existing one, no?

          • Natanael@infosec.pub · ↑1 · 3 hours ago

            Depends on the rendering engine architecture. If it processes stuff in layers already you can work with that more easily, same if you can insert rules for stuff like different shaders for different object types.

            If you’re dealing with a game where the rendering engine can’t do that it will be very complex regardless of how much source code you have.

    • Zangoose@lemmy.world · ↑41 ↓1 · 2 days ago

      Reminder: Most devs actually care about the things they make. This is a management/timeline problem, not a developer one.

      • ZkhqrD5o@lemmy.world · ↑9 · 2 days ago

        Well, I should have clarified: by devs, I mean the entire companies, not the individuals. It’s a collective problem, not an individual one.

    • Psythik@lemm.ee · ↑5 ↓2 · edited · 2 days ago

      Honestly, I couldn’t care less, because DLSS/FSR looks better than native with AA at this point. It’s so good that I even turn it on in games where I don’t need to.

      Quality comparable to supersampling, and I get an FPS boost too? Sign me the fuck up. It’s like magic.

      • ZkhqrD5o@lemmy.world · ↑1 · 1 day ago

        IMO, I dislike them because in my experience they add input latency. But well, horses for courses.

        • Psythik@lemm.ee · ↑5 · 1 day ago

          Frame Generation adds input lag, but I haven’t heard of any upscaling algorithms causing issues.

          • ZkhqrD5o@lemmy.world · ↑2 · 1 day ago

            Well, it’s subtle, but in my experience it’s still there, about 2 ms. That’s bad if you’re already at your monitor’s refresh rate and you enable it: you just get 2 ms of additional input latency. But if you’re getting fewer FPS than your refresh rate, the effect can cancel out, because the extra FPS get you closer to your refresh rate. I notice it because I’m very sensitive to that.
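
            To put rough numbers on that trade-off (illustrative figures, not measurements of any particular game or setup):

            UPSCALE_COST_MS = 2.0  # assumed extra latency added by the upscaler

            def frame_time_ms(fps):
                return 1000.0 / fps

            # Already capped at 144 Hz: no FPS left to gain, so the 2 ms is a pure latency add.
            print(f"capped: {frame_time_ms(144):.1f} ms -> {frame_time_ms(144) + UPSCALE_COST_MS:.1f} ms")

            # Below the cap: 90 fps native vs 144 fps upscaled. The saved frame time
            # (11.1 ms down to 6.9 ms) outweighs the 2 ms cost, so responsiveness still improves.
            print(f"native 90 fps:    {frame_time_ms(90):.1f} ms")
            print(f"upscaled 144 fps: {frame_time_ms(144) + UPSCALE_COST_MS:.1f} ms")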

    • kadup@lemmy.world · ↑9 ↓3 · 2 days ago

      I’ll take DLSS over any other AA solution any day.

      We no longer use forward renderers, so AA either looks like ass or comes with a massive performance cost, and it can’t fix noise from foliage, alphas, smoke, etc. DLSS addresses all three issues at once.

      • lorty@lemmy.ml · ↑7 ↓1 · 1 day ago

        Easy to not have artifacting when everything is a big smudge.

        • kadup@lemmy.world · ↑3 ↓1 · 1 day ago

          Have you used DLSS, or are you extrapolating from FSR at 1080p and assuming it looks the same?

          • lorty@lemmy.ml · ↑3 ↓1 · 1 day ago

            Yes, I have. It’s also crap. The super aggressive softening makes you feel like you’re looking through a myopic camera. You could argue it’s poor implementation by developers, but that makes no difference to me.

      • ZkhqrD5o@lemmy.world · ↑21 ↓1 · 2 days ago

        Well, Half-Life: Alyx uses forward rendering and has a brilliant MSAA implementation. It is optimised because it needs to be: you cannot have that thing chugging along at 30 Hz in full HD, you need 4K or more running at 90 Hz or more. So they invested a good amount of time into making sure it functions properly before releasing it.

        Also, foliage really doesn’t need to be “fixed” if it’s done properly. See nearly 20-year-old games like Halo 3 or the Crysis games.

        I take issue with modern games because why the hell are they forgetting lessons of the past? Crysis and Halo 3, for example, are nearly 20 years old and have better-looking foliage than most modern games, because those devs knew what to do to avoid pop-in and noise. Yes, modern games have more foliage, because there is more VRAM, but older games have better-looking foliage due to the lack of wonky artifacts, in my opinion. Also, proprietary TAA and TSR implementations, in my experience, add a ton of input latency, which makes the game feel worse. MSAA, because it uses geometry information to build the anti-aliasing, enhances image quality significantly and gives a better-looking, more coherent picture than any other anti-aliasing implementation, including proprietary TSR. MSAA isn’t my religion; I realise there are aspects where TAA and TSR can be useful. The problem is that in modern games they get abused, because devs can then say “we’ll just do the absolute minimum, make sure the game runs at 30 Hz in HD, and let the magic TSR and frame generation handle the rest”.

        The problem with MSAA is that it needs good geometry in the first place. If quad overdraw is complete shit because no one bothered to set up tessellation or proper LOD models and just let some automatic tool handle everything without supervision, then yes, it will be horrible. If devs say “it makes my geometry timings horrible”, then we already know their geometry is utter rubbish.

        A brilliant example of why this bothers me is Payday 3, because it looks like a late PS3 game, runs like complete trash, and has a massive CPU bottleneck no matter what you do, even if you doctor around with the engine settings themselves.

        • SCmSTR@lemmy.blahaj.zone · ↑7 · edited · 1 day ago

          This guy games.

          Also, if your game can’t look decent without any kind of DLSS or AA, you need to stop and fix that before relying on AA. Personally, I can’t stand the blurriness of any kind of AA, including DLSS, and almost always turn it off.

          Games are not still images, and our brains are super good at motion interpolation between discrete pixels. To me, no AA always looks sharper, clearer, and truer to life (I have very good vision IRL, so blur is unwelcome, and TAA is just… why would you want that outside of an effect like being drunk or stunned?).

          Fuck TAA. 100%, forever.

          • ZkhqrD5o@lemmy.world · ↑2 · edited · 1 day ago

            Amen. But in all honesty, TAA has its place for correcting some artifacts, with clouds for example, where blur really doesn’t matter. See the Minecraft comment above, that’s interesting.

            Edit: typo.

            • SCmSTR@lemmy.blahaj.zone · ↑3 · 20 hours ago

              Ah, I found it. Interesting that it’s a partial/combo approach, but no thanks. I’ll absolutely try it, but I feel like I may have already seen stuff where TAA is applied partially like that, and it just turns into a smeary top half of my camera/screen.

              I’ve seen so many games use TAA and I swear, every time, I wish I could turn it off. But in a lot of newer games you either outright can’t, it’s locked into the advanced graphics settings, or you can turn it off but a ton of stuff totally breaks, like foliage… which is such a bizarre and frustrating problem.

        • kadup@lemmy.world · ↑3 ↓1 · 2 days ago

          There’s a reason you had to fish for an exception to find a modern game with a forward rendering engine.

          • utopiah@lemmy.world · ↑3 ↓1 · 1 day ago

            “an exception”

            FWIW, it’s more than an exception. IMHO it’s one of the very best games I’ve played in my life. It’s more than a game, it’s an experience. I was in City 17.

          • ZkhqrD5o@lemmy.world · ↑7 ↓1 · 2 days ago

            Okay then, but it still works. It’s hard to claim that Half-Life: Alyx runs badly or looks bad. I can only judge from my perspective as a customer: why do we use these weird, wonky, hacky solutions for deferred rendering if the other approach can look just as good and run just as well, but doesn’t need any of these workarounds?

            • kadup@lemmy.world · ↑2 ↓1 · 2 days ago

              I didn’t claim it doesn’t work. I claimed there’s a reason that, out of hundreds of releases, you have a single example of a forward renderer.

              Which means TAA will keep being a problem, so my remark that DLSS is miles ahead applies to pretty much all games, even if once in a blue moon you find an exception.

    • Sylvartas@lemmy.dbzer0.com · ↑9 ↓1 · edited · 2 days ago

      And/or consumers insisting on playing in 4K because “big number”, even though fill rate is a huge issue with modern games and you can barely tell the difference on most setups. Which would not be so bad if they also didn’t want ever-increasing graphical fidelity and 120+ FPS on top of that.
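
      The raw pixel counts behind that fill-rate point (simple arithmetic, no claims about any specific game):

      resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
      base = 1920 * 1080

      for name, (w, h) in resolutions.items():
          px = w * h
          print(f"{name}: {px / 1e6:.1f} Mpx, {px / base:.2f}x the pixels of 1080p")

      # 4K means shading 4x the pixels of 1080p (and 2.25x the pixels of 1440p) every
      # frame, before anyone asks for higher fidelity or 120+ FPS on top of it.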

      • Lv_InSaNe_vL@lemmy.world · ↑1 · edited · 16 hours ago

        4K is absolutely an upgrade over 1440p. I have two 4K monitors (an LCD and an OLED) and I absolutely love them in every game I play. I will admit that I’m in the super minority, and because of my work history I’ve spent a lot of time looking at a lot of displays, so I’m more sensitive to various artifacts than the normal person. And in games I always prefer looks over resolution; it needs to drop down to like 40 FPS or lower for me to start changing settings.

        Basically, it was worth it for me but probably won’t be for you. OLED, though, is a significant, genuinely noticeable upgrade. You should get an OLED, it’ll change your life.

      • ZkhqrD5o@lemmy.world · ↑8 · 2 days ago

        In my opinion, fidelity is getting worse than what we had 10 or 20 years ago, because now we have noise, pop-in, and temporal smearing thanks to proprietary TAA and TSR. Examples being Payday 3 and that new Justice League/Batman game where you play as four characters, whose name I can’t be bothered to remember, because everything about it is way worse than Arkham Knight, which is almost 10 years old by now.

        • SCmSTR@lemmy.blahaj.zone · ↑5 · 1 day ago

          Man. I went back and played some native raster graphics games with no AA.

          It was like I took the drunk glasses off. Everything made sense again. The headache went away. I could see. Object permanence in the engine was insane… Because it all just was.

          In the late 00s and early 10s we had bad console ports. But before then, things were actually amazing. And after, once TB’s putting up a stink about options finally got traction, games were reaching a screaming peak and things were finally getting figured out. I really do believe that right now, we’re just in that awkward early phase of a technology (like the late 90s, when the earliest 3D was really awkward) where people are trying new things and, regardless of rhetoric or stubbornness, will eventually have to face the cold, nuanced truth, no matter what:

          TAA is dung and should be flung into the sun.

          • ZkhqrD5o@lemmy.world · ↑2 · 1 day ago

            I hear you, but what do you mean by a transitional phase? Transitioning to what? I’m curious.

            • SCmSTR@lemmy.blahaj.zone · ↑2 · 20 hours ago

              I think the technology (including things like TAA) may… sigh… get better. Or it’ll find its proper place in the medium or stack, because people will learn how to use it properly and efficiently.

              Like in our other conversation about Minecraft clouds. Mayybe, over time (melodramatic reluctant pain), even things like TAA will find a place. Gamers will have discussed and aired their complaints, devs will take notice and try new things, and standards and conventions will settle, HOPEFULLY in a direction that’s pleasing to everybody involved.

              I’ve seen it before, and it’ll likely happen again. We just have to keep talking about it and keep the community aware and active with constructive conversations and criticisms. Also, we need a new TotalBiscuit.

  • Opisek@lemmy.world · ↑95 ↓2 · 3 days ago

    Fake resolution is what it is.

    And you know what, it does have one use for me. I do like me my 4K monitors, but some games are simply too much for that. And rendering them at lower resolutions almost NEVER works without completely breaking fullscreen or something else. DLSS, on the other hand, pretends to be 4K and everything works again.

    • zurohki@aussie.zone · ↑57 ↓1 · 3 days ago

      Fake resolution has its place; the problem is when Nvidia pressures reviewers to put its cards running a fake resolution against other cards running native resolution on benchmark charts.

    • Rai@lemmy.dbzer0.com · ↑21 ↓1 · 3 days ago

      I’ll take fake resolution for higher framerates as long as it looks good enough! I play at 1440p though, because I don’t think mid-to-high-end hardware is really there for 4K 120+ yet.

      • thedirtyknapkin@lemmy.world · ↑19 · 3 days ago

        i just wish it wasn’t the general direction the industry had decided to push things.

        it’s become the expected norm. it’s the performance metric games are optimized to hit now, and it’s far from perfect.

        i was just playing red dead 2 yesterday with dlss and i was legitimately struggling to do some things due to the artifacting. like there are some small missions and challenges that require you to find and shoot specific tiny birds with a bow, but dlss struggles with small things flying across a dynamic background. the birds would literally fade in and out of existence.

        same thing with trying to snipe distant heads. the little red fatal zone indicator would ghost like hell and fade in and out.

        like, it may be better than needing to drop your resolution, but it still kind of sucks sometimes.

        • Natanael@infosec.pub · ↑2 · 1 day ago

          95% of those issues would disappear if there were a rendering hint layer for games to use to mark which details need to be rendered at higher quality, so the game engine could ensure that important details don’t disappear.
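
          A purely hypothetical Python sketch of what such a hint layer could look like from the game’s side; neither these types nor this API exist in any real engine or upscaler today:

          from dataclasses import dataclass

          @dataclass
          class RenderHint:
              object_id: int
              importance: float            # 0.0 = fine to smear, 1.0 = must stay crisp
              min_resolution_scale: float  # lower bound on internal resolution for this object

          def hints_for_frame(visible_objects):
              # Mark gameplay-critical details (tiny birds, hit markers) so a cooperating
              # upscaler would know not to let them fade or ghost.
              hints = []
              for obj in visible_objects:
                  if obj.get("gameplay_critical"):
                      hints.append(RenderHint(obj["id"], importance=1.0, min_resolution_scale=1.0))
                  else:
                      hints.append(RenderHint(obj["id"], importance=0.3, min_resolution_scale=0.5))
              return hints

          print(hints_for_frame([{"id": 1, "gameplay_critical": True}, {"id": 2}]))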

      • Zaphod@discuss.tchncs.de · ↑3 · 2 days ago

        My 7900 XT works reasonably well for 4K in most games, though admittedly I have to turn graphics down to Medium in a lot of cases to get 100-ish FPS with upscaling and frame gen on quality settings. Except Cyberpunk, which ran really well with high settings.

        I’d guess in about 3 years it should be much better.

    • Sustolic@lemmy.world · ↑7 · 3 days ago

      Rendering anything below native resolution is usually also blurry as hell, at least for me.

      Things like FSR are the only thing that saves my 6-year-old 5700 XT from getting obliterated when using my 1440p monitor.

      • felsiq@lemmy.zip · ↑8 · 3 days ago

        If you pick a resolution that requires fractional scaling (e.g. 1080p on your 1440p monitor), it’ll look real dogshit, because it’s trying to represent one game pixel with like one and a half real pixels in either direction. A resolution that uses integer scaling (i.e. 720p for your monitor) will just use two real pixels in either direction to show one game pixel (so four real pixels all showing the same thing), so it’ll be more pixellated but much less blurry and gross. FSR is the better solution most of the time, but if you did want to go below native again, that’d make it a little less gross.
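
        The arithmetic behind that, for a 1440p panel (just division, nothing game-specific):

        PANEL_HEIGHT = 1440

        for render_height in (720, 1080, 1440):
            factor = PANEL_HEIGHT / render_height
            kind = "integer scale, clean pixel blocks" if factor.is_integer() else "fractional scale, pixels smeared across neighbours"
            print(f"{render_height}p -> {factor:.2f}x ({kind})")

        # 720p lands exactly on 2.00x (blocky but crisp), while 1080p lands on 1.33x,
        # which is the blurry case described above.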

          • felsiq@lemmy.zip · ↑6 · edited · 2 days ago

            4K would go to 1080p for best results (for 3840x2160 screens rather than true 4K, but I’m assuming that’s what you’ve got), and should be much more playable on laptop hardware that way.
            Edit: oops didn’t see Beryl already answered this lol

  • IsThisAnAI@lemmy.world · ↑11 ↓1 · 3 days ago

    I’ll take DLSS, frame gen, and dynamic resolution over dropping to a lower static resolution any day.

    • Opisek@lemmy.world · ↑5 · 2 days ago

      Without a doubt, but the problem is that developers no longer care to make their games run at all without DLSS. DLSS should not be the baseline.