And/or consumers insisting on playing in 4K because “big number,” even though fill rate is a huge issue in modern games and you can barely tell the difference on most setups. That wouldn’t be so bad if they didn’t also want ever-increasing graphical fidelity and 120+ fps on top of it.
4K is absolutely an upgrade over 1440p. I have two 4K monitors (an LCD and an OLED) and I absolutely love them in every game I play. I’ll admit I’m in the super minority: because of my work history I’ve spent a lot of time looking at a lot of displays, so I’m more sensitive to various artifacts than the average person. And in games I always prefer looks over resolution; it needs to drop to around 40 fps or lower before I start changing settings.
Basically, it was worth it for me but probably won’t be for you. OLED, though, is a genuinely significant upgrade. You should get an OLED; it’ll change your life.
In my opinion, fidelity is getting worse than what we had 10 or 20 years ago, because now we have noise, pop-in, and temporal smearing from proprietary TAA and temporal upscaling. Examples being Payday 3 and that new Justice League/Batman game where you play as four characters, whose name I can’t be bothered to remember, because everything about it is way worse than Arkham Knight, which is almost 10 years old by now.
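For anyone wondering why TAA smears at all: roughly speaking, it blends each new frame with an accumulated history of previous frames. Here’s a tiny, purely illustrative Python sketch of that idea (the 1D image, the blend weight, and the lack of reprojection/clamping are all my own simplifications, not any engine’s actual implementation), showing why a moving edge drags a faded trail when the history isn’t corrected:

```python
# Illustrative sketch only: why naive temporal accumulation smears.
# A 1D "image" with a bright edge that moves one pixel per frame.
# TAA-style blend: output = lerp(history, current, ALPHA), with a
# small ALPHA (heavy history weight) and no reprojection or clamping.

WIDTH = 12
ALPHA = 0.1          # weight of the current frame; history keeps 0.9

def render_frame(edge_x):
    """Ground-truth frame: pixels left of edge_x are lit (1.0)."""
    return [1.0 if x < edge_x else 0.0 for x in range(WIDTH)]

history = render_frame(0)
for frame in range(1, 6):
    current = render_frame(frame)           # edge moves right each frame
    history = [h * (1 - ALPHA) + c * ALPHA  # exponential blend with history
               for h, c in zip(history, current)]
    print(f"frame {frame}:", ["%.2f" % v for v in history])

# Pixels the edge has just passed converge toward 1.0 only slowly,
# so the moving edge leaves a dim trail ("ghosting") behind it.
```

Real TAA adds motion-vector reprojection and neighborhood clamping to fight exactly this, which is why it can look fine when those succeed and smears when they don’t.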
Man. I went back and played some native raster graphics games with no AA.
It was like I took the drunk glasses off. Everything made sense again. The headache went away. I could see. Object permanence in the engine was insane… because it all just was.
In the late 00s and early 10s we had bad console ports, but before then things were actually amazing. And after, once TB’s kicking up a stink about options finally got traction, games were reaching a screaming peak and developers were finally figuring things out. I really do believe that right now we’re just in that awkward early phase of a technology (like the late 90s, when the earliest 3D was really awkward) where people are trying new things and, regardless of rhetoric or stubbornness, will eventually have to face the cold, nuanced truth, no matter what:
TAA is dung and should be flung into the sun.
I hear you, but what do you mean by a transitional phase? Transitioning to what? I’m curious.
I think the technology (including things like TAA) may… sigh… get better. Or it’ll find its proper place in the medium or the stack, because people will learn how to use it and work with it properly and efficiently.
Like in our other conversation about Minecraft clouds. Mayybe, over time (melodramatic reluctant pain), even things like TAA will find a place. Gamers will have discussed and aired their complaints, devs will take notice and try new things, and standards and conventions will settle, HOPEFULLY in a direction that’s pleasing to everybody involved.
I’ve seen it before; it’ll likely happen again. We just have to keep talking about it and keep the community aware and active with constructive conversations and criticisms. Also, we need a new TotalBiscuit.