lol. Has anyone found ways to optimize Starfield for their PC, like reducing stuttering, FPS drops, etc.?

  • redfellow@sopuli.xyz
    10 months ago

    They didn’t optimize it for consoles either. The Series X has roughly the graphical grunt of an RTX 3060, yet the game is capped to 30fps and looks worse than most other AAA games that offer variable framerates up to 120fps. Todd says they went for fidelity. Has he played any recent titles? The game looks like crap compared to many games from the past few years, and requires more power.

    The real reason behind everything is the shit they call the Creation Engine: an outdated hot mess of an engine that’s technically behind pretty much everything the competition is using. It’s beyond me why they’ve not scrapped it - that should have been done after FO4 already.

    • Huschke@programming.dev
      10 months ago

      And don’t forget the constant loading screens. A game that has so many of them shouldn’t look this bad and run this poorly.

    • PatFusty@lemm.ee
      10 months ago

      Correct me if I’m wrong, but don’t they limit frametimes so they can reduce TV stuttering? The NTSC standard for TVs is 29.97 or 59.94 fps. I assume they chose 30fps so it can be used more widely, and if it’s scaled to 60 it would just increase frametime lag. Again, I’m not sure.

      Also, comparing CE2 to CE1 is like comparing UE5 to UE4. And I don’t remember for sure, but doesn’t Starfield use the Havok engine for animations?

      Edit: rather than downvote, just tell me where I’m wrong.

      • Nerdulous@lemm.ee
        10 months ago

        Not to put too fine a point on it, but you’re wrong because your understanding of frame generation and displays is slightly flawed.

        Firstly, most people’s displays, whether a TV or a monitor, are at least capable of 60 Hz, which it seems you correctly assumed. That said, most TVs and monitors aren’t capable of what’s called variable refresh rate. VRR lets the display match however many frames your graphics card is able to put out, instead of the graphics card having to match your display’s refresh rate. This eliminates screen tearing and gives you the best frame times available, since each frame is generally created and then immediately displayed.
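
        A rough Python sketch of that difference (purely illustrative timings, not how a real swapchain or driver actually schedules frames): without VRR a finished frame has to wait for the next refresh tick of a fixed 60 Hz panel, while with VRR it can be shown as soon as it’s ready.

            import math

            REFRESH_MS = 1000 / 60  # ~16.7 ms between refreshes on a fixed 60 Hz panel

            def present_fixed_60hz(finish_time_ms):
                """Without VRR, the finished frame waits for the next scheduled refresh."""
                return math.ceil(finish_time_ms / REFRESH_MS) * REFRESH_MS

            def present_vrr(finish_time_ms):
                """With VRR, the display refreshes as soon as the frame is ready."""
                return finish_time_ms

            for render_ms in (10.0, 20.0, 25.0):
                print(f"rendered in {render_ms:.1f} ms -> "
                      f"fixed 60 Hz shows it at {present_fixed_60hz(render_ms):.1f} ms, "
                      f"VRR shows it at {present_vrr(render_ms):.1f} ms")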

        The part you might be mistaken about, from my understanding, is the frame time lag. Frame time is the inverse of FPS: the more frames generated per second, the less time between frames. Under circumstances where there is no VRR and the frame rate doesn’t align with the display’s native rate, there can be frame misalignment. This occurs when the monitor expects a frame that isn’t ready yet; it reuses the previous frame, or part of it, until a new frame becomes available. That can result in screen tearing or stuttering, and yes, in some cases it adds extra delay between frames. In general, though, a framerate above 30 FPS will feel smoother on a 60 Hz display than a locked 30 FPS, where every frame is held on screen for two full refreshes.
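
        To put rough numbers on that (just a sketch, assuming an ideal 60 Hz panel with no VRR and ignoring the rest of the pipeline): frame time is 1000 ms divided by FPS, and without VRR each frame ends up held for a whole number of ~16.7 ms refreshes, so a locked 30 FPS paces every frame evenly across two refreshes, while something like 45 FPS bounces between one and two refreshes per frame, which is the judder described above.

            import math

            REFRESH_HZ = 60
            REFRESH_MS = 1000 / REFRESH_HZ   # ~16.7 ms per refresh

            # Frame time is the inverse of FPS: 1000 ms / frames per second.
            for fps in (30, 45, 60, 120):
                print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")

            def visible_at(fps, frames=6):
                """Refresh ticks (in ms) at which frames 1..frames appear without VRR:
                each frame waits for the next whole refresh of the panel."""
                return [round(math.ceil(i * REFRESH_HZ / fps) * REFRESH_MS, 1)
                        for i in range(1, frames + 1)]

            print(visible_at(30))   # [33.3, 66.7, 100.0, ...] -> even 33.3 ms gaps
            print(visible_at(45))   # gaps jump between ~16.7 ms and ~33.3 ms -> judder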

        • PatFusty@lemm.ee
          10 months ago

          Thanks, I was recently reading about monitor interlacing and I must have jumbled it all up.