I was trying out FSR4 on my RX 6800 XT on Fedora 42. It works really well and easily beats FSR3 in visuals, even on Performance. It does have a significant performance hit versus FSR3, but on Quality it still works out a bit faster than native rendering.

  • Victor@lemmy.world · 4 days ago

    It seems the input lag is more perceived than actually experienced, from what I understand. Like, if you go from 30 to 120 fps, you expect the input lag to decrease, but since it stays the same (or gets slightly worse), you perceive it as much more severe.
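    Rough numbers to illustrate (all made up, just the shape of the effect):

    ```python
    # Hypothetical figures: why 120 fps via frame generation can feel
    # laggier than it measures. Smoothness implies one response time,
    # input delivers another.

    real_fps = 30    # frames the game actually renders per second
    shown_fps = 120  # frames the display shows after generation

    expected_ms = 1000 / shown_fps  # ~8 ms, what 120 "real" fps would imply
    actual_ms = 1000 / real_fps     # ~33 ms, input is still tied to real frames

    print(f"Motion smoothness suggests ~{expected_ms:.0f} ms response")
    print(f"Input actually takes ~{actual_ms:.0f} ms, or slightly worse")
    ```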

    • DarkAri@lemmy.blahaj.zone · 3 days ago

      The frame rate isn't going from 30 to 120 FPS; it's actually going from 30 to something like 20. The rendered frames are different from the CPU frames, which handle the game loop (physics, input, simulation, etc.).

        • DarkAri@lemmy.blahaj.zone · 2 days ago

          Generated frames are created by a neural network; they have nothing to do with the actual game scripts, the game loop, or input polling. FSR does generate frames to interpolate between real frames, but things like physics and input are not being generated along with them. It's purely visual. You probably need some basic knowledge of how a program and a game engine work to understand this.

          Basically, the CPU steps through the simulation in discrete steps. When you use frame gen, if it lowers the real frame rate, the CPU makes fewer loops per second over everything: physics updates, input polling (capturing key presses and mouse events), and so on.
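          A bare-bones sketch of that loop (not any real engine's code, just the shape of it):

          ```python
          import time

          def poll_input():
              """Stand-in for draining the OS event queue (keys, mouse)."""
              return []

          def update_physics(dt, events):
              """Stand-in for stepping the simulation forward by dt seconds."""
              pass

          def render_real_frame():
              """Stand-in for submitting one real frame to the GPU."""
              pass

          # Classic variable-timestep loop: input and physics advance once
          # per REAL frame. Generated frames never pass through this loop,
          # so they can't reflect input that arrived between real frames.
          previous = time.perf_counter()
          while True:
              now = time.perf_counter()
              dt = now - previous        # delta time since the last real frame
              previous = now

              events = poll_input()      # fewer real frames = fewer input samples
              update_physics(dt, events)
              render_real_frame()
          ```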

          • Victor@lemmy.world · 2 days ago

            Oh yeah, now I remember why there's more input lag with frame interpolation turned on. Taking a shot right now, and it just popped into my head.

            Anyway, it's because while frame interpolation adds more frames per second, the "I-frames" (the real frames) you're seeing are lagging one I-frame behind. That's because it can't start showing you interpolated frames until it has two frames to interpolate between.

            So you won't start seeing I-frame N-1 until I-frame N (the latest real frame) has been rendered, which creates the extra input lag.
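            Something like this toy timeline (made-up numbers, 30 fps real rendering with 2x interpolation):

            ```python
            # Toy timeline: real frame N-1 can't be shown until real
            # frame N exists, so the display runs one real frame behind.

            real_frame_ms = 1000 / 30  # ~33.3 ms between real frames

            for n in range(3):
                rendered = n * real_frame_ms
                shown = rendered + real_frame_ms    # held back one real frame
                interp = shown + real_frame_ms / 2  # midpoint frame shown after it
                print(f"real frame {n}: rendered at {rendered:5.1f} ms, shown at {shown:5.1f} ms")
                print(f"  interpolated frame {n}->{n + 1} shown at {interp:5.1f} ms")
            ```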

            Someone correct me if I’m wrong, I’m supposed to be asleep…

            • DarkAri@lemmy.blahaj.zone · 1 day ago

              It's more that the actual FPS is often lower when using FSR. The GPU frame rate doesn't matter for input lag; what matters is how many times per second the CPU can loop through the game logic.

              So basically, when you move ten steps forward in a game, the CPU runs a ton of code that takes the time elapsed since the previous frame and interpolates where the player should be this frame. That elapsed time is delta time (the change in time between this frame and the last); everything that moves gets multiplied by it, which gives fluid movement at a variable frame rate. It's why older games would slow down when the frame rate dropped, while newer games still calculate the passage of time correctly even at 15 FPS.
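              In code it's basically this (toy example, made-up speed value):

              ```python
              # Delta-time movement: distance scales with elapsed time, so
              # you cover the same ground per second at any frame rate.

              speed = 5.0  # units per second

              def step(position, dt):
                  return position + speed * dt

              for fps in (60, 15):
                  dt = 1.0 / fps
                  pos = 0.0
                  for _ in range(fps):  # one second's worth of frames
                      pos = step(pos, dt)
                  print(f"{fps:3d} fps -> moved {pos:.1f} units in one second")
              ```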

              The fake frames have nothing to do with the game engine or its logic; they're deepfaked frames created by a neural network to fill in between real frames. That gets you something very close to extra frames on the GPU, but there's often a performance hit on the real frames, since it's a heavy process.

              The CPU has to stay synced to the GPU's real frames, because some logic is CPU-bound: physics, creating certain buffers, all kinds of stuff. If the GPU's real frame rate is lower, it bottlenecks the CPU, since the CPU is also involved, to a smaller degree, in rendering real frames (preparing data, sending it to the GPU, and certain rendering-related operations that are faster on the CPU, like ones using MMX or other CPU extensions).

              So basically, the fewer real frames you have, the longer the wait between the moments your game engine can detect mouse and keyboard events and update the game world, even if you're getting two to three times the frame rate with generated frames.
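              With made-up numbers:

              ```python
              # Illustrative only: input is sampled once per REAL frame, so
              # the gap between samples is set by the real rate, not the
              # displayed rate.

              scenarios = [
                  ("native, no frame gen", 30, 1),  # 30 real fps, shown as-is
                  ("frame gen, 3x shown", 20, 3),   # real rate drops, display triples
              ]

              for name, real_fps, ratio in scenarios:
                  shown = real_fps * ratio
                  gap_ms = 1000 / real_fps  # interval between input samples
                  print(f"{name}: {shown} fps shown, input sampled every {gap_ms:.1f} ms")
              ```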

      • Victor@lemmy.world · 3 days ago

        Very much so. The whole reason we want more fps is to have less input lag; that's my personal take, anyway. It's the only reason I have a beefy computer: so the game can respond quicker (and give me feedback quicker as well).