My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.

  • jsdz@lemmy.ml · 10 months ago

    It comes directly from television. Early home PCs used televisions for displays, and by the 1980s TVs were generally capable of 60 fps (or 50 in regions that used PAL), so that's what the computers generated. Everyone got used to it. And of course, as everyone else said, you don't want to add more latency in games by failing to keep up with that basic standard.
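    A quick way to see why the frame rate matters for latency: each frame persists on screen for 1/fps seconds, so the minimum delay between an input and the next drawn frame roughly doubles at 30 fps versus 60 fps. A minimal sketch of that arithmetic:

```python
# Frame time: how long each frame stays on screen, which is also the
# minimum extra latency a game adds just by rendering at that rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 50, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 50 fps -> 20.0 ms per frame
# 60 fps -> 16.7 ms per frame
```

    Halving the frame time from ~33 ms to ~17 ms is a difference players can feel, even if passive viewing at 30 fps looks fine.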

    • SwingingTheLamp@midwest.social · 10 months ago

      Technically, NTSC video does 60 fields per second (PAL does 50), because the video signal is interlaced. That is, the beam sweeps from the top to the bottom of the screen 60 times per second, but it only draws half of the horizontal scan lines per sweep, alternating between the odd-line field and the even-line field. That's why we considered "full motion video" to be 30 frames per second back in the day. The alternating fields did make movement appear smoother, but the clarity wasn't great.
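      The odd/even field split above can be sketched as a toy simulation (using a hypothetical 10-line frame for brevity): each vertical sweep touches only half the rows, so it takes two fields to cover one full frame.

```python
# Toy model of interlaced scanning: each field draws only the odd or
# even scan lines, so two consecutive fields make up one full frame.
LINES = 10  # hypothetical tiny frame height, for illustration only

def field(parity: int) -> list[int]:
    """Return the scan-line indices drawn in one vertical sweep."""
    return [y for y in range(LINES) if y % 2 == parity]

odd_field = field(1)   # [1, 3, 5, 7, 9]
even_field = field(0)  # [0, 2, 4, 6, 8]
# Two fields together cover every line exactly once:
assert sorted(odd_field + even_field) == list(range(LINES))
```

      On NTSC, each field takes 1/60 s, so a complete frame (both fields) takes 1/30 s, which is where the 30 fps figure comes from.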

      VGA originally doubled the 15.75 kHz horizontal clock rate of NTSC to 31.5 kHz, so that the beam was fast enough to draw all of the lines in one vertical sweep, allowing 60 full frames per second at a 60 Hz refresh rate. Prior to that, a lot of games were just 30fps, because interlaced video tended to flicker on bitmapped graphics.
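      The arithmetic behind that doubling: the vertical refresh rate is the horizontal line rate divided by the number of lines drawn per sweep. NTSC draws 262.5 lines per field at 15.75 kHz; VGA draws all 525 lines per sweep at the doubled 31.5 kHz, landing on the same 60 Hz (nominal rates used here for simplicity):

```python
# Vertical refresh = horizontal line rate / lines drawn per vertical sweep.
def vertical_rate_hz(h_khz: float, lines_per_sweep: float) -> float:
    return h_khz * 1000.0 / lines_per_sweep

ntsc = vertical_rate_hz(15.75, 262.5)  # 60.0 fields/s (half the lines each sweep)
vga  = vertical_rate_hz(31.5, 525)     # 60.0 full frames/s (all lines each sweep)
```

      Same 60 Hz vertical rate either way; doubling the horizontal clock is what lets every line fit into a single progressive sweep.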

      • jsdz@lemmy.ml · 10 months ago

        VGA might’ve done that to get better resolution at 60 Hz, but I’m pretty sure earlier systems, including CGA and the Amiga, did 60 fps non-interlaced video at lower resolutions. The Amiga, at least, also had a higher-resolution interlaced video mode, but it was mostly used for displaying impressive-looking static images.