My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.

  • Kale@lemmy.zip · 1 year ago

    A decade ago I had a little extra money and chose to buy a 144 Hz gaming monitor and a video card to match. I don’t have great eyesight, nor do I play games that require twitch reflexes, but even then 144 Hz (with the game configured to run at >100 fps) was very noticeable. I’d much rather play at 1080p and >100 fps than at 4K and 60 fps or below.
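
    A rough way to see why the jump matters is to turn frame rate into frame time, i.e. how long each image sits on screen before the next one replaces it. A quick back-of-the-envelope sketch (Python, just for the arithmetic):

    ```python
    # Convert frame rate (fps) to frame time (ms per frame). The shrinking gap
    # between frames is what reads as smoother motion and lower input lag.
    for fps in (30, 60, 100, 144):
        print(f"{fps:>3} fps -> {1000 / fps:4.1f} ms per frame")
    ```

    Going from 30 to 60 fps cuts the time between frames by roughly 17 ms; 60 to 144 fps trims another 10 ms or so, a smaller absolute gain but still easy to feel in motion.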

    This may differ from person to person. I don’t believe I have great eyesight, depth perception, color perception, etc., but I am really sensitive to motion. I built my second computer (an AMD Athlon 64, I think?) and spent a significant sum on a CRT that supported higher refresh rates. I can’t use a CRT at 60 Hz: I perceive the flicker and get a headache after about 20 minutes. I couldn’t use Linux on that computer (I was stuck at 60 Hz with that kernel/video driver) until I saved up even more to buy an LCD monitor. I can’t perceive 60 Hz flicker on an LCD, so 60 Hz is fine for work.

    But for gaming, a high refresh rate is noticeable, even for someone like me who normally doesn’t notice visual details.