If you’ve ever been staring blearily at a computer screen, blinked sleepily as you yawned and noticed the screen flicker for a moment, you may have caught a glimpse of what’s known as the “refresh rate.” On screens, refresh rate refers to the number of times per second the image on that screen is redrawn. Because most modern monitors and TVs have a refresh rate of at least 60Hz, you’re unlikely to notice flickering or suffer as much eye strain as you might have on older screens with lower refresh rates.
However, gamers and videophiles who notice details such as motion blur and tearing (when the refresh rate can’t keep up with the video source’s output) will want to opt for TVs or monitors with a higher refresh rate. For fast-paced games especially, a refresh rate of 144Hz or higher can yield smoother-looking motion and take full advantage of the frame rates put out by faster graphics hardware.
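For readers who like the numbers behind this: each redraw gets one second divided by the refresh rate, so a higher rate means less time, and less visible gap, between frames. A quick sketch of that arithmetic (the function name is just for illustration):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Milliseconds available per screen redraw at a given refresh rate.

    A 60Hz screen redraws 60 times per second, so each redraw
    gets 1000 ms / 60, or about 16.7 ms.
    """
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per redraw")
```

Running this shows why the jump from 60Hz to 144Hz is noticeable in fast motion: the screen updates more than twice as often.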
On HDTVs, a refresh rate of 240Hz might be overkill for more casual viewers, but if you’re dropping a lot of money on, say, a new 4K TV set, it wouldn’t hurt to ask for a refresh-rate comparison when you’re browsing the showroom at your local electronics store.
Every week, we’ll define a tech term, offer a timely tip or answer questions about technology from readers. Email email@example.com with questions or topic suggestions.