Frame Rate vs. Refresh Rate: What’s the Difference?

One of the most common points of confusion for anyone searching for a high-quality monitor is the difference between frame rate and refresh rate. These two common terms describe closely related concepts, which can make it difficult to understand how they actually differ from one another. This can, in turn, make the search for a quality monitor more confusing than it needs to be.

Keep reading for a detailed breakdown of the difference between these concepts.

Frame rate and refresh rate are terms you are likely to encounter when exploring computer monitor options. While the topic of frame rate vs refresh rate has been briefly covered in a previous article in the ViewSonic Library, it is worth exploring these two concepts in more detail and fully explaining how they differ from one another.

In this article, we will offer definitions of both frame rate and refresh rate to help you understand the concepts, along with an explanation of how the two combine to create the image quality you actually see on your monitor.

Frame Rate: A Definition

First, it is helpful to provide a basic definition of frame rate. Movement on a monitor is shown by displaying a series of consecutive still images, known as ‘frames’, and this is true regardless of whether the movement being displayed is a film, a television show, a video game, or a mouse cursor moving across a computer screen. These individual frames change many times per second, but the premise is similar to a flipbook animation.

With this in mind, frame rate is the rate at which new frames are displayed, usually expressed in frames per second, or fps for short. The higher the frame rate, the more frames are displayed each second, which results in smoother, more realistic movement rather than staggered or stuttering motion.
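To make the relationship concrete, the short Python sketch below converts a few common frame rates into the time each frame spends on screen. It is plain arithmetic rather than part of any graphics API, and the frame rates chosen are just illustrative examples.

```python
# Frame time is the reciprocal of frame rate: at a higher fps, each
# frame stays on screen for less time, so motion appears smoother.
for fps in (24, 30, 60, 120, 144):
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> one new frame every {frame_time_ms:.1f} ms")
```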

One thing that is important to stress is that frame rate is not determined by the monitor itself. Instead, your frame rate is determined by a combination of the software or media you are using, your central processing unit (CPU), and your graphics card. However, that is not to say that frame rate is irrelevant for monitors, as we will explain later.
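Because frame rate is a property of the system producing the frames, it can be measured simply by counting how many frames are rendered in a second. The Python sketch below shows the idea; render_frame is a hypothetical placeholder for the real work your CPU and graphics card would do.

```python
import time

def render_frame():
    # Hypothetical stand-in for the work the CPU and graphics card
    # do to produce one frame; here we just burn a little time.
    time.sleep(0.005)

# Count how many frames the system produces in one second of
# wall-clock time; that count is the measured frame rate.
frames = 0
start = time.perf_counter()
while time.perf_counter() - start < 1.0:
    render_frame()
    frames += 1
print(f"Measured frame rate: {frames} fps")
```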

Refresh Rate: A Definition

Unlike frame rate, refresh rate is directly associated with monitors and other forms of display hardware. Put simply, it describes the number of times per second a monitor redraws its display, and it is usually expressed in hertz (Hz).

As a basic rule, the higher the refresh rate on a monitor, the better. Higher refresh rates tend to produce much smoother-looking movement; this can be visible even in how smoothly the mouse cursor moves around the screen, but it is especially noticeable in more demanding uses, such as gaming or esports.

For modern monitors, 60 Hz is the bare minimum, but even this has sometimes been associated with problems like eye fatigue and eye strain, so 75 Hz may be a better starting point. A serious gaming monitor will usually offer a refresh rate of 120 Hz or more, and anything above that level will comfortably cover almost any use.

Frame Rate vs Refresh Rate

Although only refresh rate is directly associated with computer monitors, both refresh rate and frame rate affect what you see on the screen. If you have a monitor with a high refresh rate but your graphics card and processor can only produce a low frame rate, you are unlikely to experience the full benefit, and vice versa.
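A quick, purely illustrative calculation shows why such a mismatch matters. The numbers below are assumptions chosen for the example, not measurements from any particular setup.

```python
# When frame rate and refresh rate do not match, each frame is shown
# for an uneven number of refreshes, which can make motion look juddery.
frame_rate_fps = 60    # assumed output of the CPU and graphics card
refresh_rate_hz = 144  # assumed monitor refresh rate

refreshes_per_frame = refresh_rate_hz / frame_rate_fps
print(f"Each frame is displayed for {refreshes_per_frame:.1f} refreshes")
# 144 / 60 = 2.4, so frames alternate between two and three refreshes
# on screen, and part of the monitor's extra speed goes to waste.
```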

With that being said, not all media requires a high frame rate. For instance, movies have traditionally been filmed at 24 fps, and most still are, while live sport is often recorded at 30 fps. Higher frame rates of 60 fps and above are mostly relevant for displaying fast-paced content where precision is required, which is exactly the case with gaming.

When frame rate and refresh rate are out of sync, a problem called screen tearing can occur. This is essentially where a single screen refresh displays data from multiple frames at once, producing a visible horizontal split in the image. One way to balance frame rate and refresh rate and prevent this is to use an adaptive sync technology such as G-Sync or FreeSync.
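For intuition, the sketch below caps a render loop to an assumed 60 Hz refresh rate so that at most one new frame is produced per screen refresh. This is a simplified software analogue of frame pacing; it is not how G-Sync or FreeSync actually work, since those technologies adjust the monitor's refresh timing in hardware, and render_frame is again a hypothetical placeholder.

```python
import time

REFRESH_RATE_HZ = 60                  # assumed monitor refresh rate
FRAME_BUDGET = 1.0 / REFRESH_RATE_HZ  # one refresh interval, in seconds

def render_frame():
    time.sleep(0.004)  # hypothetical stand-in for real rendering work

# After rendering each frame, sleep out the remainder of the refresh
# interval so the loop never outpaces the screen's refresh rate.
for _ in range(10):
    frame_start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - frame_start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```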

Final Thoughts

Ultimately, the single biggest difference between frame rate and refresh rate is the point of origin. Frame rate is produced by a combination of your graphics card and your processor, so it is essentially the number of frames a system is able to produce each second, whereas refresh rate is the rate at which a monitor is able to completely redraw its display. Keeping the two in sync is important in order to avoid visual defects such as screen tearing.