Screen refresh rate?

Ordnas

Member
Just messing around in display settings and found that I can change the screen refresh rate from 59 to 60. Will this make any difference? I thought 60 hertz was always better?
 

leeshor

Well-Known Member
Actually most people don't want 60 as a refresh rate, especially in an office with fluorescent lighting, as it causes a harmonic flutter with the 60Hz of the bulbs, at least where the electrical frequency is 60Hz.
 

GoodBytes

Well-Known Member
Heuu no leeshor, it doesn't work that way.


@Ordnas, if you check again, you'll notice it is back at 59Hz! :)
The reason why you have both, and why it reverts back to 59Hz, is quite interesting. The LCD panel actually runs at 59.94Hz, which Windows 7 and up considers "TV-compatible", which is needed for perfect playback in Media Center and Windows Media Player. It has to do with Blu-ray and DVD playback using "strange" values like 23.976fps (or 23.976Hz to be technically exact). You also have different movie frame rate options, all with a decimal point. Windows creates a fake 60Hz option for games that populate their resolution list by asking Windows "give me only resolutions that support 60Hz or above". As 59.94Hz isn't 60Hz, the native resolution would otherwise be missing from that list. As this is a fake option for compatibility purposes, every time you set 60Hz it reverts back to 59Hz (which really signifies 59.94Hz).
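If you want to see where those odd numbers come from, here is a rough Python sketch of the old NTSC timing convention (general video background, my own illustration, not something from the Microsoft article below): the nominal rates are divided by 1.001.

Code:
# Rough illustration: the "strange" TV-compatible rates are the nominal
# rates divided by 1.001 (the NTSC timing adjustment).
for nominal in (24, 30, 60):                  # film / video / field rates in Hz
    actual = nominal * 1000 / 1001
    print(f"{nominal}Hz nominal -> {actual:.3f}Hz actual")
# 24 -> 23.976 (Blu-ray/DVD film), 30 -> 29.970, 60 -> 59.940 (why Windows shows 59Hz)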

You can read more about it here: https://support.microsoft.com/en-us/kb/2006076/


Now to correct @leeshor:
An LCD monitor's image is steady on the screen. It doesn't flicker. It refreshes each line of the LCD panel from top to bottom. If you set the LCD panel to, say, 40Hz or even down to 30Hz, which some laptops support as an aggressive power saving feature, it will still not flicker. A flicker at 60Hz is clearly visible to the eye.

The reason why you don't see a fluorescent light bulb flicker, despite the tube firing at 60Hz, is that the light tube has a layer of phosphor. That is why the 'tubes' are not transparent and always white. There are different grades of phosphor and different ways the manufacturer can apply it. If the manufacturer uses low quality phosphor in a thin layer, it will flicker a lot, as the phosphor won't properly retain light. CRT monitors (the big tube monitors of yesteryear) used to flicker, as monitors were costly back in the day and consumers didn't want to spend even more for a flicker free experience, but if you spent more you could easily get an 85Hz CRT monitor with a high quality, thick phosphor layer that gave you a flicker free experience. Not even scan lines were visible if you pointed a camera at it (an effect CRT monitors had due to the difference in refresh rate between the monitor and the video camera).

Picking a different refresh rate doesn't affect the flickering. The reason why you "see", or rather feel, flickering on a monitor is that the monitor uses what is called PWM, or Pulse Width Modulation, to control the backlight of the LCD panel (LCD panels don't produce light, they filter light), instead of using a dimming circuit. The way a PWM-controlled backlight works is very simple: based on the brightness the panel is set to, it switches the LEDs that form the backlight on and off at a certain rate (which can vary from 120Hz to thousands of Hz), while a dimming circuit would keep the intensity of the LEDs at a fixed level.
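To make the difference concrete, here is a tiny Python sketch (my own illustration with made-up numbers, not how any specific monitor's controller works): both approaches reach the same average brightness, but PWM gets there by switching between fully on and fully off.

Code:
def pwm_waveform(duty_cycle, pwm_hz=240, samples_per_sec=48000):
    # LED is only ever full-on (1.0) or full-off (0.0); brightness comes from the on-time fraction
    out = []
    for i in range(samples_per_sec):
        phase = (i * pwm_hz / samples_per_sec) % 1.0   # position within the current PWM period
        out.append(1.0 if phase < duty_cycle else 0.0)
    return out

def dimming_waveform(level, samples_per_sec=48000):
    # LED held at a steady, reduced drive level; no switching at all
    return [level] * samples_per_sec

pwm = pwm_waveform(0.5)       # "50% brightness" via PWM
dim = dimming_waveform(0.5)   # "50% brightness" via a dimming circuit
print(sum(pwm) / len(pwm), min(pwm), max(pwm))  # avg 0.5, but swings 0.0..1.0 -> flicker
print(sum(dim) / len(dim), min(dim), max(dim))  # avg 0.5, held steady at 0.5 -> no flicker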

Why is PWM used instead of a dimming circuit? Well, monitors with dimming circuits exist. In fact, there are many these days. But PWM-driven backlight LCD monitors are common for 2 reasons:
-> They cost less (hence why they are common on budget and entry level LCD monitors)
-> They consume less power (great for mobile computers)

A dimming-circuit-driven backlight costs a lot more, takes more space, and consumes more power to operate.

Some monitors or LCD panels use a hybrid model: under low brightness settings they use PWM, and at medium to high brightness they use a dimming circuit. However, this setup is not common due to cost (it requires both technologies plus multiplexers to do the switch).

People who are sensitive to PWM-driven backlight monitors commonly can't see the flickering but feel it. It can cause headaches after prolonged usage, red eyes, and trouble viewing the screen despite being able to read text of the same size on paper (of course, if you have these side effects, consult your doctor to be sure that it is not something else health-wise; I am not a physician or any health-care professional). For such people, like myself, using the monitor at higher brightness can help reduce this effect, as the LEDs spend more of each PWM cycle switched on (at max brightness, the PWM is set to an always-on state to reach max brightness, so no flickering), but purchasing a monitor that uses a dimming circuit instead is the real, proper solution.

Another point not covered in this discussion is that PWM is also used to trick the human eye into seeing a faster response time on the monitor. But this is more for "gaming" monitors, and is not part of the scope of this discussion.

To know what a monitor uses, it is best to look at in-depth monitor review sites, like AnandTech or TFTCentral.

For full details on PWM-driven backlight monitors, TFTCentral has an excellent article:
http://www.tftcentral.co.uk/articles/pulse_width_modulation.htm

Most Dell UltraSharp and many professional series monitors don't use PWM. Check in-depth reviews to know.
Here is an example of an in-depth monitor review. See how thoroughly they cover the monitor.
http://www.tftcentral.co.uk/reviews/dell_u2515h.htm
They go into such detail that basically every monitor sucks, and it is up to you to see which downsides affect you least :)

To learn more about LCD technology specifications, here is a great forum post:
http://linustechtips.com/main/topic/278610-display-technology-faqmythbuster/

To understand the basics of how an average white-LED backlight LCD monitor works, here is a great short YouTube video:

Another point not talked about, related to a different kind of flickering, is 6-bit panels. The "number of bits" refers to the number of colors the LCD panel can natively produce. The human eye can see billions and potentially trillions of colors (studies that say otherwise are poorly made and make no sense: if you take any picture and reduce it in a photo editor to the limited number of colors mentioned in that research, you'll clearly see that we don't see the world in such a limited color spectrum, and that we indeed see things perfectly smoothly, including slight gradations of color such as shadows from one color to a nearby one, while on a 16.7 million color monitor you can see stepping. Anyway, that is a different discussion, and already a long parenthesis). All images, videos, movies, games, etc. encoded in the consumer world are for 16.7 million colors (or 16,777,216 colors to be exact). We use this limited amount for a variety of reasons: cost, processing power, size, and that it is pretty darn good to start with. Now it has become (well, since a very long time ago) the standard.

So what is up with 6-bit, and why that side background you just read through? Well, before I start, there is more background to know. :/

Something to keep in mind is that when we talk about the number of bits on a panel, we always assume "per channel". Red, green and blue are channels. Red, green and blue are also the primary colors used to form all other colors in combination. This is called the RGB model. It is an efficient way of displaying colors on a screen.

8-bit color is 2^8, which is 256 levels. That is 256 levels for red, 256 for green and another 256 for blue. As we mix colors, we multiply them. So, 256 x 256 x 256 = 16,777,216, or if you prefer, ~16.7 million colors.

Having a monitor panel that natively produces 8 bits of color per channel is costly. So a cost cutting measure is to design a monitor that can produce only 6 bits of color per channel. That is 2^6 = 64 for each primary color, so 64 x 64 x 64 = 262,144. Yes, ONLY 262,144 colors. "But but but the box says 16.7 million colors on the most crappy monitor money can buy, what are you talking about CrazyBytes!", you say.
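Here is a quick Python check of that arithmetic, just to show the per-channel levels multiplying across the red, green and blue channels (nothing monitor-specific here):

Code:
for bits in (6, 8, 10):
    levels = 2 ** bits       # levels per channel (red, green or blue)
    colors = levels ** 3     # all R x G x B combinations
    print(f"{bits}-bit: {levels} levels per channel -> {colors:,} colors")
# 6-bit:  64   -> 262,144
# 8-bit:  256  -> 16,777,216 (~16.7 million)
# 10-bit: 1024 -> 1,073,741,824 (the "1.07 billion colors" mentioned further down)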

Well, the way they do it is that the monitor has circuitry called "Frame Rate Control" (FRC), as it controls the frame rate of color selection to emulate missing colors... In other words, put more sensibly: when the circuitry detects a color the panel can't natively produce, it uses a fast algorithm to pick 2 colors to switch between at a rapid rate, tricking your eyes into seeing the correct color, emulating the missing one and reaching 16.7 million colors. Pretty clever, but it is a form of flickering. This is very unlikely to cause problems for a person, as the flickering is too fast, but it is nonetheless a form of flicker, which may or may not contribute to increasing the problem for PWM-sensitive people.
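To illustrate the idea (this is my own simplified sketch of temporal dithering, not the actual FRC circuitry in any panel): to fake an 8-bit level on a 6-bit panel, you alternate between the two nearest 6-bit levels so the time-average matches the requested color.

Code:
def frc_frames(level_8bit, num_frames=4):
    # Which 6-bit level gets shown on each of 4 consecutive frames
    low = level_8bit // 4                  # nearest 6-bit level below (256 / 64 = 4)
    high = min(low + 1, 63)                # nearest 6-bit level above
    remainder = level_8bit % 4             # how many of the 4 frames must show 'high'
    return [high if i < remainder else low for i in range(num_frames)]

# Requested 8-bit level 131 falls between 6-bit levels 32 and 33:
print(frc_frames(131))                     # [33, 33, 33, 32]
# Time-average of the shown levels is 32.75, i.e. 131/4 -> the eye sees ~level 131,
# but the pixel is switching between two values every frame (a form of flicker).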

6-bit panels were introduced in the days of affordable LCD monitors. 16.7 million color CRT monitors were all true 8-bit monitors.

For the best colors, buy a monitor that is referred to as a true 8-bit panel monitor. How do you know if the monitor panel is a true 8-bit panel? You guessed it: in-depth monitor reviews. An easy trick is to look for a monitor that advertises it can produce 1.07 billion colors (that is 10-bit color). In the consumer market, these are true 8-bit panels with an FRC circuit to emulate the extra bits to reach 10-bit color. However, it must be noted that there are many monitors that use a true 8-bit panel and don't advertise 1.07 billion colors, as they don't have the circuitry to emulate the extra bits. But it is a quick and easy way to know. The best way is to check in-depth monitor review sites.
 

GoodBytes

Well-Known Member
Fun fact:
If you look at compact fluorescent light bulbs that have been used a lot, or at fluorescent tube lighting, you'll see that they turn black in some areas. That is the phosphor layer that has burned over time, hence why they dim as the light 'bulb' ages. :)
 

leeshor

Well-Known Member
I'm old school. I can't tell you how many times a 60Hz refresh has caused issues with harmonic flicker in the past.
 

GoodBytes

Well-Known Member
haha! Yeah, I do recall it being a problem. As I said, I am sensitive to PWM-driven monitors, so I did all my research on this. And at the time, resources were not available like they are today. I had to talk to professors and engineers about LCD technology (well, at university) and read some technical material while trying to understand how things work and why they work that way.
 