The FAQ has a section on TFT myths, but it isn't really killing any myths, because its facts are incorrect.
The FAQ says: "The biggest myth is the refresh rate. CRT monitors worked with refresh rates because that was how the technology functioned, with the electron gun refreshing the screen rapidly to keep a steady image on the glass tube in front of you. However, with LCD monitors, this no longer applies. But I hear you say that the setting is still available in the monitor preferences? Correct, but this is a way to fool your video card into thinking that your monitor isn't dead, so that it continues functioning normally. If the LCD monitor told the video card there is no refresh rate, the video card would simply assume no monitor exists. Once again, there is no refresh rate on the LCD monitor; you would not notice a difference between 60Hz and 200Hz."
This is simply wrong. A TFT panel updates your screen in much the same way the CRT did, and you can see the difference between 60Hz and 75Hz on a TFT.
Before I explain why, I should make clear that this is not about the backlight or flicker. And yes, in theory a TFT could update every pixel individually, as the quoted text suggests.
So what is it about? It is about how much information gets transferred to your monitor, and therefore how fast the image on screen can be refreshed.
VGA
First, let's look at the old TFT panels. As you may know, they used VGA connectors just like CRTs did and had a built-in analog-to-digital converter (ADC).
The signal inside the cable runs at a given refresh rate, normally 60Hz, although most TFTs can do 75Hz too (this applies to DVI as well).
What happens is that the ADC converts the signal and displays it on your TFT. The higher the refresh rate, the more information can be shown per second.
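To make this concrete, here is a small Python sketch. It assumes the standard VESA blanking totals for a 1280x1024 mode (1688x1066 pixels per frame once blanking is included), so treat the numbers as approximate:

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # The analog signal is just pixels clocked out one after another,
    # so the data rate scales directly with the refresh rate.
    return h_total * v_total * refresh_hz / 1e6

for hz in (60, 75):
    print(f"1280x1024 @ {hz}Hz needs roughly {pixel_clock_mhz(1688, 1066, hz):.0f} MHz")

# Prints roughly 108 MHz for 60Hz and 135 MHz for 75Hz, which matches the
# pixel clocks listed for these VESA modes.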
DVI
Okay, let me continue: the situation is actually the same with the digital DVI cables. Even though you're running a DVI cable, the signal still uses a refresh rate. The color of each pixel is represented digitally, but the whole screen is still updated in the same classic way as it was over VGA.
Left to right, top to bottom.
To avoid starting a flame war, check it for yourself on Wikipedia:
http://en.wikipedia.org/wiki/Dvi#Digital
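If you want a mental model of that scan order, here is a toy Python sketch; it only illustrates the ordering, not the actual TMDS encoding that DVI uses:

def scan_out(frame):
    # Yield pixels in the order they travel down the cable:
    # row by row from the top, left to right within each row.
    for row in frame:
        for pixel in row:
            yield pixel

frame = [[(x, y) for x in range(1280)] for y in range(1024)]
stream = scan_out(frame)
print(next(stream), next(stream))  # (0, 0) then (1, 0): the top-left corner goes first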
What does this really mean compared to a CRT?
Well, firstly it means that no TFT on the market can match a CRT running at anything above a 75Hz refresh rate, since no TFT accepts a faster input. Secondly, it means that the full potential of the TFT isn't being used at all.
Why use a refresh rate on a digital panel?
There are two main reasons: bandwidth and backward compatibility.
Bandwidth
Say for a minute that you had per-pixel updates and an unlimited number of updates per second. How much throughput would the DVI cable need? Infinite!
To put it in raw numbers: say you are running 1280x1024 at 32-bit color (of which 24 bits per pixel actually travel over the cable); that works out to about 3.75MB per frame. Now imagine you are moving windows around on your desktop; a modern computer could easily render 2000fps, which would force the poor DVI cable to carry over 7GB of data every second. Let's check those DVI specs again:
http://en.wikipedia.org/wiki/Dvi#Digital
A single DVI link gives you about 3.75Gbit/s of throughput, which is roughly 470MB/s. Nowhere near enough.
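If you want to check the arithmetic yourself, here is a quick Python sketch (the 2000fps figure is just the hypothetical number from above):

bytes_per_frame = 1280 * 1024 * 24 // 8     # DVI carries 24 bits of color per pixel
print(bytes_per_frame / 2**20)              # ~3.75 MB per frame

needed = bytes_per_frame * 2000             # the hypothetical 2000fps desktop
print(needed / 2**30)                       # ~7.3 GB needed every second

available = 3.75e9 / 8                      # ~470 MB/s on a single DVI link
print(needed / available)                   # about 17 times more than the cable can carry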
Backward compatibility
In the old days, when all monitors were CRTs, applications (mainly games) relied heavily on timing against the refresh rate. A programmer could know exactly when the next frame was going to appear on screen and time the output of the graphics card to match. When this became standard it was called v-sync.
What is the point of syncing? With moving content on screen you want every frame to be fully drawn before the next one is shown. V-sync tells the graphics card when it is okay to send the next frame to the monitor. Without a refresh rate on the monitor side, this wouldn't be possible.
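To see v-sync at work, here is a minimal Python sketch using pygame 2 (my choice of library for illustration, not something from the FAQ); with v-sync enabled, the flip call is throttled to the monitor's refresh:

import pygame

pygame.init()
# vsync=1 is only a request, and pygame needs the SCALED or OPENGL flag for it.
screen = pygame.display.set_mode((640, 480), pygame.SCALED, vsync=1)

last = pygame.time.get_ticks()
for _ in range(120):
    pygame.event.pump()                # keep the window responsive
    screen.fill((0, 0, 0))
    # ... draw the frame here ...
    pygame.display.flip()              # with v-sync this waits for the next refresh
    now = pygame.time.get_ticks()
    print(now - last)                  # ~16-17ms apart on a 60Hz panel
    last = now

pygame.quit()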
So why does this matter?
For most users this doesn't matter at all, but for picky gamers it makes a difference in speed and in the number of images actually shown per second.
It also affects how you should reckon a TFT's speed: at best, a TFT runs at the same speed as a CRT, plus the response time of the panel on top.
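As a rough worked example in Python (the 8ms response time is an assumed figure, not a measurement; check your own panel's specification):

frame_interval_ms = 1000 / 60                   # ~16.7ms between updates at 60Hz
panel_response_ms = 8                           # assumed pixel response time
print(frame_interval_ms + panel_response_ms)    # ~24.7ms before a pixel has fully changed,
                                                # on top of the same 60Hz cycle a CRT has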
I don't believe your facts about DVI and VGA cables, or your calculations. How do I prove it to myself?
A. Tearing
A monitor that could update per-pixel data without any refresh rate wouldn't get tearing. Play some games with v-sync forced off and look for tearing: the image appears split along a horizontal line, with the two halves slightly offset.
B. Mouse pointers
Just move your mouse around in circles on your desktop and try to count how many pointers you can see at the same time. Now increase the refresh rate to 75Hz. You will see fewer pointers, or rather the updates will be so fast that you can't see as many. If your eyes aren't fast enough, have a friend take a photo of the screen.
To really prove the point, try it on a CRT at a low resolution; this is best seen at 100Hz or 120Hz, where there will be ONE pointer.
C. Low refresh rates
Some DVI devices support lower refresh rates, for example TFT-based projectors and some regular panels. Install PowerStrip and force a low refresh rate like 40Hz. Play a fast game like Quake 3 or similar, then switch back to 60Hz; you will see the difference.
Lastly, I would like to say that this isn't meant as a complaint; I really like this site and just want to contribute.
When you publish facts you take on a great responsibility: don't write "facts" without any sources of information. We need to kill myths!