Display lag

Display lag is a phenomenon associated with most types of liquid crystal displays (LCDs), such as those used in smartphones and computer monitors, and with nearly all types of high-definition televisions (HDTVs). It refers to latency: the delay between when a signal is sent to the display and when the display begins to show it. This lag time has been measured as high as 68 ms,[1] the equivalent of 3–4 frames on a 60 Hz display. Display lag is not to be confused with pixel response time, which is the amount of time a pixel takes to change from one brightness value to another. Currently, the majority of manufacturers quote the pixel response time but neglect to report display lag.[citation needed]
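
The frame equivalence follows directly from the frame period. A minimal sketch of the arithmetic (the function name and 60 Hz default are illustrative):

```python
# Convert a lag measured in milliseconds into frames by dividing by the
# frame period (1000 ms / refresh rate).
def lag_in_frames(lag_ms: float, refresh_hz: float = 60.0) -> float:
    return lag_ms / (1000.0 / refresh_hz)

print(lag_in_frames(68.0))  # 68 ms is about 4 frames at 60 Hz
```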

Analog vs digital technology

For older analog cathode ray tube (CRT) technology, display lag is nearly zero, because the technology has no ability to store image data before display. The picture signal is minimally processed internally, simply being demodulated from a radio-frequency (RF) carrier wave (for televisions) and then split into separate signals for the red, green, and blue electron guns and for the timing of the vertical and horizontal sync. Image adjustments typically involve reshaping the signal waveform without storing it, so the image is written to the screen as fast as it is received, with only nanoseconds of delay for the signal to traverse the wiring inside the device from input to screen.

For modern digital signals, significant processing power and memory are needed to prepare an input signal for display. For either over-the-air or cable TV, the same analog demodulation techniques are used, after which the signal is converted to digital data, which must be decompressed using the MPEG codec and rendered into an image bitmap stored in a frame buffer.

For progressive scan display modes, the signal processing stops here, and the frame buffer is immediately written to the display device. In its simplest form, this processing may take only a few microseconds.

For interlaced video, additional processing is frequently applied to deinterlace the image and make it seem clearer or more detailed than it actually is. This is done by storing several interlaced frames and then applying algorithms to determine areas of motion and stillness, and to either merge interlaced frames for smoothing or extrapolate where pixels are in motion. The resulting calculated frame buffer is then written to the display device.

Deinterlacing imposes a delay that can be no shorter than the duration of the frames being stored for reference, plus an additional variable period for calculating the resulting extrapolated frame buffer; delays of 16–32 ms are common.[2]
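
A minimal sketch of the motion-adaptive idea described above, assuming row-major fields of luma values; the field layout, threshold, and function name are illustrative, not any real display's algorithm:

```python
# Motion-adaptive deinterlacing sketch: weave the two fields together where
# the image is still, interpolate vertically ("bob") where motion is detected.
def deinterlace(even_field, odd_field, threshold=16):
    """even_field holds lines 0, 2, 4, ...; odd_field holds lines 1, 3, 5, ..."""
    frame = []
    for i, odd_row in enumerate(odd_field):
        above = even_field[i]
        below = even_field[i + 1] if i + 1 < len(even_field) else above
        frame.append(above)
        # Synthesize the missing line from its vertical neighbours.
        interp = [(a + b) // 2 for a, b in zip(above, below)]
        # If the stored odd line disagrees with its neighbours, assume motion
        # and use the interpolated line; otherwise weave the stored line in.
        moving = any(abs(o - p) > threshold for o, p in zip(odd_row, interp))
        frame.append(interp if moving else odd_row)
    return frame
```

Because the algorithm needs a previously captured field as well as the current one, at least one field period of buffering, and hence of delay, is unavoidable.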

Causes of display lag

While the pixel response time of the display is usually listed in the monitor's specifications, no manufacturers advertise the display lag of their displays, likely because the trend has been toward increasing display lag as manufacturers find more ways to process input at the display level before it is shown. Possible culprits are the processing overhead of High-bandwidth Digital Content Protection (HDCP) and other digital rights management (DRM), as well as digital signal processing (DSP) techniques employed to reduce the effects of ghosting; the cause may vary depending on the model of display. Investigations have been performed by several technology-related websites, some of which are listed at the bottom of this article.

LCD, plasma, and DLP displays, unlike CRTs, have a native resolution: a fixed grid of pixels that shows the image sharpest when driven at that resolution, since no scaling (which blurs the image) is required. To display non-native resolutions, such displays must use video scalers, which are built into most modern monitors. For example, a display with a native resolution of 1600x1200 that is given a 640x480 signal must scale width and height by 2.5x each to fill its native pixel grid with the image provided by the computer. Doing this while producing as few artifacts as possible requires advanced signal processing, which can be a source of introduced latency. Interlaced video signals such as 480i and 1080i require a deinterlacing step that adds lag. Anecdotally,[original research?] display lag is significantly lower when a display operates at its native resolution and in a progressive-scan mode. External devices have also been shown to reduce overall latency by providing faster image-space resizing algorithms than those present in the LCD screen,[citation needed] although in practice this stacks the internal and external latencies.
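
A naive nearest-neighbour sketch of the scaling arithmetic (real scalers use filtered interpolation, which is where the processing latency comes from; the function and pixel layout are illustrative):

```python
# Map each destination pixel back to its nearest source pixel.
# For 640x480 -> 1600x1200, both axis ratios are 640/1600 = 480/1200 = 0.4,
# i.e. a 2.5x enlargement in each direction.
def scale_nearest(src, src_w, src_h, dst_w, dst_h):
    """src is a row-major list of src_w * src_h pixel values."""
    sx, sy = src_w / dst_w, src_h / dst_h
    return [
        src[int(y * sy) * src_w + int(x * sx)]
        for y in range(dst_h)
        for x in range(dst_w)
    ]
```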

Many LCDs also use a technology called "overdrive" which buffers several frames ahead and processes the image to reduce blurring and streaks left by ghosting. The effect is that everything is displayed on the screen several frames after it was transmitted by the video source.[3]
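
A sketch of the overdrive principle, assuming 8-bit pixel levels; the gain value and lookup-free form are illustrative simplifications (real panels use calibrated lookup tables):

```python
# Drive a pixel past its target level so the liquid crystal settles faster.
# Knowing where the pixel is headed requires buffering upcoming frames,
# which is where overdrive's extra display lag comes from.
def overdrive(prev_level: int, target_level: int, gain: float = 0.5) -> int:
    boosted = target_level + gain * (target_level - prev_level)
    return max(0, min(255, round(boosted)))
```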

Testing for display lag

Display lag can be measured using a test device such as the Video Signal Input Lag Tester. Despite its name, the device cannot independently measure input lag. It can only measure input lag and response time together.

Lacking a measurement device, measurement can be performed using a test display (the display being measured), a control display (usually a CRT, whose display lag is typically negligible), a computer capable of mirroring an output to the two displays, stopwatch software, and a high-speed camera pointed at the two displays while they run the stopwatch program. The lag time is measured by taking a photograph of the two displays, then subtracting the two times shown in the photograph. This method only measures the difference in display lag between two displays and cannot determine the absolute display lag of a single display. Furthermore, video mirroring does not guarantee that the same image will be sent to each display at the same point in time.
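
A minimal sketch of suitable stopwatch software (assuming Python 3 with Tkinter; update granularity is ultimately limited by the displays' refresh rates, one of the systematic errors discussed below):

```python
# A millisecond stopwatch to be mirrored on both displays and photographed.
import time
import tkinter as tk

root = tk.Tk()
label = tk.Label(root, font=("Courier", 96))
label.pack()
start = time.perf_counter()

def tick():
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    label.config(text=f"{elapsed_ms:10.1f} ms")
    root.after(1, tick)  # reschedule as often as the event loop allows

tick()
root.mainloop()
```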

In the past it was considered common knowledge that the results of this test were exact, as they seemed to be easily reproducible, even when the displays were plugged into different ports and different cards, which suggested that the effect is attributable to the display and not to the computer system. An in-depth analysis released on the German website Prad.de revealed these assumptions to be wrong: averaged measurements of this kind lead to comparable results only because they include the same systematic errors. Across different monitor reviews, the values determined this way for the very same monitor model differ by margins of up to 16 ms or more.

To minimize the effects of asynchronous display outputs (the image being transferred to each monitor at a different point in time, or the monitors actually running at different refresh frequencies), either a highly specialized software application such as SMTT[4] or a very complex and expensive test environment has to be used.

Several approaches to measuring display lag have been revived in slightly changed forms, yet they reintroduce old problems that had already been solved by the aforementioned SMTT. One such method involves connecting a laptop to an HDTV through a composite connection, running a timecode that is shown on the laptop's screen and the HDTV simultaneously, and recording both screens with a separate video recorder. When the recording is paused, the difference between the times shown on the two displays has been interpreted as an estimate of the display lag.[5] However, this is almost identical to using an ordinary stopwatch on two monitors in a "clone view" setup, as it ignores the missing synchronization between the composite video signal and the laptop's screen, the display lag of that screen, and the fact that the vertical refreshes of the two displays remain asynchronous and are not linked to each other. Even if V-sync is activated in the graphics card driver, the video signals of the analog and digital outputs will not be synchronized.[6] It is therefore impossible to measure display lag with a single stopwatch, whether it is produced by a timecode or a simple stopwatch application, as such a setup will always introduce an error of up to 16 ms or more.

Effects of display lag on users

Display lag contributes to the overall latency of the interface chain from the user's inputs (mouse, keyboard, etc.) through the graphics card to the monitor. Depending on the monitor, display lag times between 10 and 68 ms have been measured. However, the effects of the delay on the user depend on each user's own sensitivity to it.

Display lag is most noticeable in games (especially on older video-game consoles), with different games affecting the perception of delay differently. For instance, in World of Warcraft's PvE, a slight input delay is not as critical as in PvP, or as in other games favoring quick reflexes like Counter-Strike. Rhythm-based games, such as Guitar Hero, also require exact timing; display lag creates a noticeable offset between the music and the on-screen prompts. Notably, many games of this type include an option that attempts to calibrate for display lag. Arguably, fighting games such as Street Fighter, Super Smash Bros. Melee, and Tekken are the most affected, since they may require move inputs within extremely tight timing windows that sometimes last only one frame, or 16.67 ms, on the screen.

By assuming a Gaussian human response time to a particular in-game event, it becomes possible to discuss the effect of lag in terms of probabilities.[7] Given a lag-less display, a human has a certain probability of landing an input within a window of frames. As video games operate on discrete frames, missing the last frame of the window even by 0.1 ms causes the input to be interpreted a full frame later. Because of this, any amount of lag reduces a human's ability to hit a particular timing window. The severity of this impact is a function of the position and variance of the human's response to a visual cue, the amount of lag introduced, and the size of the timing window. For example, given a very large window of 30 frames, a human might have a 99.99% chance of hitting it, and introducing one frame of lag would leave that chance in the 99.99% range (assuming the response lands somewhere near the middle of the window). Given a smaller window of, say, 2 frames, however, the effect of lag becomes much more significant: assuming the human's response is centered on the 2-frame window and they have a 99.99% chance of hitting it, introducing a full frame of lag causes the success rate to drop to about 50%, as illustrated by the sketch below.
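
A sketch of this calculation, assuming a Gaussian response centered on the window (the sigma choice and the 3.89 z-value are illustrative, not from the cited source):

```python
# Probability of landing an input inside a timing window, before and after
# display lag shifts the visual cue relative to the window.
from math import erf, sqrt

FRAME_MS = 1000.0 / 60.0  # one frame at 60 Hz, about 16.67 ms

def phi(x: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def hit_probability(window_frames: float, sigma_ms: float, lag_ms: float) -> float:
    # The window spans [-half, +half] around the mean response; lag delays
    # the response, which is equivalent to shifting the window earlier.
    half = window_frames * FRAME_MS / 2.0
    return phi((half - lag_ms) / sigma_ms) - phi((-half - lag_ms) / sigma_ms)

# Choose sigma so a lag-free 2-frame window is hit 99.99% of the time.
sigma = FRAME_MS / 3.89  # 3.89 is roughly the two-sided 99.99% z-value
print(hit_probability(2, sigma, 0.0))        # ~0.9999
print(hit_probability(2, sigma, FRAME_MS))   # ~0.5: one frame of lag
print(hit_probability(30, sigma, FRAME_MS))  # ~1.0: wide windows barely suffer
```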

If the game's controller produces additional feedback (rumble, the Wii Remote's speaker, etc.), then the display lag will cause this feedback to not accurately match up with the visuals on-screen, possibly causing extra disorientation (e.g. feeling the controller rumble a split second before a crash into a wall).

TV viewers can be affected as well. If a home theater receiver with external speakers is used, then the display lag causes the audio to be heard earlier than the picture is seen. "Early" audio is more jarring than "late" audio. Many home-theater receivers have a manual audio-delay adjustment which can be set to compensate for display latency.

Solutions

Game mode

Many televisions, scalers, and other consumer display devices now offer what is often called a "game mode", in which the extensive preprocessing responsible for additional lag is specifically sacrificed to decrease, but not eliminate, latency. While typically intended for video-game consoles, this feature is also useful for other interactive applications. Similar options have long been available on home audio hardware and modems for the same reason. Connecting through a VGA or component cable should eliminate perceivable input lag on many TVs even if they already have a game mode, since advanced post-processing is non-existent on analog connections and the signal traverses them without delay.

Renaming input

A television may have a picture mode that reduces display lag for computers. Some Samsung and LG televisions automatically reduce lag for a specific input port if the user renames the port to "PC".[8]

Display lag versus response time

LCD screens with a high response-time value often do not give a satisfactory experience when displaying fast-moving images (they often leave streaks or blur, called ghosting). An LCD screen with both a high response time and significant display lag is unsuitable for playing fast-paced computer games or for performing fast, high-accuracy operations on the screen, due to the mouse cursor lagging behind.

References

  1. ^ "Face to Face cameras, printers, ... - DigitalVersus". Retrieved 2008-03-07.
  2. ^ "HCI Design: Monitor Input Lag database". hcidesign.com. Retrieved 2022-05-03.
  3. ^ "The Dark Side of Overdrive | bit-tech.net". bit-tech.net. Retrieved 2024-09-06.
  4. ^ Thiemann, Thomas (2020-05-22). "smtt.thomasthiemann.com - SMTT Website". Thomas Thiemann. Retrieved 2022-05-03.
  5. ^ "Google Sites: Sign-in".
  6. ^ "Untersuchung des Testverfahrens einer Input-Lag-Messung". 27 August 2009.
  7. ^ "Melee It on Me | Ugh, This TV Lags!". Archived from the original on 2015-02-20. Retrieved 2015-02-22.
  8. ^ Morrison, Geoffrey (2015-07-21). "How to use your 4K TV as a monitor". CNET. Retrieved 2020-01-31.