Interlace
- For the method of progressively displaying raster graphics, see Interlace (bitmaps).
- For the decorative motif used in Celtic art, see Celtic knot.
Interlace is a method of improving the quality of images displayed on a video imaging device (usually a cathode ray tube) without increasing analog bandwidth. It was invented by RCA engineer Randall Ballard in the late 1920s and was ubiquitous in television until the 1970s, when video from home computers and video games reintroduced progressive scan images. Today interlacing remains in heavy use for video: it is used for all standard-definition TV as well as the popular 1080i HDTV standard. However, some argue that digital data compression has made interlace obsolete.
Description
The older progressive scan pattern would draw an image on the CRT in a path similar to text on a page: lines left to right, with each line below the previous one, until the whole surface area of the tube was scanned. The interlace scan pattern would complete such a scan (called a field), but the next set of video scan lines would then be drawn within the gaps between the lines of the previous scan. If the two sets of scan lines together were considered a frame, then the even lines had all been scanned in the first pass, followed by all the odd lines. Because of persistence of vision, the two fields would be perceived at the same time, giving the appearance of a full frame. This improved detail compared to simply scanning each field directly over the previous one, and it halved bandwidth compared to scanning the even and odd scan lines all at once in the same period of time.
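As a rough illustration of this scan pattern, the following sketch (a hypothetical toy example in Python with NumPy, not part of the original article) splits a 480-line progressive frame into the two fields of an interlaced frame. Each field carries half the lines, which is why each scan needs only half the bandwidth of a full frame sent in the same period of time.

```python
import numpy as np

# Toy 480-line, 640-pixel-wide progressive frame.
frame = np.arange(480 * 640, dtype=np.uint8).reshape(480, 640)

# Split it into the two interlaced fields: one field takes every other line,
# the other field takes the lines in between. (Which field counts as "odd" and
# which as "even" depends on the line-numbering convention of the TV system.)
field_a = frame[0::2]   # lines 0, 2, 4, ...
field_b = frame[1::2]   # lines 1, 3, 5, ...

# Each field carries only half the lines of the full frame.
assert field_a.shape[0] == field_b.shape[0] == 240
```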
Application
When motion picture film was developed, it was observed that the movie screen had to be illuminated at a high rate to prevent flicker. The exact rate necessary varies with brightness: 40 Hz is acceptable in dimly lit rooms, while up to 80 Hz may be necessary for bright displays that extend into peripheral vision. The film solution was to project each frame of film twice: a movie shot at 24 frames per second thus illuminated the screen 48 times per second.
But this solution could not be used for television: storing a full video frame and scanning it twice would require a frame buffer, which did not become feasible until the late 1980s. In addition, the limits of vacuum tube technology required that CRTs for TV be scanned at the AC line frequency in order to prevent interference. (This was 60 Hz in the US and 50 Hz in Europe.) In the late 1940s, when the current analog standards were being set, CRTs could only scan 240 on-screen lines in 1/60th of a second. By using interlace, a pair of 240-line fields became a sharper 480-line frame. While this wasn't as sharp as scanning all 480 lines progressively, it was much more practical.
[Images: odd field and even field scan patterns, and an animation of interlaced scanning at 10 Hz]
Interlaced scanning at 60 fields per second normally gives considerably less flicker than progressive scanning at 30 frames per second. Flicker is generally more noticeable on a CRT. Its main disadvantage is a perceptible loss in vertical resolution, which is most obvious when displaying narrow horizontal patterns. If these become less than two lines wide, the odd and even fields will no longer be similar, and the pattern will flicker at one half of the field rate. This effect is called "twitter".
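The twitter effect can be seen with a small sketch, again a hypothetical NumPy example: a pattern only one line high ends up entirely in one field, so on an interlaced display it is refreshed at only half the field rate and appears to flicker.

```python
import numpy as np

# A frame containing a one-line-high horizontal stripe (a "narrow horizontal pattern").
frame = np.zeros((10, 16), dtype=np.uint8)
frame[5, :] = 255                      # the stripe occupies a single scan line

field_a = frame[0::2]                  # lines 0, 2, 4, ...
field_b = frame[1::2]                  # lines 1, 3, 5, ...

# The stripe is present in only one of the two fields, so it is redrawn at
# half the field rate (e.g. 25 or 30 times a second) and appears to twitter.
print("stripe energy in field A:", int(field_a.sum()))   # 0
print("stripe energy in field B:", int(field_b.sum()))   # 255 * 16
```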
One common misconception is that an odd and even pair of fields represent a single frame. In fact, the camera scans using the same pattern as the CRT, reading even lines from its sensor only after it has finished reading the odd lines. Thus, in a 50 fields per second system, lines 122 and 124 are read approximately one fiftieth of a second after lines 123 and 125 were read. If the odd and even fields were simply combined into a single progressive frame, any parts with horizontal motion would display visible "combing" on their edges.
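The combing effect can likewise be demonstrated with a hypothetical NumPy sketch: two fields are captured 1/50 s apart while an object moves horizontally, and naively weaving them into one progressive frame leaves the object's edges serrated line by line.

```python
import numpy as np

h, w = 6, 12
earlier = np.zeros((h, w), dtype=np.uint8)
later   = np.zeros((h, w), dtype=np.uint8)
earlier[:, 3:6] = 1     # object position when the first field is scanned
later[:, 6:9]   = 1     # position 1/50 s later, when the second field is scanned

# Naive "weave": interleave lines from the two fields into one frame.
frame = np.empty((h, w), dtype=np.uint8)
frame[0::2] = earlier[0::2]    # half the lines come from the earlier field
frame[1::2] = later[1::2]      # the other half come from the later field

print(frame)   # the object's left and right edges alternate between two
               # horizontal positions on successive lines: "combing"
```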
The alternatives to interlacing (assuming a directly-driven CRT display) are:
- Doubling the bandwidth used and transmitting full frames instead of each field. This produces little improvement in picture quality, since the effective resolution and flicker rate are the same.
- Using the same bandwidth, but transmitting progressive frames with half the amount of detail. The flicker rate remains the same.
- Using the same bandwidth, but transmitting a full progressive frame instead of every two fields. The eye suffers more fatigue (eye-strain) than when viewing the interlaced display, because the flicker rate halves.
- As above, but using a digital frame buffer to display each frame twice. This provides the same flicker rate as the interlaced signal, but with less smooth motion.
In modern monitors and television sets, interlacing is being slowly superseded as the refresh rate of non-interlaced CRTs increases beyond the level at which flicker can be detected. Non-scanning display devices such as LCDs and plasma screens are also becoming common. To display an interlaced signal on any non-interlaced device without "combing", a computationally expensive deinterlacing process is required.
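One of the simplest deinterlacing approaches is line doubling ("bob"), sketched below under the same toy NumPy assumptions as above; it avoids combing by building each output frame from a single field, at the cost of halved vertical resolution. Motion-adaptive and motion-compensated deinterlacers, which try to preserve full resolution in static areas, are what makes the process computationally expensive.

```python
import numpy as np

def bob_deinterlace(field: np.ndarray) -> np.ndarray:
    """Line doubling ("bob"): stretch a single field back to full frame height
    by repeating each of its lines. No data from the other field (captured at
    a different instant) is mixed in, so there is no combing, but the vertical
    resolution is only half that of a true progressive frame."""
    return np.repeat(field, 2, axis=0)

# Example: a 240-line field becomes a 480-line progressive frame.
field = np.zeros((240, 640), dtype=np.uint8)
print(bob_deinterlace(field).shape)   # (480, 640)
```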
Technical details
In an interlaced system, lines are drawn at a very slight diagonal slope, such that the right end of each line is two lines lower than the left end. The offset between the two fields is produced by giving the frame an odd total number of lines, so that the vertical flyback between the odd and even fields occurs halfway through a line. For example, in PAL, the blanking period starts after 292.5 lines of the odd field have been transmitted, and lasts for 20 lines. When scanning begins again at the top of the screen, the scanning beam is still halfway across the picture. Because of the slant, the centre top of the picture is one line above the line begun at the top left corner.
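The half-line offset follows directly from the line count and field rate. The short calculation below, a sketch using the PAL figures given in this article, shows why each field contains a half line.

```python
# PAL raster arithmetic: 625 lines per frame, 25 frames (50 fields) per second.
lines_per_frame   = 625
frames_per_second = 25

line_rate      = lines_per_frame * frames_per_second   # 15625 lines per second
line_period_us = 1e6 / line_rate                        # 64 microseconds per line

# With an odd total line count, each field contains a whole number of lines
# plus one half line, which is what staggers the two fields vertically.
lines_per_field = lines_per_frame / 2                   # 312.5

print(line_rate, line_period_us, lines_per_field)
```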
Interlacing is used by all the analogue TV broadcast systems in current use:
- PAL: 50 fields per second, 625 lines, odd field drawn first
- SECAM: 50 fields per second, 625 lines
- NTSC: 59.94 fields per second, 525 lines, even field drawn first
Interlacing as a data compression technique
When discussing television technology, it is important to note that it is used today to transmit two fundamentally different kinds of moving images: so-called film-based material, where the image of the scene is captured by the camera 24 times a second, and video-based material, where the image is captured 50 or ~60 times a second. The 50 and 60 Hz material captures motion very well and looks very fluid on the screen. The 24 Hz material captures motion satisfactorily in principle, but because it is usually displayed at least twice per captured frame in cinemas and on CRT TVs (to avoid flicker), it is not considered capable of conveying "fluid" motion. It nevertheless continues to be used for filming movies, because of the distinctive artistic impression that arises precisely from the slow image change rate.
25 Hz material looks and feels, for all practical purposes, the same as 24 Hz material. 30 Hz material sits between 24 Hz and 50 Hz material in terms of the "fluidity" of the motion it captures, but it is handled in TV systems similarly to 24 Hz material (i.e. displayed at least twice per captured frame). For further discussion of frame rates, see Moving image formats.
This section deals specifically with the transmission of 50/60 Hz material, also called video-based material, from the camera to the screen. Transmission of 24/25/30 Hz material has been described above.
For video-based material, interlacing is in fact an archaic lossy perceptual image compression technique: half the lines are dropped from each captured full frame by the compressor, achieving a fixed 2:1 compression ratio. The technique exploits the persistence-of-vision property of human perception, so that the eye and brain together act as the decompressor.
Interlacing as compression can be applied in both the analog and the digital domain.
For the purpose of reducing the bandwidth necessary to transmit the video-based material, interlacing is inferior to the modern digital block-based compression techniques for the following reasons:
- Interlacing performs poorly on moving images. Moving sharp objects suffer especially.
- Any decompression done in the display (as opposed to perceptually by the viewer) is computationally very expensive, particularly when tasked with avoiding or suppressing noticeable artifacts. Simple and widely used line-doubling decompression techniques produce artifacts that some viewers consider objectionable. All modern display technologies, apart from CRT, require uncompressed progressive images.
- Interlaced video material suffers badly when edited; put another way, the selection of effects that do not lead to quality loss on recompression is very restricted.
- Progressive MPEG is flexible and adaptive about which details of the image it compresses and by how much, while compression by interlacing does not discriminate according to the perceptual complexity of the image element being compressed. Moreover, the quality of consumer-grade deinterlacers varies widely, whereas an MPEG decoder is completely deterministic in decompressing a progressively compressed stream.
Interlacing can be, and unfortunately is, combined with other compression techniques in the digital domain. The combination of interlacing with block-based compression inherits all the drawbacks of interlacing while also reducing the efficiency of the block-based compression. Because interlacing samples every other line without prefiltering, it increases the amount of high-frequency components in the signal fed to the block transformation. This lowers the efficiency of the block transform (i.e. the DCT) or, alternatively, increases the amount of artifacts after decompression. It also decreases the effectiveness of the motion compensation used in interframe compression formats such as MPEG.
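A rough illustration of this point, as a hypothetical NumPy sketch: subsampling a fine vertical pattern by simply taking every other line (as interlacing does) leaves large line-to-line differences (aliasing), while low-pass filtering before subsampling removes most of that high-frequency energy and so leaves less for a block transform to encode.

```python
import numpy as np

# A column of 480 samples containing a fine vertical pattern (period of 3 lines),
# close to the Nyquist limit of the full-height frame.
rows = np.arange(480)
column = np.sin(2 * np.pi * rows / 3.0)

# Interlace-style subsampling: keep every other line, no prefiltering.
field = column[0::2]

# Filter-then-subsample: a small vertical low-pass before dropping lines.
prefiltered = np.convolve(column, [0.25, 0.5, 0.25], mode="same")[0::2]

# Mean absolute line-to-line difference as a crude proxy for the vertical
# high-frequency energy a block transform (e.g. DCT) would have to encode.
hf = lambda x: float(np.abs(np.diff(x)).mean())

print("every-other-line field  :", hf(field))        # large: aliased detail remains
print("prefiltered + subsampled:", hf(prefiltered))  # several times smaller
```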
When vertical color compression (also called decimation or color subsampling) is included in the combined compression system, it is effectively compressed further by the interlacing. Vertical color subsampling is almost always included in digital and analog television systems all over the world (with the exception of broadcast NTSC and, subject to controversy, broadcast PAL). Thus with the 4:2:0 color compression scheme (i.e. half horizontal and half vertical resolution), the vertical colour resolution drops from 1:2 to 1:4, and the overall color resolution from 1:4 to 1:8.
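The resolution bookkeeping behind those ratios can be made concrete with a small arithmetic sketch, following the figures used in this article (the 480-line frame height is an illustrative assumption).

```python
# Vertical chroma resolution under 4:2:0, with and without per-field subsampling.
luma_lines = 480                                  # vertical luma resolution of the frame

chroma_lines_progressive = luma_lines // 2        # 4:2:0 on a full frame: 1:2 vertically
chroma_lines_per_field   = (luma_lines // 2) // 2 # 4:2:0 applied per field: 1:4 vertically

print(chroma_lines_progressive, chroma_lines_per_field)   # 240, 120
```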
It is sometimes claimed that combining MPEG compression with interlacing cuts the processing power required from the MPEG decoder almost in half. However, this argument does not stand up against the immense processing power needed for unobjectionable deinterlacing of the image after the MPEG decompressor; and all modern displays except the (gradually disappearing) CRT require a progressive image as input.
Another argument is that combining interlacing with MPEG raises the overall "sweet spot" of the compression system. (Note, though, that the sweet spot does not come close to doubling, because of the inefficiencies described above.) Specifically, it makes it possible to transmit 1920x1080 60 Hz video over the broadcast bit pipe chosen for the ATSC system. However, essentially the same effect on the sweet spot, without the drawbacks of interlacing, could be achieved by simply prefiltering the high frequencies out before applying progressive MPEG compression, or, less efficiently, by filtering high-frequency components out of the compressed MPEG stream just before injecting it into the broadcast pipe. On the other hand, most DVB flavours (T, S) already offer a suitable bit pipe today, and a better terrestrial broadcasting technology could have been selected for ATSC as well.
Yet another argument concerns the roughly twofold increase in the technological complexity of cameras and production equipment for progressive video, due to twice the uncompressed bitrate at the moment of capture and twice the computational power needed for processing and compression. This argument will become less relevant as technology progresses, and TV producers of today or tomorrow may not be very sensitive to the capital costs. However, with interlacing still in use today, many works of art and television recordings of important events will continue to be made in it.
Despite these arguments, and calls by prominent technology companies such as Microsoft to leave interlacing to history, interlacing maintains a strong grip on the television standards-setting bodies and is still included in new digital video transmission formats, such as DV, DVB (including its HD variants), and ATSC, for the purpose of compressing video-based material.
See also
- Progressive scan: the opposite of interlacing; the image is displayed line by line.
- Deinterlacing: converting an interlaced video signal into a non-interlaced one
- Telecine: a method for converting film frame rates to television frame rates using interlacing
- Federal Standard 1037C: defines Interlaced scanning
- Interleaving
External links
- 100FPS.COM - Video Interlacing/Deinterlacing
- Stream Interlace and Deinterlace (planetmath.org)