Doug Kerr
Well-known member
In connection with modern video technology, we have a new wave of encounters with the terms "interlaced scan" and "progressive scan". Not surprisingly, there can be considerable confusion arising in this regard. I thought I would speak a little about the concept involved and some of the notation we currently encounter.
Two separate issues
First, let me alert the reader to the distinction between these two matters:
• The format in which a signal is presented in storage or in transmission.
• The operation of the display mechanism.
In the basic analog broadcast television format, and with traditional (normally CRT-based) receiver displays, these were the same thing. That is, the signal was developed in real time in the camera (by a raster scan process), transmitted on-the-fly, and the received video signal was laid on the screen in real time with the same process.
Flicker
Suppose we were trying to establish a transmission format for broadcast television (as the NTSC was doing in about 1940). Suppose we had decided to use a certain number of lines in our image and a certain frame repetition rate (these having been chosen in the light of bandwidth considerations). Suppose the frame rate chosen was 30 frames/s.
Firstly, we often hear that the interlaced format (which I will describe shortly) was developed because a frame rate of 30 frames/s, used in a straightforward way, was not sufficient to give the human eye a suitable "temporal resolution" with regard to motion. That is not so. Motion pictures, for example, had for many years operated at a frame rate of 24 frames/s, which gave perfectly satisfactory motion resolution.
Rather, the issue was that, if we laid out the image on the screen using the obvious form of raster scan, painting a frame every 1/30 second, and considering that the decay rate of the phosphors had to be short enough to not "blur" motion, the human eye would perceive flicker from the "pulsation" of the overall luminance at any region of the screen.
(The same was true of motion pictures; the eye would perceive flicker at a frame rate of 24 frames/s. So in fact the projected image was interrupted at a rate of 48 times per second by a rotating shutter. Each frame was projected once, the screen briefly blanked, and the frame projected again, so the pulsation of the overall light on the screen occurred at a rate of 48 Hz, which the eye would tolerate.)
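The arithmetic of that double-shutter trick can be made explicit with a trivial sketch (illustrative only; the variable names are mine):

```python
# The film projector's double shutter doubles the light-pulsation rate
# without changing the frame (motion) rate.
film_frame_rate = 24           # frames per second, the motion rate
exposures_per_frame = 2        # each frame shown, blanked, shown again

pulsation_rate = film_frame_rate * exposures_per_frame
print(pulsation_rate)  # 48 (Hz) -- high enough that the eye tolerates it
```

The point is that flicker tolerance depends on the pulsation rate of the light, not on the frame rate; the shutter raises the former while leaving the latter alone.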
That approach would not work in the broadcast TV setting. For one thing, it would require that at the receiver, each frame be held, unchanging, on the screen for 1/30 second, and then the visible output of the screen be blanked twice for each frame.
Now, although improvement of the perceived motion resolution was not the driver for the adoption of the interlaced format, it turns out that, in many cases, an improvement in motion resolution does occur. (In effect, during motion, we have a half-vertical-resolution image at double the "frame" rate.) But there can also be some unhappy artifacts of this. A discussion of those is beyond the scope of this note.
The interlaced format concept
Rather, the concept of interlaced scan was adopted. Here, over a period of 1/60 second, the camera scanned, sequentially, all the odd-numbered lines of the raster. These were sent out in real time. At the receiver, these lines were laid down (in real time) at a pitch of twice the image line pitch (essentially creating all the odd-numbered lines of the image). Then, in the next 1/60 second, the camera scanned all the even-numbered lines of the image, which were transmitted in real time. At the receiver, these were laid down at a pitch of twice the image line pitch, displaced by one line from the first pass, thus completing the image.
One subset of lines (odd or even) was called a field, while the entire set of lines was called a frame.
Thus the frame rate of the original NTSC format (before color) was 30 frames/sec; the field rate was 60 fields/sec.
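The field/frame relationship can be sketched in a few lines of Python (a minimal illustration of the concept, not any broadcast API; function names are mine):

```python
def split_fields(frame):
    """Split a frame (a list of scan lines) into its two fields.

    Uses 1-based line numbering, so frame[0] is line 1, an odd line.
    Returns (odd_field, even_field).
    """
    odd_field = frame[0::2]   # lines 1, 3, 5, ... -- sent in the first 1/60 s
    even_field = frame[1::2]  # lines 2, 4, 6, ... -- sent in the next 1/60 s
    return odd_field, even_field

def weave_fields(odd_field, even_field):
    """Interleave two fields back into a full frame, as the display does
    by laying each field down at twice the image line pitch, offset by one."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

frame = ["line1", "line2", "line3", "line4"]
odd, even = split_fields(frame)
print(odd)                            # ['line1', 'line3']
print(even)                           # ['line2', 'line4']
print(weave_fields(odd, even) == frame)  # True
```

Note that in real interlaced capture the two fields are scanned 1/60 second apart, so during motion they do not depict quite the same instant; that temporal offset is the source of both the motion-resolution gain and the artifacts mentioned earlier.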
Non-CRT television displays
In the modern era, in many non-CRT television display systems (even in the context of analog TV), an entire frame is built up in a frame buffer and then loaded into the display mechanism itself. That loading may be done in a "scan" fashion (rather than all at once), but its organization may not follow the field structure. (This is reminiscent of the various ways a digital camera sensor can be read out, or a digital instrument panel in a car can be driven.)
Progressive Scan
The term progressive scan implies that, in transmission, and/or in the working of a display mechanism (recall that these are separate issues), all the lines of a frame are laid down sequentially.
Digital High-Definition TV
The ATSC broadcast television standards (covering digital TV broadcast) make provisions for numerous transmission formats. These have a range of pixel dimensions, include transmission organized under both the progressive scan and interlaced scan paradigms (as these appear in the context of digital video representations, such as the various MPEG forms), and different frame rates.
A popular transmission format, and the "highest" one commonly used today, is often referred to as just "1080i", implying 1080 lines, interlaced format. It is normally used at a frame rate of 30 frames/s, which, since it is an interlaced format, implies a field rate of 60 fields/s. Sometimes this is shown as "1080i60" (in American practice, it is the field rate that is cited; for a non-interlaced format, such as "720p", that is identical to the frame rate, but for an interlaced format, it is twice the frame rate). In European practice, it is the frame rate that is always cited, with a slight difference in presentation format as a cue to that. Thus the format above would be called, in the European convention, "1080i/30". (We will use that convention here for most technical descriptions.)
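The relationship between the two notations can be captured in a small helper (a hypothetical function of my own devising, not part of any standard): for an interlaced format the American label cites the field rate, which is twice the frame rate the European label cites.

```python
def american_to_european(label):
    """Convert an American-style format label to the European convention.

    American practice cites the field rate for interlaced formats;
    European practice always cites the frame rate, after a slash.
    e.g. '1080i60' -> '1080i/30', '720p60' -> '720p/60'.
    """
    for i, ch in enumerate(label):
        if ch in "ip":
            lines, scan, cited_rate = label[:i], ch, int(label[i + 1:])
            break
    # For interlace, the cited (field) rate is twice the frame rate.
    frame_rate = cited_rate // 2 if scan == "i" else cited_rate
    return f"{lines}{scan}/{frame_rate}"

print(american_to_european("1080i60"))  # 1080i/30
print(american_to_european("720p60"))   # 720p/60
```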
"1080i" television receivers
We often see a certain TV receiver advertised as "1080i". What does that mean? Usually it means that the receiver has a native display mechanism resolution of 1080 lines, and that it can receive ATSC broadcasts in the 1080i format. Assuming that we are not speaking of a CRT-based display, the "i" does not mean that the display operates on an interlaced basis; modern TV receiver display chains always have a full frame buffer, and load that full frame into the display mechanism itself in a variety of ways.
1080p transmission
The latest update of the ATSC standard provides for a new transmission format (using MPEG-4 (H.264) encoding) operating at 1080 lines, progressive format (that is, not interlaced), at a frame rate of 60 frames/sec.
The normal US technical description of this would be 1080p60 (somewhat ambiguous). The less-ambiguous European notation would be 1080p/60. A common "marketing" description of this format is "1080p" (totally ambiguous).
"1080p" TV receivers
We often read of TV receivers that are said to be "1080p". What does that mean?
Recall that this does not mean that their display mechanisms are "progressive scan". This is essentially meaningless for non-CRT TV receivers.
Normally, that designation means that the receiver is prepared to deal with what can be called (using the unambiguous European notation) a "1080p/60" signal: 1080 lines of resolution, transmitted non-interlaced, at 60 frames/sec. It is the 60 frame/sec display capability that is of importance in such a receiver; the receivers spoken of as "1080i" are usually only capable of displaying 30 frames (1080 lines each) per second.
The intent is that such receivers will be capable of exploiting a 1080p/60 signal, since it has been expected that such will be transmitted in the future as a "better motion" format, perhaps ideal for sporting events.
So, now that a 1080p/60 format has been included in the ATSC standard, and it is expected that transmission in this format will start to happen, will these sets be able to receive and display that?
For most such sets available at the present time, no. How can that be?
Because they have only MPEG-2 decoders, and MPEG-4 (H.264) encoding is prescribed by the ATSC standard for the new 1080p/60 format.
It is of course expected that new high-end TV receivers will provide for the ATSC 1080p/60 format.
Best regards,
Doug