Sensor size in the form "1/1.8 inches"

Doug Kerr

Well-known member
We often see, in connection with smaller-sensor cameras, the sensor size described as, for example:

1/1.8 inch

What does that mean?

This notation follows a repulsive and tortured convention which I will describe shortly.

Under that convention, when we see a sensor size described as:

1/D inches ["D" is evocative of "denominator"]

it means that the diagonal dimension of the sensor is approximately 16.5/D mm.

Thus, the camera mentioned above would be expected to have a sensor with a diagonal dimension of about 9.2 mm (perhaps it would be 7.4 x 5.5 mm in size).
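If it helps to see that concretely, here is a minimal Python sketch of the conversion just described; the 16.5 constant and the assumed 4:3 aspect ratio come from this discussion, and the function name is simply mine:

```python
# Sketch of the "1/D inch" rule described above. The 16.5 constant and the
# assumed 4:3 aspect ratio are taken from this discussion, not from a standard.

def optical_format_to_mm(d, constant=16.5, aspect=(4, 3)):
    """Given the D in a '1/D inch' designation, estimate the sensor's
    diagonal, width, and height in millimetres."""
    diagonal = constant / d
    w, h = aspect
    unit = diagonal / (w ** 2 + h ** 2) ** 0.5   # mm per "aspect unit"
    return diagonal, w * unit, h * unit

diag, width, height = optical_format_to_mm(1.8)   # the "1/1.8 inch" size
print(f"diagonal ~ {diag:.1f} mm, roughly {width:.1f} x {height:.1f} mm")
# diagonal ~ 9.2 mm, roughly 7.3 x 5.5 mm
```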

Where on Earth does that come from? Here's the history.

The vidicon video pickup tube was a breakthrough in video cameras in that it allowed the construction of cameras of small size, with a relatively simple supporting infrastructure for the pickup tube. Such cameras were initially hailed as important in the industrial and security fields.

One of the important early vidicon tubes had a target (what we would today call a sensor) 8.8 x 6.6 mm in size. This was of course an analog sensor. It was housed in a glass envelope approximately 2/3" in diameter. We see such a vidicon tube here:

[Image: Vidicon tube ("2/3 inch" size), from Wikimedia Commons, used under the terms of the GNU Free Documentation License]

Soon another vidicon emerged with a target about 45% larger in each linear dimension, about 12.8 x 9.6 mm. It was housed in a glass envelope approximately 1" in diameter. These came into use in cameras that were typically a bit larger than the first kind, and afforded improved performance.

Most of the technicians working with these cameras were often not aware of the actual target dimensions of the vidicon tubes in them. To them, the most obvious difference between the smaller cameras and the larger ones was the physical size of the vidicon tubes.

In much the same way that the size of TV receivers and video monitors was described in terms of the nominal overall diameter of the glass envelope of the cathode ray tubes they used for display, it became common to speak of the two categories of "industrial" TV cameras as the “2/3 inch tube” size and the “1 inch tube” size.

This form of notation for video camera target sizes came to be called the optical format convention. (Sometimes the constant was 16 rather than 16.5, the result of using the "1 inch" vidicon tube, whose target diagonal was 16 mm, as the benchmark.)

When digital still photography emerged in the consumer realm, the most common cameras had sensors which were quite small - perhaps as small as 4.0 x 3.0 mm (0.157 x 0.118 inch, a diagonal size of 0.196 inch), much smaller than even the smallest still camera film frame in any regular use. The manufacturers were concerned that if the consumers realized this, they would think that these cameras were just photographic toys. Thus, they adopted a ploy that would allow them to state the sensor size in their specifications but not in a way that the consumer could readily interpret.

So they borrowed the "system" used for vidicon tubes, where an 8.8 x 6.6 mm sensor was spoken of as the "2/3 inch" size (0.67"). By extrapolating that convention, they felt that they would be justified in describing a 4.0 x 3.0 mm sensor as the "0.303 inch" size (about 1.55 times the actual diagonal).

That certainly sounded better than “0.196 inch”, but not much. They needed to improve their subterfuge.

Well, of course, in the "vidicon" convention the "size" of the smaller tube was expressed as a fraction (2/3), not as a decimal. So, inspired by that, and arbitrarily forcing the numerator to 1, the manufacturers replaced "0.303 inch" (for example) with "1/3.3 inch". "3.3" certainly sounded better than "0.303".
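For illustration, here is a small Python sketch of that forward procedure; the 11 mm = "2/3 inch" benchmark is taken from the vidicon history above, while the helper name and the rounding are my own assumptions:

```python
# Sketch of the procedure described above: take the true diagonal, scale it up
# per the vidicon benchmark (an 11 mm diagonal being called "2/3 inch"), then
# quote the result as a fraction with numerator 1.

def mm_to_optical_format(width_mm, height_mm):
    diagonal_mm = (width_mm ** 2 + height_mm ** 2) ** 0.5
    scale = (2 / 3 * 25.4) / 11.0                  # about 1.54
    nominal_inches = (diagonal_mm / 25.4) * scale  # e.g. 0.303"
    return f"1/{1 / nominal_inches:.1f} inch"      # e.g. "1/3.3 inch"

print(mm_to_optical_format(4.0, 3.0))   # 1/3.3 inch
print(mm_to_optical_format(8.8, 6.6))   # 1/1.5 inch (the "2/3 inch" vidicon)
```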

Of course, once they did that, as the cameras gained larger and larger sensors, the "number" (that is, the denominator) grew smaller. They explained that by pointing out that, as the denominator got smaller, the fraction itself got larger. They just didn't want anybody to actually work that fraction out as a decimal (even though that decimal was already about 1.55 times the actual diagonal).

Thus, we often see digital cameras spoken of as having "1/2.8 inch" or "1/1.8 inch" sensors.

Tracking this back to the vidicon convention, in which a "2/3 inch" tube could be said to be a "1/1.5 inch" tube, we get the relationship I mentioned at the outset, which I will restate here:

When we see a sensor size described as:

1/D inch

it means that the diagonal dimension of the sensor is approximately 16.5/D mm.

Trying this out on the ancestral godfather of this convention, the original small vidicon, we find this:

• Size notation: 2/3 inch

• Converted to the "denominator" convention: 1/1.5 inch (D=1.5)

• 16.5/1.5 = 11 mm

That is in fact the diagonal dimension of an 8.8 x 6.6 mm sensor - the size of the target in a "2/3 inch" vidicon. So the formula seems to work out all right.

Thus, when we see it said of a camera that its sensor is "1/2.8 inch" in size, we might expect the sensor to have a diagonal dimension of about 5.9 mm.

Now, in other realms, under a different and almost as repulsive convention (the "crop factor", which compares the sensor's diagonal with the roughly 43.3 mm diagonal of the full-frame 36 x 24 mm format), that might be described as a 7.3x sensor!
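For the curious, a rough sketch of that arithmetic, reusing the 16.5 constant from above and the usual 36 x 24 mm full-frame reference (the numbers here are my own calculation):

```python
# Sketch of the "crop factor" arithmetic: compare the sensor diagonal with the
# diagonal of the full-frame 36 x 24 mm format (about 43.3 mm).

FULL_FRAME_DIAGONAL_MM = (36 ** 2 + 24 ** 2) ** 0.5   # about 43.27 mm

def crop_factor(d, constant=16.5):
    diagonal_mm = constant / d        # the "1/D inch" rule again
    return FULL_FRAME_DIAGONAL_MM / diagonal_mm

print(f"{crop_factor(2.8):.1f}x")     # a "1/2.8 inch" sensor: about 7.3x
```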

Revolting, isn't it!

Best regards,

Doug
 