Hi, Bart,
Is it possible that the "native sensitivity" of a sensor is understood to be the sensitivity setting at which the maximum digital output of the ADC corresponds to "saturation" of the photodetector itself (the photometric exposure at which essentially all of its initial charge has been dissipated)?
That would be a meaningful property of the sensor system.
It would be the lowest workable sensitivity.
Hi Doug,
Indeed, AFAIK there is no universally adopted formal definition of native sensitivity. However, since basically all silicon-based photovoltaic sensors have a given sensitivity to light, while the added features (circuitry, gates, masks, CFA, microlenses) change the quantum efficiency, I use the setting that produces the maximum dynamic range as the determinant of native sensitivity.
As an example, with my 1Ds2 I can maximize DR by setting the ISO (= gain) to ISO 'L', which is effectively approx. ISO 75-80, although the exposure meter assumes ISO 50. The Raw read noise level is lower than at ISO 100, and the saturation level is the same, so the maximum DR (engineering definition) and thus the native sensitivity is ~ISO 80. On my 1Ds3, however, ISO 'L' and ISO 100 both result in virtually identical read noise and saturation levels, so the best DR is at the native sensitivity of ISO 100.
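To illustrate the comparison, here is a minimal sketch of the engineering definition of dynamic range (saturation level divided by read noise, expressed in stops). The ADU values below are made-up placeholders for illustration only, not measured 1Ds2/1Ds3 data:

```python
import math

def engineering_dr_stops(saturation_adu, read_noise_adu):
    """Engineering dynamic range in stops: log2(saturation / read noise)."""
    return math.log2(saturation_adu / read_noise_adu)

# Hypothetical numbers (assumed, not measurements):
# at ISO 'L' the read noise drops while the saturation level stays
# the same, so DR increases -- and the setting with the maximum DR
# marks the native sensitivity.
saturation = 15000.0       # saturation level in ADU (assumed)
read_noise_iso100 = 6.0    # read noise in ADU at ISO 100 (assumed)
read_noise_isoL = 5.0      # read noise in ADU at ISO 'L' (assumed, lower)

dr_iso100 = engineering_dr_stops(saturation, read_noise_iso100)
dr_isoL = engineering_dr_stops(saturation, read_noise_isoL)
assert dr_isoL > dr_iso100  # lower read noise, same saturation -> more DR
```

With identical read noise and saturation at both settings (the 1Ds3 case), the two DR values come out equal, and the native sensitivity is simply the metered ISO 100.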
The numbers also coincidentally come close to the findings at dxomark.com for the maximum DR (ISO 84 and 73 for the 1Ds Mark II and III, respectively).
BTW, the (old model) 5D, according to DxO, reaches its maximum DR at ISO 92.
Cheers,
Bart