[Repost][2010-04-07] Gamma – 2.0, 2.2, or 2.4?

Published: 2023-07-31 00:42:37 · Author: 4thirteen2one

BT.709 refers to an international standard, first adopted in 1990, for HDTV.
The standard defines the colours (chromaticities) of the red, green, and blue primaries and the white point (CIE D65).
These specifications are well established and widely used.
How to accommodate wide color gamut is a challenge that might require some changes to BT.709 in the near future, but let's leave that for a later installment.

In addition to specifying chromaticities, BT.709 also purports to specify nonlinear image coding.
I have written extensively elsewhere about this aspect of BT.709.
The short story is that the BT.709 story is wrong.
As written, BT.709 documents camera characteristics, but what is needed is specification of a reference display.
Without a reference display standard, there is no reliable mechanism to establish creative intent.
We need a new standard that specifies characteristics comparable to those of a broadcast video monitor ("BVM").
A classic BVM has a gamma of about 2.4: Input R'G'B' signals from 0 to 100 units are scaled by 1/100, raised to the power 2.4, then scaled to the absolute luminance established for white.
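As a minimal sketch of that mapping (in Python, with function and parameter names of my own choosing, not from any standard):

```python
def bvm_eotf(code_value, white_nt=100.0, gamma=2.4):
    """Signal-to-luminance mapping of a classic BVM-style display.

    `code_value` is an R', G', or B' signal on the 0-100 scale used
    above; the names and defaults here are illustrative only.
    """
    normalized = code_value / 100.0   # scale by 1/100
    linear = normalized ** gamma      # raise to the power 2.4
    return linear * white_nt          # scale to absolute luminance (nt)

print(bvm_eotf(100))            # -> 100.0 (reference white)
print(round(bvm_eotf(50), 2))   # -> 18.95 (a mid signal maps well below half luminance)
```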

If you are a home theatre calibrator, you might at this point be saying: "But I align my customers’ displays to gamma of 2.2, not 2.4! Am I doing it wrong?" Well, perhaps not, but some explanation is in order.

HD programming was historically approved on "BVMs" – B, V, and M are the first three letters of the part numbers of a series of Sony displays: A Hollywood studio might routinely master on a Sony BVM-D32E1WU.
Such a reference display would historically produce white luminance of 100 nt.
Twenty years ago, content would have been approved in an ambient illuminance of perhaps 6 lx with a “dim” surround of perhaps 10% of reference white.
Today, final approval is done in very dark conditions, with ambient illuminance 1 lx or less, and surround luminance perhaps just 1% of white luminance.

Colors change appearance depending upon absolute luminance, and upon their surroundings.
A very dark surround at mastering will "suck" color out of a presentation previously viewed in a light surround.
A colorist will dial in an increase in colorfulness (for example, by increasing chroma gain).
The intended appearance for an HD master is obtained through a 2.4-power function, to a display having reference white at 100 nt, with 1 lx ambient, and 1% surround – but that appearance will not be faithfully presented in different conditions!
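For concreteness, "increasing chroma gain" amounts to scaling the colour-difference components about the neutral axis. A minimal sketch, assuming a signed Y'CbCr representation (the 10% gain is an arbitrary illustrative value):

```python
def apply_chroma_gain(y, cb, cr, gain=1.1):
    """Increase colorfulness by scaling Cb and Cr about neutral.

    Assumes Cb and Cr are centred on zero (signed representation),
    so greys (cb == cr == 0) are left untouched.
    """
    return y, cb * gain, cr * gain
```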

The key point concerning the consumer's gamma is this: What we seek to maintain at presentation is the appearance of the colors at program approval, not necessarily the physical stimuli.
If the consumer's display and viewing conditions differ from those at mastering, we may need to alter the image data to preserve appearance.

In a home theatre environment, you might set the consumer's display to 100 nt, matching the approval luminance.
However, ambient conditions in a consumer environment – even a rather dark home theatre – are somewhat lighter than those typically used for mastering today.
The lighter conditions cause a modest increase in contrast and colorfulness, beyond that witnessed at content creation.

If the power function on R'G'B' – display gamma – is dialed back a little, that contrast and colorfulness are reduced.
At about 300 nt, with ambient illuminance of 5 or 10 lx, and with a surround of say 5%, decreasing gamma from 2.4 to 2.2 will visually compensate for the effect.
So, if your consumer has such an environment, I recommend gamma of 2.2.
If your customer prefers to display the same imagery at 48 nt, in darkness (zero ambient illuminance), with a 0% surround, then gamma of 2.6 (as in digital cinema) might be appropriate.
In a really, really bright environment, or with a really bright display (say 400 nt or 500 nt), decreasing gamma to 2.0 might be appropriate.
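Taken together, those recommendations trace a rough curve: the brighter the display and environment, the lower the gamma. Purely as a sketch (the piecewise interpolation in log luminance is my own assumption, and it ignores ambient and surround, which matter just as much):

```python
import math

# Operating points from the discussion above: (peak white in nt, gamma).
POINTS = [(48.0, 2.6), (100.0, 2.4), (300.0, 2.2), (500.0, 2.0)]

def suggested_gamma(white_nt):
    """Interpolate a display gamma from peak white luminance,
    piecewise-linearly in log luminance between the operating points."""
    if white_nt <= POINTS[0][0]:
        return POINTS[0][1]
    if white_nt >= POINTS[-1][0]:
        return POINTS[-1][1]
    for (x0, g0), (x1, g1) in zip(POINTS, POINTS[1:]):
        if white_nt <= x1:
            t = math.log(white_nt / x0) / math.log(x1 / x0)
            return g0 + t * (g1 - g0)

print(round(suggested_gamma(100), 2))   # -> 2.4
print(round(suggested_gamma(200), 2))   # -> 2.27, between the 2.4 and 2.2 points
```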

How ten or twenty measurements of the greyscale curve can be distilled into a single gamma number is another issue.
I'm not enthusiastic about the EBU recommendation for the calculation: EBU Tech. 3325 (on page 34) calls for subtraction of the base luminance Lmin; subtracting that bias is a mistake in my view.
In the EBU document – and in most home theatre calibration packages – all of the measurements are effectively normalized by the luminance at reference white; however, that normalization gives the 100% measurement undue weight: That particular measurement is very likely to exhibit some saturation droop.
On the other hand, we can't complain too much about the EBU technique, because SMPTE and ITU fail to give any guidance on computing effective gamma.
I'll save the remainder of this argument for a future piece.
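To make the complaint concrete, here is one reasonable alternative (my sketch, not the EBU calculation, and not anything the standards prescribe): an ordinary least-squares fit of log luminance against log signal. Fitting the scale factor along with the exponent means no Lmin is subtracted, and the 100% measurement is just one sample among many:

```python
import math

def fit_gamma(signals, luminances):
    """Fit L = k * signal**gamma by least squares in log-log space.

    `signals` are stimulus levels on a 0..1 scale; the 0% black
    measurement must be excluded, since log(0) is undefined.
    Returns the slope of the fit, which is the effective gamma.
    """
    xs = [math.log(s) for s in signals]
    ys = [math.log(l) for l in luminances]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Made-up measurements of an ideal gamma-2.4 display, 10% to 100% stimulus:
signals = [i / 10 for i in range(1, 11)]
luminances = [100.0 * s ** 2.4 for s in signals]
print(round(fit_gamma(signals, luminances), 3))   # -> 2.4
```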

As always, your comments are welcome.


Original article: Gamma_2.0_2.2or2.4?

EBU Tech. 3325 (European Broadcasting Union), Methods for the Measurement of the Performance of Studio Monitors.


In brief, this article explains the gamma values 2.0, 2.2, and 2.4 and the viewing scenarios each suits:

  1. Display gamma should be chosen so that the colors the display presents reproduce, as closely as possible, the color appearance at which the content was approved.
  2. Content approval is typically done in a very dark environment on a 100 nt reference monitor.
  3. Ordinary consumers usually watch in environments that are not as dark as the mastering environment, and on displays brighter than 100 nt.
  4. As a result, what consumers see often fails to reproduce what the creators approved during grading, because the picture appears brighter.
  5. Roughly: 48 nt, gamma 2.6; 100 nt, gamma 2.4; 300 nt, gamma 2.2; 500 nt, gamma 2.0.