From: Mihai Cartoaje on
I wrote a program to measure the gamma for the computer I bought in 1996,
and it was linear to within experimental error.

There's this from Pennebaker & Mitchell 1993 p.26:

"The capture device is typically the most linear part of the system, in
that the quantum efficiency (the number of electrons emitted for each
photon arriving from the image) is quite constant in a well-designed and
well-calibrated imaging device. (Quantum efficiency is usually dependent
on the color of the light.)

The output device, especially if it is a CRT display, is often quite
nonlinear. A gamma correction is often applied to displays to correct for
the nonlinear characteristics of the electron gun and to make the
visually perceived output fairly linear. In critical applications the
device characteristics should be measured and the input corrected to
achieve a linear response."

Intel graphics chips also have linear gamma as the default setting:
http://www.intel.com/support/graphics/sb/cs-013542.htm

From: Thomas Richter on
Mihai Cartoaje wrote:
> I wrote a program to measure the gamma for the computer I bought in 1996,
> and it was linear to within experimental error.

How do you measure gamma without a colorimeter? Or rather, the gamma of
what? Of the overall graphics card-monitor-eye system? That is *not*
quite the gamma of the monitor. A CRT monitor has a gamma of around
2.2, and LCDs use a lookup table to match that gamma, even though the
display technology by itself is close to linear.

However, your eye has a logarithmic sensitivity curve which is close
to, but not identical to, the inverse of the monitor gamma - thus the
overall monitor-eye system is not too far from linear.
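
(One way to see the "close, but not identical" part, as a rough sketch
assuming an idealized 2.2 power law for the monitor: a truly logarithmic
eye would want the relative luminance step between adjacent code values
to be constant; under a power law it is nearly constant at the bright
end but grows sharply near black.)

    /* Relative luminance step between adjacent 8-bit codes under a 2.2
     * power law.  A logarithmic sensitivity curve would want this ratio
     * constant; it nearly is for bright codes but not near black. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        int codes[] = { 1, 16, 64, 128, 192, 254 };
        for (int i = 0; i < 6; i++) {
            int v = codes[i];
            double step = pow((v + 1) / 255.0, 2.2)
                        / pow(v / 255.0, 2.2) - 1.0;
            printf("code %3d -> %3d : luminance step %.1f%%\n",
                   v, v + 1, 100.0 * step);
        }
        return 0;
    }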

> There's this from Pennebaker & Mitchell 1993 p.26:
>
> "The capture device is typically the most linear part of the system, in
> that the quantum efficiency (the number of electrons emitted for each
> photon arriving from the image) is quite constant in a well-designed and
> well-calibrated imaging device. (Quantum efficiency is usually dependent
> on the color of the light.)

That much is correct, to my knowledge.

> The output device, especially if it is a CRT display, is often quite
> nonlinear. A gamma correction is often applied to displays to correct for
> the nonlinear characteristics of the electron gun and to make the
> visually perceived output fairly linear. In critical applications the
> device characteristics should be measured and the input corrected to
> achieve a linear response."

To my knowledge, the gamma correction is not really applied to the
display. Instead, the signal is coded and transmitted gamma-corrected.
For a regular TV transmission, this correction is applied at the sender
(the TV studio), as far as I know. This makes the overall
camera-gamma-correction-transmission-monitor system linear.
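
A rough sketch of that chain, assuming a plain 2.2 power law for both
the correction and the CRT (the real Rec. 601 transfer curve has a
linear segment near black, which this ignores):

    /* Sender-side gamma correction: the studio encodes the linear
     * camera signal as v = L^(1/2.2); the CRT then reproduces
     * L_out = v^2.2, so the end-to-end chain is (about) linear. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double gamma = 2.2;                      /* assumed CRT exponent  */
        for (double L = 0.0; L <= 1.001; L += 0.1) {
            double v     = pow(L, 1.0 / gamma);  /* applied at the sender */
            double L_out = pow(v, gamma);        /* applied by the CRT    */
            printf("scene %.2f  signal %.3f  screen %.2f\n", L, v, L_out);
        }
        return 0;
    }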

> Intel graphics chips also have linear gamma as the default setting:
> http://www.intel.com/support/graphics/sb/cs-013542.htm

No doubt about it.

The question is (or was): which primary colors are used for JFIF? Does
a display application (such as Photoshop) really implement a Rec. 601
(the JFIF primaries as standardized) to sRGB (Rec. 709) transformation?
I actually don't know, but I suspect that most if not all programs use
the sRGB primaries and gamma for interpreting JFIF, *not* the Rec. 601
colors and gamma as they should.
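
For reference, the YCbCr-to-RGB step a JFIF decoder performs uses the
Rec. 601 luma weights with full-range levels (a sketch, assuming the
usual constants derived from 0.299/0.587/0.114); whether the resulting
RGB is then converted from Rec. 601 primaries to sRGB, or simply handed
to the display as sRGB, is exactly the open question above.

    /* Full-range Rec. 601 YCbCr -> RGB as used for JFIF decoding. */
    #include <stdio.h>

    static void ycbcr_to_rgb(double Y, double Cb, double Cr,
                             double *R, double *G, double *B)
    {
        *R = Y + 1.402    * (Cr - 128.0);
        *G = Y - 0.344136 * (Cb - 128.0) - 0.714136 * (Cr - 128.0);
        *B = Y + 1.772    * (Cb - 128.0);
    }

    int main(void)
    {
        double R, G, B;
        ycbcr_to_rgb(76.0, 85.0, 255.0, &R, &G, &B);  /* roughly pure red */
        printf("R=%.1f G=%.1f B=%.1f\n", R, G, B);
        return 0;
    }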

Greetings,
Thomas

From: Mihai Cartoaje on
The gamma of the visual output compared to the numerical values. I
wrote an svgalib program that displayed two rectangles on the screen,
one on the left and one on the right. One rectangle was a uniform gray
color, which I denote A. The other rectangle had alternating lines of
another gray color, which I denote B, and black. By pressing the arrow
keys, I could change the value of A until both rectangles looked the
same color from a distance. When the program quit, it displayed my
choice for the value of A. I did this test for multiple values of B and
found A to be approximately B/2.
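
The arithmetic behind that reading, as a sketch: the line pattern
averages (optically) to half of B's luminance, so the match A satisfies
(A/B)^gamma = 1/2, i.e. gamma = ln(1/2)/ln(A/B). A of about B/2 gives a
gamma of about 1; on a 2.2-gamma display the match would instead land
near A = 0.73*B.

    /* Infer the display exponent from the half-brightness match:
     * a pattern of B and black averages to half of B's luminance,
     * and matching a solid gray A means (A/B)^gamma = 1/2. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double B = 200.0;   /* code value of the lined gray (example) */
        double A = 100.0;   /* matched solid gray, here taken as B/2  */
        double gamma = log(0.5) / log(A / B);
        printf("implied gamma = %.2f\n", gamma);  /* 1.00 for A = B/2 */
        return 0;
    }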

My understanding is that video cards apply a gamma correction that
makes the visual output linear.

If your copy of X.Org is like mine, it changes the video card's gamma
from 1.0 to 2.2, unless there is a Gamma line in the "Monitor"
section.
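
A sketch of the kind of lookup ramp such a setting ends up loading into
the card (assuming the usual X convention that a Gamma value g maps
code i to (i/255)^(1/g), so 1.0 is the identity ramp and 2.2
compensates a 2.2 CRT):

    /* Build a 256-entry, 16-bit gamma ramp of the kind X loads into
     * the video card (e.g. via XF86VidModeSetGammaRamp). */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double g = 2.2;
        unsigned short ramp[256];
        for (int i = 0; i < 256; i++)
            ramp[i] = (unsigned short)
                      (65535.0 * pow(i / 255.0, 1.0 / g) + 0.5);
        printf("ramp[1]=%d ramp[128]=%d ramp[255]=%d\n",
               ramp[1], ramp[128], ramp[255]);
        return 0;
    }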

> However, your eye has a logarithmic sensitivity curve which is close,
> but not identical to the inverse of the monitor gamma - thus the overall
> monitor-eye system is not too far from linear.

This might depend on the opening of the iris. Video compression folks
(e.g., the mplayer man page) write that contrast sensitivity is lower
in dim or bright regions.
From: glen herrmannsfeldt on
In comp.compression Mihai Cartoaje <mcartoaje(a)gmail.com> wrote:
(snip)

> My understanding is that video cards apply a gamma correction that
> makes the visual output linear.

I am not so sure what video cards do. Going back to the early days of
television, gamma correction was applied to the video signal. It is
easier to do it once at the transmitter than in every TV set.

One incompatibility of the NTSC color signal with B&W TVs is in the
gamma correction. It is applied separately to each of the R, G, and B
signals before the matrix that converts them to the luminance and
subcarrier signals. The luminance signal (as seen by B&W receivers) is
right for a white or gray signal, but not quite right for a brightly
colored signal. It was considered close enough, though the difference
was documented.
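
A small numeric sketch of that difference, assuming an idealized 2.2
power law instead of the actual NTSC transfer curve: for a gray the
transmitted luma reproduces the true luminance exactly on a B&W set,
while a saturated red comes out noticeably darker than it should.

    /* Gamma correction per channel before the NTSC matrix: compare the
     * luminance a B&W receiver reproduces with the true luminance. */
    #include <math.h>
    #include <stdio.h>

    static void check(const char *name, double R, double G, double B)
    {
        double gamma  = 2.2;
        double Y_true = 0.299 * R + 0.587 * G + 0.114 * B;
        double Rp = pow(R, 1.0 / gamma);
        double Gp = pow(G, 1.0 / gamma);
        double Bp = pow(B, 1.0 / gamma);
        double Yp = 0.299 * Rp + 0.587 * Gp + 0.114 * Bp; /* transmitted */
        printf("%-5s true %.3f  on B&W set %.3f\n",
               name, Y_true, pow(Yp, gamma));
    }

    int main(void)
    {
        check("gray", 0.5, 0.5, 0.5);  /* reproduced correctly */
        check("red",  1.0, 0.0, 0.0);  /* reproduced too dark  */
        return 0;
    }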

>> However, your eye has a logarithmic sensitivity curve which is close,
>> but not identical to the inverse of the monitor gamma - thus the overall
>> monitor-eye system is not too far from linear.

> This might depend on the opening of the iris. Video compression folks
> (mplayer man page) write that the contrast sensitivity is lower for
> dim or bright regions.

It might also depend on how much of the visual field is involved.
Normally your eye can correct for the average color of a light source
that isn't exactly white, but not when only part of the visual field is
affected. The correction doesn't work for slide projectors or TV
pictures.

-- glen