What is gamma correction in image processing

What is gamma and gamma-correction? What is it for? Why should I care about it?

You may have already run into these questions if you ever tried to get a correct image display, in the simplest case on your own monitor.

Gamma is the exponent of the nonlinear intensity reproduction. Not very clear, is it? Rather, hardly understandable at all :) So let's begin from the start, from where gamma plays the main part: gamma correction.

Gamma correction is the process of encoding and decoding luminance or tristimulus values. This coding is carried out according to a formula that, in its general form, looks like:

Vout = Vin^γ

where Vout is the resulting value, Vin is the original value, and γ is our gamma, i.e. the power to which the original value is raised to obtain the desired result.
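The formula can be sketched in a couple of lines of Python. A small caveat: the 2.2 used below is just the conventional example value for display gamma, not something fixed by the formula itself.

```python
def gamma_decode(v_in, gamma=2.2):
    """Apply Vout = Vin ** gamma to a normalized value in [0, 1]."""
    return v_in ** gamma

def gamma_encode(v_in, gamma=2.2):
    """Encoding is the inverse operation: raising to the power 1/gamma."""
    return v_in ** (1.0 / gamma)

# A mid-gray input of 0.5 decodes to a much darker physical value:
print(gamma_decode(0.5))                  # ~0.218
print(gamma_encode(gamma_decode(0.5)))    # back to 0.5
```

Note that encoding and decoding are exact inverses of each other, which is exactly what the rest of this article relies on.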

A question: where is the nonlinearity here? Exponentiation is a function in which the input and the output are related nonlinearly. This is clearly seen on the graph of a power function:

[Figure: graph of the gamma 2.2 decoding power function, showing its nonlinearity]

As can be seen, the graph of the relationship is not a straight line; hence the power function is called nonlinear.

So where did the gamma concept appear from at all?

Historically, the word gamma in recording and display systems was borrowed from photography. In nineteenth-century photography, when studying the properties of photographic film and its development, it was observed that the optical density of the resulting image is NOT proportional to the exposure. In other words, doubling the film's exposure time does not make it twice as dark (or light).

[Figure: the H&D (Hurter-Driffield) curve relating film density to exposure]

The part of the graph describing the power-law dependence of the exposed film's opacity on exposure was called gamma. The more the gamma value differs from 1.0, the more nonlinear the behavior. Thus, gamma in photography describes the nonlinear power-law relationship between the resulting density and the exposure.

Gamma correction as such was first used in imaging systems, namely in television sets based on cathode-ray tube (CRT) technology. The brightness of the light emitted by a CRT and the voltage applied to its electron gun are related nonlinearly, just as film density and exposure are in photography.

The light intensity produced on the CRT screen is proportional to the applied voltage raised to a certain power. In simple terms, applying one unit of voltage to the tube yields one unit of brightness. By linear logic, applying two units of voltage should yield two units of brightness. But it is not quite so: the tube still returns roughly one unit of brightness, only slightly more, say one-and-something units. Only at an even higher voltage does the tube return the full two units of brightness, and so on.

[Figure: the CRT's inherent response darkens a linear input]

This property of the CRT display required correcting the applied voltage. A correction inverse to the CRT's inherent nonlinearity was applied: to get two units of brightness, more than two units of voltage were supplied, compensating the nonlinearity and making the overall system linear.

[Figure: gamma correction applied to compensate the CRT response]

Superimposing these two inverse nonlinearities produced a linear overall response and, as a result, the correct display of all gradations on the screen.
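The cancellation of the two inverse power curves can be sketched like this (Python; the 2.2 exponent is again the usual illustrative value, and the function names are just for this sketch):

```python
GAMMA = 2.2

def crt_display(voltage):
    """CRT-like display: output brightness is voltage raised to gamma."""
    return voltage ** GAMMA

def precorrect(value):
    """Inverse correction applied before the signal reaches the tube."""
    return value ** (1.0 / GAMMA)

# The composition is linear: (v ** (1/g)) ** g == v.
for v in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"{v:.2f} -> {crt_display(precorrect(v)):.4f}")
```

Each printed output equals its input: the pre-correction and the tube's own response cancel exactly.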

An involuntary question arises:

So what is the use of gamma and gamma correction, if CRT screens have long been "out of fashion"?

Oddly enough, gamma is not a relic of the past or some conspiracy :) It is not even a conservative need for compatibility with "old" information formats and outdated types of monitors.

In fact, gamma correction is an extremely useful thing: it lets us have high-quality images within a limited bit space. This is where the concepts of gamma encoding and gamma decoding come in handy. The key fact here is that humans also perceive lightness nonlinearly. We distinguish dark tones better, perceiving them as brighter, in a sense, than they physically are. As a result, a physically linear gradient appears to us brightened, with its center shifted toward the dark side.

[Figure: a physically linear gradient appears lightened to human perception]

While a physically darkened gradient, with its center shifted toward the light side, appears perfectly linear to us.

[Figure: a physically darkened gradient appears linear to human perception]

This very peculiarity creates a problem when transmitting or storing images in a limited information space, such as the raster image files familiar to all of us.

[Figure: banding in the dark range when limited-bit input is stored without correction]

Imagine we need to record the entire brightness range visible to the human eye in a limited number of information cells, spending the same number of cells on the bright region as on the dark one. Since the eye perceives the dark region in more detail and the bright region in less, we end up spending the same amount of cells on the more significant dark region as on the less significant bright one. This is wasteful, and it shows up as a non-smooth, broken gradient in the dark range, while the bright range stores more gradations than the eye can even perceive.

What is the way out of this problem? Of course, the way out is nonlinear recording of the visible range into the information cells: give a bigger share of the cells to the dark region and a smaller share to the bright one. Then the display defects mentioned above disappear. The dark region gets sufficient information space for acceptable rendering of dark shades, while the bright region occupies less space, yet still enough for acceptable rendering of bright tones.

That is why brightness information is much better recorded and transmitted nonlinearly, in accordance with the peculiarities of human perception across different parts of the visible range.

To use the limited information space effectively, we encode the measured physical brightness values so that they fit into that space with equal precision from the point of view of human perception.
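A small sketch of why the encoding helps. Here a deep shadow tone is quantized with only 5 bits (32 levels, deliberately exaggerated to make the effect obvious), first linearly and then through gamma encoding; the 2.2 exponent is the usual example value.

```python
LEVELS = 32   # 5-bit storage, exaggerated for clarity
GAMMA = 2.2

def quantize_linear(v):
    """Store the physical value directly in LEVELS steps."""
    return round(v * (LEVELS - 1)) / (LEVELS - 1)

def quantize_gamma(v):
    """Encode before storing, decode on display."""
    encoded = v ** (1.0 / GAMMA)
    stored = round(encoded * (LEVELS - 1)) / (LEVELS - 1)
    return stored ** GAMMA

dark = 0.01  # a deep shadow tone
print(quantize_linear(dark))   # collapses to 0.0 -- the shadow detail is lost
print(quantize_gamma(dark))    # ~0.011 -- close to the original
```

Linear quantization rounds the shadow all the way down to black, while the gamma-encoded path keeps it within about 10% of the true value: the encoding spends its levels where the eye needs them.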

What is surprising is that the nonlinearity of human brightness perception is almost exactly the inverse of the CRT's brightness response!

[Figure: correctly encoded input yields a good-looking gradient on the output]

The gamma correction needed to compensate the nonlinearity of CRT systems is, at the same time, a gamma encoding of the image data into a perceptual domain!

It is a genuinely remarkable coincidence. Had the history of imaging devices unfolded differently, without the invention of CRT technology, mankind would very likely have had to invent gamma encoding separately.

Perhaps you have a question: if human perception is roughly the inverse of the CRT display, why do any correction at all? The answer to this reasonable question may not be obvious, but it is quite simple: the camera or scanner that captures the original image records brightness perfectly linearly, so with our nonlinear perception we would see its gradients as somewhat too bright. To transmit tones correctly, we need the correction, which at the same time gives us the very useful encoding :)

Although modern monitors do not have the drawbacks of the older CRT technology, they still perform gamma decoding, which lets them display images that were originally saved gamma-encoded for rational use of information space.

In video systems, gamma correction is performed at the input stage, i.e. while shooting, as a rule by the camera itself, so the image passed down the pipeline and displayed is already gamma-corrected.

In computer-generated imagery, the same gamma correction is usually performed after the rendering calculations, at the output stage of the so-called frame buffer.
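The order of operations in rendering can be sketched as follows (Python; the function names are hypothetical and the 2.2 exponent again stands in for the display's decoding curve):

```python
GAMMA = 2.2

def shade(light_a, light_b):
    # Lighting math is done on LINEAR values: physical light adds linearly.
    return min(light_a + light_b, 1.0)

def write_to_framebuffer(linear_value):
    # Gamma encoding happens only at the very end, at the output stage,
    # so the display's decoding brings the image back to linear light.
    return linear_value ** (1.0 / GAMMA)

pixel = shade(0.1, 0.1)              # 0.2 in linear light
stored = write_to_framebuffer(pixel)
print(round(stored, 3))              # ~0.481, the encoded value sent to the display
```

The point of the sketch is the ordering: encoding rendered values too early would make the linear lighting arithmetic incorrect.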

Understanding gamma is very important in today's color transmission chains and color display systems. Anyone who deals with color, for example in computer graphics or printing, will surely raise their professional level armed with this knowledge.

Despite its simplicity, many think of gamma correction theory as something incredibly complex and incomprehensible, but it is not so. Now you know it :)