Color is a beautiful thing. An attraction to color seems to be programmed into us. Every year, people travel significant distances to see the red rock canyons of Utah. In the autumn, large numbers of people travel to see the stunning fall colors on the east coast of the US or in the mountains of the western US states. Along the coast highway in California, people pull over and park their cars whenever a beautiful sunset starts to develop. We photographers spend a good deal of our time concentrating on color: an orange poppy, a yellow butterfly, a lavender field of spring flowers, or a blue and green peacock.
In fact, color seems to be hard-wired into not only our perceptual system but our emotional system as well -- cool colors have a calming effect (a deep blue lake) while warm colors have a stimulating effect (the flashing red light on top of an ambulance). Because of this, color becomes a powerful tool in the hands of a skilled photographer. It provides an entryway to people's emotions. By properly utilizing color, a photographer can create an emotional reaction in the people who view his images. Thus, photographers who understand how to use color produce emotional experiences for their viewers, while those who lack this understanding produce snapshots!
The problem is that, while color is such a powerful tool for photographers, it is far from simple. The perception of color, and of the intertwined luminosity, is an intricate process that involves light, objects, the human sensory system, and the human brain. A light source produces the colors of light through molecular processes (either by heat or by the transfer of electrons between molecular orbitals). The object illuminated by this light alters the color of the light according to the physical properties of the object.
The eye of an observer gathers the light through four different types of cells in the retina at the back of the eye. Each of these four types of cells has its own sensitivity to light, and these sensitivities make the human eye more responsive to certain colors than others. This alters what the human observer sees. The sensor in a camera, on the other hand, gathers light through pixels. Most cameras have three types of pixels (actually, the pixels themselves are identical, but a filter in front of each pixel makes them behave differently). Each of the three types of pixels is sensitive to a different color of light. Unfortunately, the light sensitivity of these pixels differs from the sensitivities of the cells in the human eye -- creating a different response to color.
A monitor, which a photographer uses to view an image while editing, creates color either by exciting phosphors (generally three different phosphors) on the screen that then emit their own light, or by sending light through colored filters. The light emitted by the phosphors or passed through the filters has its own characteristics, and these do not match the characteristics of the light source, the object, the human eye, or the camera. Lastly, a printer creates the print via some physical process; in the case of inkjet printers, a number of dye or pigment inks are used. These dyes or pigments have their own color characteristics, and, as you might suspect, they do not match the color characteristics of any of the other components in the system.
Those who understand how humans perceive color and luminosity can use that knowledge to create dramatic images. Of course, even those with little understanding of the photographic process can get lucky from time to time. I once knew a guy who captured a gorgeous sunset over the ocean. He knew nothing about color perception, composition, or the dynamic range challenges that come into play when photographing sunsets. What he knew was how to press the shutter button. In the end, it comes down to whether we want to depend on skill or luck in creating memorable images. Those who understand the process of color and luminosity perception are in a better position to employ it to create great photography than those who do not.
The objective of this article series is to cover color and luminosity perception so that you can put this knowledge to work when creating images. Now, before we start, let me give you a warning. This article series is not for the timid. Color and luminosity perception is an intricate process, and to understand it, we must delve into those intricacies. Thus, this series is not for those who want a sixty-second explanation of color.
Everything has to start somewhere. The odyssey into the realm of color and luminosity perception starts with light.
In order to understand light and its place in color and luminosity perception, it is necessary to understand the nature of light. From a physics point of view, light is simply one form of electromagnetic radiation. Of course, that definition simply prompts the question, "What is electromagnetic radiation?" From a simple point of view, electromagnetic radiation is just a form of energy that can travel across space. You are actually very familiar with electromagnetic radiation. The radio waves that your stereo picks up are one type of electromagnetic radiation. The TV signal that brings your favorite shows is another type. The waves that heat up your lunch in the microwave are yet another. That red beam of light from the laser pointer you used in your last business presentation was also electromagnetic radiation. Lastly, the X-ray images your dentist took at your last check-up were produced by electromagnetic radiation. In short, our modern-day life is inundated with different forms of electromagnetic radiation.
So, light is just one form of electromagnetic radiation. This can be seen in Figure 1 which shows the electromagnetic spectrum. This spectrum shows that light is just one among many types of electromagnetic radiation. Of course, to us humans, light holds a very special and important significance. I might not be too concerned if someone told me that the amount of gamma rays had decreased. On the other hand, if I woke up to discover that the light from the sun had disappeared and the world had been plunged into darkness, I would be a tad upset. So, what is it that differentiates light from the other forms of electromagnetic radiation? For that answer, we need to delve a bit deeper into the nature of light.
Actually, figuring out the nature of light was not an easy task. In fact, the riddle of the nature of light was studied and debated for many hundreds of years. In the later years of this debate, the issue revolved around whether light was some kind of wave or some type of particle. The problem was that, in some ways, light behaved like a wave; in other ways, it behaved like a particle. In the 1800s, James Clerk Maxwell showed that light is a form of radiation with two components: an electrical field and a magnetic field. These two fields vibrate in space much like a wave in the ocean. Thus, Maxwell showed that light is a wave (an electromagnetic wave).
This was a very important discovery. Of course, by now, some of you are thinking, "Enough of this physics. This has nothing to do with photography." Actually, it is of extreme importance to photographers because of the concept of wavelength -- for it is wavelength that determines the colors that your eyes and a camera see.
Think of a wave in the ocean: the wavelength is simply the distance between two adjacent crests. Wavelength means the same thing when applied to electromagnetic waves; the only difference is that the waves are made of electromagnetic fields instead of water. This is shown in Figure 3, which illustrates that, for an electromagnetic wave, the wavelength is the distance between two adjacent crests of the wave.
We can now answer the question, "What is it that differentiates light from the other forms of electromagnetic radiation?" The answer is the wavelength. Light is simply electromagnetic radiation that has a wavelength that our eyes can detect.
Our eyes are not sensitive to all wavelengths of electromagnetic radiation. In fact, our eyes are only sensitive to a very small band of wavelengths: from about 380 nm to 700 nm. All other wavelengths are invisible to humans. A glance back at Figure 1 shows that the portion of the electromagnetic spectrum that we can detect with our eyes is very small. At this point, it is important to note that there is no qualitative difference between light and the other electromagnetic radiation shown in Figure 1 except the wavelength. The only thing that sets light apart is that the light-sensitive cells at the back of our eyes have pigments that can sense wavelengths between 380 nm and 700 nm (this will be covered in more detail in a later article in this series). In fact, certain animals can detect wavelengths that we cannot. For instance, some snakes can detect infrared electromagnetic radiation that we cannot see. That is why a venomous snake may strike a person at night even though it was too dark for the person to see the snake -- the snake detected the person by infrared radiation.
Within the range of wavelengths that the human eye can detect, the eye and brain associate different colors with different wavelengths. This is illustrated in Figure 4, which shows the spectrum of visible light. The spectrum starts on the left with a wavelength of about 380 nm -- about the shortest wavelength that the eye can see -- which the eye sees as violet. As we move across the spectrum to the right, the wavelengths become longer and the colors that the eye sees change. At a longer wavelength of around 475 nm, the eye sees blue. At about 570 nm, the eye sees yellow. At around 650 nm, red is seen. Thus, wavelength and color are inseparable -- the eye determines color by the wavelength of the light.
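The wavelength-to-color associations above can be sketched in code. This is a minimal illustration, not colorimetry: the band boundaries below are approximate, made-up round numbers, and real perceived color varies with the observer and viewing conditions.

```python
# Approximate mapping from wavelength (nm) to the color name a typical
# observer would report. The band boundaries are illustrative guesses,
# chosen to match the landmarks in the text (380 violet, 475 blue,
# 570 yellow, 650 red), not measured values.

def wavelength_to_color(nm):
    """Return an approximate color name for a wavelength given in nm."""
    if nm < 380 or nm > 700:
        return "invisible"  # outside the human visible range
    bands = [
        (450, "violet"),
        (495, "blue"),
        (565, "green"),
        (590, "yellow"),
        (620, "orange"),
        (700, "red"),
    ]
    for upper_bound, name in bands:
        if nm <= upper_bound:
            return name

print(wavelength_to_color(380))  # violet
print(wavelength_to_color(475))  # blue
print(wavelength_to_color(570))  # yellow
print(wavelength_to_color(650))  # red
print(wavelength_to_color(900))  # invisible (infrared)
```

Note that the function treats color as a hard lookup; in reality the hues blend continuously from one to the next across the spectrum.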
When working with light, it is often important to be able to show the colors (wavelengths) emitted by a light source. This can be done with a spectral curve. A spectral curve shows the amount of each color (i.e., each wavelength) that the light contains.
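A spectral curve is just intensity as a function of wavelength, so it can be represented numerically as a set of samples. As a sketch, the snippet below uses Planck's law to compute the relative spectral curve of a thermal light source; the 3200 K temperature is simply an illustrative choice roughly in the range of tungsten lighting.

```python
import math

# Physical constants (SI units).
H = 6.626e-34   # Planck constant (J*s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck(wavelength_nm, temp_k):
    """Spectral radiance of a thermal (blackbody) source, arbitrary units."""
    lam = wavelength_nm * 1e-9  # convert nm to meters
    return (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * temp_k)) - 1)

# Sample the spectral curve every 40 nm across the visible band.
curve = {nm: planck(nm, 3200) for nm in range(380, 701, 40)}

# A 3200 K source emits far more at the red end (700 nm) than at the
# blue end (380 nm), which is why such light has a warm color cast.
print(curve[700] / curve[380] > 1)  # True
```

Plotting these samples against wavelength would reproduce the kind of spectral curve shown in the figures.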
One thing that should be noted is that the lights represented by Figures 6 and 7 have heavy color casts. The only light that is perfectly neutral in tone is white light. Figure 8 shows how the spectral curve of a perfect white light would appear. This light has equal amounts of all of the wavelengths, so its spectral curve is a flat, horizontal line. However, in nature, it is unlikely that you will ever run into a perfect white light. Instead, you'll more likely be working with light like the daylight shown in Figure 9. This daylight clearly contains many wavelengths, but not in equal amounts, so it will not be perfectly neutral in tone.
While Maxwell showed that light was composed of electromagnetic waves, the particle-like behavior of light continued to puzzle scientists for many years. It took the development of quantum physics to solve the riddle. Quantum physics revealed that light is actually composed of incredibly tiny packets of energy. These packets are called photons (some people visualize photons as raindrops of light). Each photon has a wavelength (and thus a color). Therefore, what appears to be a continuous beam of light is actually composed of an unbelievable number of tiny photons.
Now, photons may seem like something of far more importance to a physicist than a photographer, but nothing could be further from the truth. Photons are very important to photographers because they determine the other primary characteristic of light: intensity. The more photons there are, the more intense the light will be. Thus, as the number of photons hitting our eyes increases, the light is perceived to become more intense. Also, the more photons that hit the sensor of a digital camera, the more exposure the sensor receives. In a sense, the exposure of a digital camera is determined by counting the number of photons that hit each pixel. Actually, the camera doesn't directly count the photons. Rather, it measures an electrical charge that is produced when the photons cause a current to flow in the sensor. Once the electrical charge has been determined, the amount of light (i.e., photons) that hit the sensor can be calculated.
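The photon-counting idea can be sketched as a toy model of a single sensor pixel. To be clear, every number below (quantum efficiency, well capacity, bit depth) is invented for illustration; real sensors publish their own figures, and real readout includes noise sources ignored here.

```python
import random

# Hypothetical sensor parameters -- illustrative values only.
QUANTUM_EFFICIENCY = 0.5   # fraction of photons that free an electron
FULL_WELL = 30_000         # max electrons one pixel can hold
GAIN = FULL_WELL / 4095    # electrons per digital count (12-bit output)

_rng = random.Random(0)    # seeded for repeatability

def pixel_value(photon_count):
    """Convert an incoming photon count into a 12-bit pixel value."""
    # Each photon independently has a QUANTUM_EFFICIENCY chance of
    # being converted into an electron of stored charge.
    electrons = sum(1 for _ in range(photon_count)
                    if _rng.random() < QUANTUM_EFFICIENCY)
    electrons = min(electrons, FULL_WELL)  # the well saturates (clipping)
    # The accumulated charge is read out and digitized.
    return round(electrons / GAIN)

# More photons -> more charge -> a higher (brighter) pixel value.
print(pixel_value(1_000) < pixel_value(10_000))  # True
```

The clipping step also hints at why highlights blow out: once the well is full, additional photons add nothing.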
So, scientists have provided photographers with two very important pieces of information: the wavelength of light determines its color, and the number of photons determines its intensity.
And you thought physics was just for eggheads!
This is a very important concept for photographers because the colors that we capture in our images are heavily dependent on the colors (i.e., the spectral curve) of the light that illuminates the objects that we photograph. We have all seen a red apple. The apple is red partly because the light that illuminates it contains red wavelengths (it also depends on the physical properties of the apple). If the light does not contain red wavelengths, the apple will not appear red. Sunsets are another example. Sunsets are beautiful because the light contains primarily longer wavelengths (i.e., the warmer colors). If a photographer wants to maximize the use of light, he must become aware of the colors that the light contains -- in other words, the spectral curve of the light.
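The apple example can be made concrete: the light reflected from an object is, wavelength by wavelength, the product of the illuminant's spectral curve and the object's reflectance. The sample values below are invented for illustration, not measured spectra.

```python
# Relative illuminant power at a few visible wavelengths (nm).
# These spectra are made-up illustrations of the two situations.
warm_light = {450: 0.2, 550: 0.6, 650: 1.0}   # sunset-like: strong in red
blue_light = {450: 1.0, 550: 0.4, 650: 0.05}  # deep shade: almost no red

# A red apple reflects mostly long (red) wavelengths.
red_apple = {450: 0.05, 550: 0.1, 650: 0.9}

def reflected(illuminant, reflectance):
    """Reflected spectrum: pointwise product of light and surface."""
    return {nm: illuminant[nm] * reflectance[nm] for nm in illuminant}

# Under warm light, the 650 nm (red) component dominates the result.
print(reflected(warm_light, red_apple))
# Under blue-heavy light there is almost no red light left to reflect,
# so the apple cannot appear strongly red.
print(reflected(blue_light, red_apple))
```

This is exactly why knowing the spectral curve of the working light matters: the same apple yields very different reflected spectra under different illuminants.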