A common question I get is about lighting and why one light is more desirable than another, so I thought I would take this opportunity to answer that question in more depth.
The first thing to address is the characteristics of certain lights. Let's start with the classic orange, finger-scorching tungsten incandescent bulb. This is essentially a fancy heating element that happens to put out *some* light. I stole a picture of an incandescent bulb spectrum from here. For those less familiar with wavelength versus color: the 400-500nm range is generally violet-blue, the 550-650nm range is generally green-yellow-ish, and the 650nm-and-above range is generally orange-red-ish. So looking at the graph below, violet is on the far left, green is in the middle, and red is on the far right. As you can see, a tungsten bulb is generally missing blue and is lacking in green, leaving more yellow-red in the spectrum. Typically a tungsten video light is rated at 3200K, and a household bulb might be 2700K - lacking even more blue and green.
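Since an incandescent filament behaves roughly like an ideal blackbody, you can get a feel for that blue deficit from Planck's law. Here's a minimal Python sketch (the temperatures and sample wavelengths are just illustrative picks, not measurements of any particular bulb) comparing the blue-to-red balance of 2700K and 3200K filaments against 5600K daylight:

```python
import math

# Planck's law: spectral radiance of a blackbody at wavelength lam_nm
# (nanometers) and temperature temp_k (kelvin). An incandescent filament
# is close to an ideal blackbody, so this approximates a tungsten bulb.
H = 6.626e-34   # Planck constant (J*s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck(lam_nm, temp_k):
    lam = lam_nm * 1e-9  # convert to meters
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * KB * temp_k))

# Ratio of blue (450nm) to red (650nm) output - higher means bluer light.
for temp in (2700, 3200, 5600):
    ratio = planck(450, temp) / planck(650, temp)
    print(f"{temp}K  blue/red ratio: {ratio:.2f}")
```

The ratio climbs steadily with temperature: the 2700K household bulb puts out only a small fraction as much blue as red, the 3200K video light is a bit better, and 5600K daylight is close to even.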
Next, we can explore a fluorescent bulb. These bulbs come in a variety of color temperatures; however, for video we generally stick with a color temperature of 5500K for reasons I'll get to later. For this comparison I stole a picture from Kino-Flo here. A fluorescent bulb generally puts out UV light that is then converted to a wider spectrum by a phosphorescent coating on the glass bulb. In the spectrum charts below you can see the UV spike down around 420nm. Anyone who's been to a science museum is familiar with the green glow of phosphors, thus the green spike around 550nm. Those characteristics are fairly independent of the rated color temperature of the bulb. Manufacturers just use different amounts of phosphorescent substances to change the perceived blue-orange ratio of the bulb and get somewhat close to a color temperature rating. A lot of the time I put a gel over fluorescent fixtures to try to calm the green spike a little.
Obviously the hottest trend on the market is LED lighting. LEDs aren't quite the panacea that everyone thinks they are, but they do offer quite a few advantages in power, portability, and general usage. Every "white" LED is actually a blue LED with an optical coating that broadens the spectrum to appear "white," just as a fluorescent bulb does. The result is that you see a blue spike in the spectrum, with a lump across the rest of the spectrum where the phosphorescent material emits light. I stole a picture from the Philips Lumileds site here. So, just like a fluorescent bulb, an LED can be made to appear to be just about any color temperature by changing the coating.
The real question is, why does this matter? Well... our cameras are designed for broad-spectrum daylight. Typically the "native" color temperatures of CMOS sensors are in the 5000K-5600K range, making daylight-range sources the more desirable choice for those sensors.
First, I think I need to define what I mean by "native." The color filters on the image sensor itself have a certain balance of red/green/blue. If, for instance, the native color temperature of the sensor is 5200K and you have a light source that is perfectly 5200K, the white balance of the sensor will be dead-on. There won't be a need to adjust the ratio of red to green to blue because they are already balanced by the sensor's color filters. If you use the same sensor under a tungsten bulb (3200K), then the camera's processing circuitry has to greatly boost the blue channel, creating more noise in the blue channel.
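To make that concrete, here's a toy Python sketch of what the camera's processing has to do. The raw channel readings are hypothetical numbers I made up for illustration, not measurements from any real sensor:

```python
# A gray card lit at the sensor's native color temperature reads roughly
# equal in all three channels. Under 3200K tungsten, the blue channel on a
# daylight-native sensor comes up short -- these values are invented.
raw_tungsten = {"R": 1.30, "G": 1.00, "B": 0.55}

# White balancing scales each channel so the gray card reads neutral
# again (gains normalized to the green channel here).
gains = {ch: raw_tungsten["G"] / val for ch, val in raw_tungsten.items()}
print(gains)

# Read noise is baked into the raw signal, so it gets multiplied by the
# same gain: boosting blue ~1.8x also boosts blue-channel noise ~1.8x.
print(f"blue noise amplification: {gains['B']:.2f}x")
```

That multiplied-up noise in the blue channel is exactly why tungsten footage from a daylight-native sensor tends to look noisier in the shadows than the same scene lit at the sensor's native color temperature.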
From reports I've read, the "native" color temperature of the Canon 5D Mark II sensor is about 5200K. The Arri site lists the native color temperature of the Alexa CMOS sensor at 5600K. Jim Jannard rates the native color temperature of the Red MX sensor at 5000K. There's nothing inherently wrong with any of these ratings. They are what they are. I used a mix of tungsten, fluorescent, and LED this last weekend on a shoot with the 5D Mark II, and I think it came out fine at a low ISO setting.
The other reason the spectrum of these lights matters is that it effectively changes the color rendering of the camera, even when custom white balanced to the particular light source. This is especially true with a very "spiky" spectrum such as a fluorescent bulb or LED. Maybe I'll save that for a future blog entry when I'm feeling techie again. For now I'll just point to the Wikipedia entry on Color Rendering Index (CRI). This is where a color test chart, such as the DSC Labs charts, comes into play.