A few months ago, I attended a meeting of the Northern California Chapter of the Digital Cinema Society -- a gathering of professional cinematographers discussing the latest and greatest technology and tools available. The featured presentations were mostly about high-definition video production, and it was fun. There were several interesting video clips shown, and a lot of discussion about what it took to make them. In one of the behind-the-scenes shots, the subject of filtering infrared came up, and it struck me that there's no reason for that to be just a movie problem -- it also applies to still photography.
In a nutshell, the problem is this. The sensor technology inside all of our digital cameras is intrinsically sensitive to light we can't see -- especially the light that's redder than red, or infrared. Of course, manufacturers put filters in to make the sensors see things the way we do, but they can't get rid of all of the infrared sensitivity. The result is that our cameras still sense infrared, and what's worse, the red, green, and blue sensors often all sense some of it. That means a light source that's very strong in infrared (as most tungsten lights are) can lead to very muddy images like the one shown on the right here.
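If you want an intuition for why uniform IR leakage looks muddy rather than simply red, here's a toy numerical sketch (my own illustration, not Ricardo's method, and the leakage fractions are invented): adding roughly the same IR-driven signal to all three channels lifts the dark values and pulls every color toward gray.

```python
import numpy as np

# Toy model: a saturated red patch, linear values in 0..1, no IR leakage.
patch = np.array([0.60, 0.15, 0.10])  # R, G, B

# Hypothetical near-IR contamination under a tungsten light.
# These leakage fractions are made up purely for illustration.
ir_signal = 0.35
leak = np.array([0.30, 0.25, 0.25])   # fraction of IR reaching R, G, B

contaminated = patch + ir_signal * leak

def saturation(rgb):
    """Crude saturation measure: (max - min) / max."""
    return (rgb.max() - rgb.min()) / rgb.max()

print("clean       :", patch,        "saturation =", round(saturation(patch), 2))
print("with IR leak:", contaminated, "saturation =", round(saturation(contaminated), 2))
# The contaminated patch comes out brighter but far less saturated --
# that's the "muddy" look.
```

Because the extra signal lands on all three channels at once, white balance alone can't undo it; the contrast between colors is genuinely reduced.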
I wanted to learn more about this, so I pinged my most knowledgeable color-science friend, Ricardo Motta. Ricardo is the Chief Technology Officer and Vice President, Imaging Systems at Pixim. I asked Ricardo if he knew how to test a camera's IR sensitivity, and then how to filter, if needed, to improve its rejection of unwanted infrared. Apparently, I picked the right guy, because he immediately offered to show me how he does both of these things.
The good news is that Ricardo knows how to test cameras. The bad news is that it takes a lot of well-thought-out, high-end electro-optical equipment and precise technique to do it well. His setup at Pixim is very impressive, and possibly the best in Silicon Valley. Unfortunately, it doesn't do much for the average photographer. If you want a quick check, though, here's a tip. Grab your digital camera and one of your consumer-electronics remote controls. Point the remote control at the camera and press one of its buttons. If your camera still senses some residual infrared, you'll see it on the camera's preview or playback display. Here are some pictures with the remote control on and off.
So, you do the test and find out your camera "sees" IR just like the rest of ours -- is it the end of the world? In a word, no. First of all, remote controls are fairly bright IR sources, and most of the time there's not that much IR to filter. Second, if you start noticing really muddy shots and you suspect there's a lot of IR in the light, you can pick up an IR filter to go over your lens. According to Ricardo, the best is a blue-glass BG39. I won't go into the details here, but the short summary is that it rejects IR better than the other choices, and even though it's not as color-neutral, when combined with the in-camera white balance it delivers more vivid colors.
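To see why the in-camera white balance can rescue a bluish filter's cast, here's a small sketch of the usual idea: shoot something neutral through the filter, then scale the channels so it reads as gray again. The per-channel transmission numbers below are invented for illustration, not real BG39 data.

```python
import numpy as np

# Hypothetical per-channel transmission of a bluish IR-cut filter
# (made-up numbers -- a real BG39 curve is wavelength-dependent).
filter_gain = np.array([0.75, 0.90, 0.95])   # R, G, B

# A neutral gray card as the camera sees it through the filter.
gray_card = 0.50 * filter_gain

# White balance: scale each channel so the gray card comes back to neutral.
wb_gains = gray_card.max() / gray_card
print("white-balance gains:", wb_gains)

# Any scene color shot through the filter gets the same correction.
scene = np.array([0.40, 0.30, 0.20]) * filter_gain
corrected = scene * wb_gains
print("through filter:", scene, "-> after WB:", corrected)
```

The filter costs a little light overall, but since the cast is the same in every shot, a single white-balance correction restores neutral colors while the unwanted IR stays filtered out.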
Finally, if you want to see things as really "glass-half-full," you can take advantage of your camera's IR sensitivity by putting red or infrared-pass filters over your lens and experimenting with infrared photography. Be ready to put the camera on a tripod, though, because you will have lost a lot of sensitivity by the time you filter out most of the visible light. It's a fun thing to play with, and you can get some really pretty results like this shot that I took last month in Zion National Park. Needless to say, the sky wasn't really black, and the rocks weren't really white, but when you crank up the red sensitivity, the blue sky goes dark, and the red rocks go white.
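If you'd like a feel for that tonal flip before buying a filter, you can fake it in software by building a monochrome image weighted almost entirely toward the red channel. This is only a rough simulation of the real IR-pass look; the Pillow calls below are standard, but the filename is a placeholder and the channel weights are my own guess.

```python
from PIL import Image
import numpy as np

# Rough "faux infrared" conversion: weight the mix heavily toward red,
# so blue skies go dark and warm, red subjects go bright.
# Weights are arbitrary -- tweak to taste.
weights = np.array([0.9, 0.2, -0.1])

# "landscape.jpg" is a placeholder for whatever image you want to try.
img = np.asarray(Image.open("landscape.jpg").convert("RGB"), dtype=float) / 255.0
mono = np.clip(img @ weights, 0.0, 1.0)

Image.fromarray((mono * 255).astype(np.uint8), mode="L").save("faux_ir.png")
```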
Thursday, November 20, 2008