Thursday, January 21, 2010

Fuzzy Filters

Much is made of image quality when no low pass filter is used in front of a digital sensor, as with Leica, medium format backs, and Foveon sensors. It has always been assumed that once pixel counts got high enough, the fuzzy filter, which blurs the image so that patterns like cloth don't create moiré in the Bayer interpretation of the sensor data, would no longer be necessary. Despite this, the Nikon D3X has one at 24.5 megapixels and so does the Canon 5D2 at 21.1 megapixels, so it isn't clear when we are going to be able to drop it. People talk about better software solutions for removing moiré, but the reality is that it is darn hard to tell what is a genuine repetitive pattern in the scene and what is an artificial effect of the interpolation.
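To see why the fuzzy filter is there at all, here is a toy sketch in Python. It is nothing like a real camera pipeline, just one-dimensional stripes and a box blur standing in for the optical filter, but it shows how point-sampling fine detail on a regular grid invents a coarse false pattern, while blurring first does not:

import numpy as np

def make_stripes(size=1024, period=7):
    # Fine vertical stripes, finer than the coarse sampling grid can resolve.
    x = np.arange(size)
    return np.tile(0.5 + 0.5 * np.sin(2 * np.pi * x / period), (size, 1))

def sample_every(img, step):
    # Point-sample on a regular grid: no low pass filter at all.
    return img[::step, ::step]

def blur_then_sample(img, step):
    # Average over each sample's footprint first: a crude low pass filter.
    h, w = img.shape
    h, w = h - h % step, w - w % step
    return img[:h, :w].reshape(h // step, step, w // step, step).mean(axis=(1, 3))

scene = make_stripes()
aliased = sample_every(scene, 8)       # shows a bogus coarse banding (moiré)
filtered = blur_then_sample(scene, 8)  # the fine stripes average out to grey
print(aliased.std(), filtered.std())   # the unfiltered version varies far more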

In reality, the problem isn't in the math, it is in the regular rows and columns of colour filters that sit in front of the sensor. The printing industry found that stochastic printing, in which the dots that make up halftone images are scattered irregularly rather than laid out on a fixed grid, was the way to produce a jump in print quality. If we could somehow produce a colour filter array in which the red, blue and green filters for each pixel were placed randomly, or at least in a pattern that appears random but is known, then we might see the end of the fuzzy filter.
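By "random but known" I mean something like this rough sketch: the filter colours are drawn from a pseudo-random generator with a fixed seed, so the layout looks irregular to the scene, yet a raw converter that knows the seed can reproduce it exactly. The half green, quarter red, quarter blue proportions simply copy the usual Bayer mix; a real design would also have to avoid large single-colour clumps.

import numpy as np

def random_cfa(height, width, seed=2010):
    rng = np.random.default_rng(seed)   # same seed -> exactly the same mosaic
    return rng.choice(np.array(['R', 'G', 'B']), size=(height, width),
                      p=[0.25, 0.5, 0.25])

mosaic = random_cfa(8, 8)
print(mosaic)                                    # looks irregular
print(np.array_equal(mosaic, random_cfa(8, 8)))  # True: random yet known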

Myself, I think the easier solution would be to make the RGB filter removable. What about creating it with an LCD screen in front of the sensor? You could have your choice of three-shot green, then red, then blue exposures for full colour at every pixel, or no colour at all so you have a black and white sensor, or a random but known pattern, as needed. I wonder if that is possible. After all, the 7D has a transmissive LCD in the viewfinder for displaying focus points and leveling information.
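The three-shot option would work something like this sketch, where the capture function stands in for the imaginary LCD-filtered sensor: each exposure is a full-resolution monochrome frame, and stacking three of them gives true colour at every pixel with no Bayer interpolation at all.

import numpy as np

def three_shot_colour(capture):
    # capture(filter_colour) returns one full-resolution monochrome frame.
    r = capture('red')
    g = capture('green')
    b = capture('blue')
    return np.stack([r, g, b], axis=-1)   # H x W x 3, no demosaicing needed

# Stand-in for the hypothetical LCD-filtered sensor: flat test frames only.
def fake_capture(colour):
    return np.full((4, 6), {'red': 0.8, 'green': 0.5, 'blue': 0.2}[colour])

print(three_shot_colour(fake_capture).shape)   # (4, 6, 3): colour at every pixel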

Who knows where this will lead.

5 comments:

yz said...

Well, actually the idea of using red, green and blue filters on three black and white shots to make a colour image is older than colour film itself; it was originally done by a Russian photographer called Prokudin-Gorskii about a hundred years ago:
http://www.gridenko.com/pg/index.htm

Ryan Richardson said...

Not only could you vary the filter color of each pixel, you could have variable neutral density for each pixel, making possible both long exposures and compressed contrast for adaptable dynamic range. How about it, science?
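Something like this toy sketch of the per-pixel neutral density idea (all the numbers and the clip level are invented): darken each pixel by a known amount so highlights fit under the sensor's clipping point, then undo the known attenuation afterwards.

import numpy as np

def adaptive_nd_expose(scene, clip=1.0):
    # Per-pixel ND strong enough that nothing exceeds the clip level...
    attenuation = np.minimum(1.0, clip / np.maximum(scene, 1e-6))
    recorded = np.minimum(scene * attenuation, clip)   # nothing clips now
    # ...and because the attenuation is known, it can be undone afterwards.
    return recorded / attenuation

scene = np.array([0.2, 0.9, 4.0, 12.0])   # the last two would clip a plain sensor
print(adaptive_nd_expose(scene))          # [ 0.2  0.9  4.  12. ]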

George Barr said...

Wow Ryan, that's a great point. How about adjusting the colour of the light before the sensor records it? For example, if the light is really blue, the warm-colour channels record only a small part of the sensor's range, so the raw processor has to amplify that weak signal heavily when it balances the colour and the noise goes up with it. Correcting the colour in front of the sensor would give you a good balance before processing and minimum noise.
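A toy numerical illustration of what I mean (all the numbers are invented): under very blue light the red channel records a weak signal plus read noise, and white balancing after capture multiplies both, whereas balancing the light before the sensor leaves the noise alone.

import numpy as np

rng = np.random.default_rng(0)
true_red = 0.8                              # what a neutral scene should record
read_noise = rng.normal(0, 0.01, 10000)

# Blue light: the red channel gets a quarter of the signal, then is gained 4x.
balanced_after = (true_red / 4 + read_noise) * 4
# Colour corrected in front of the sensor: full signal, no extra gain.
balanced_before = true_red + read_noise

print(balanced_after.std())    # about 0.04 - the noise got amplified too
print(balanced_before.std())   # about 0.01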

George

W. Hutton said...

If it's random, it can't be known.

Spatial aliasing is a fundamental, intrinsic feature of digitization and it affects all analog to digital signal conversion.

You are right, it is not possible to know what is an artifact (moiré) and what is not after the data is recorded.

Anonymous said...

Keep on posting about themes like this. I love to read blogs like this. Just add more pics :)
PatrickJoy