Much is made of the image quality you get when no low-pass filter sits in front of a digital sensor, as in Leica bodies, medium format backs, and Foveon sensors. It has always been assumed that once pixel counts got high enough, the fuzzy filter would no longer be necessary; its job is to blur the image slightly so that fine repetitive patterns like cloth don't create moire when the Bayer algorithm interprets the sensor data. Yet the Nikon D3X has one at 24.5 megapixels, and the Canon 5D Mark II has one at 21.1 megapixels, so it isn't clear when we will be able to drop it. People talk about better software solutions for removing moire, but the reality is that it is darn hard to tell what is a genuinely repetitive pattern in the scene and what is an artificial effect of the interpolation.
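To see why software can't easily untangle this, it helps to remember that moire is just aliasing: a pattern finer than the pixel pitch folds back into a false coarse pattern, and after sampling the two are literally identical. A toy sketch (the frequencies here are illustrative, not tied to any real sensor):

```python
import math

def sample(freq_cycles_per_pixel, n=16):
    """Sample a sinusoidal 'cloth' pattern at unit pixel pitch."""
    return [math.sin(2 * math.pi * freq_cycles_per_pixel * x) for x in range(n)]

# A 0.9 cycles/pixel pattern lies above the 0.5 cycles/pixel Nyquist
# limit, so after sampling it is indistinguishable from a 0.1
# cycles/pixel pattern (sin(2*pi*0.9*x) == -sin(2*pi*0.1*x) at integer x).
fine = sample(0.9)
alias = [-s for s in sample(0.1)]

print(all(abs(a - b) < 1e-9 for a, b in zip(fine, alias)))  # True
```

Once the samples are on the card, no amount of clever math can tell which of the two patterns was actually in front of the lens, which is exactly why the blur filter removes the fine one before sampling.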
In reality, the problem isn't in the math; it is in the regular rows and columns of colour filters that sit in front of the sensor. The printing industry found that stochastic printing, in which the dots that make up halftone images are placed irregularly rather than on a fixed grid, was the way to produce a jump in print quality. If we could somehow produce a colour filter array in which the red, green, and blue filters for each pixel were placed randomly, or at least in a pattern that appears random but is known, then we might see the end of the filters.
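The "random but known" part is the key: the camera has to know exactly which filter sits over which photosite to reconstruct colour. A seeded pseudo-random generator gives you both properties at once. A minimal sketch, assuming Bayer-like colour ratios (the function name and seed are my own illustration, not any real camera's design):

```python
import random

def make_stochastic_cfa(width, height, seed=42):
    """Lay out a colour filter array that looks random but is
    exactly reproducible from the seed, so demosaicing software
    can regenerate the identical layout."""
    rng = random.Random(seed)
    colours = "RGGB"  # 1 red : 2 green : 1 blue, mimicking Bayer's ratios
    return [[rng.choice(colours) for _ in range(width)]
            for _ in range(height)]

cfa = make_stochastic_cfa(8, 8)
# The same seed always regenerates the identical mosaic:
print(cfa == make_stochastic_cfa(8, 8))  # True
```

The irregular placement breaks up the fixed sampling grid, so a repetitive subject can no longer beat against it to produce large-scale colour moire; any residual error shows up as fine noise instead, much as stochastic screening trades visible rosette patterns for grain.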
For my money, the easier solution would be to make the RGB filter removable. What about building it from an LCD panel in front of the sensor? You could have your choice of three-shot green, then red, then blue exposures for full-colour pixels; no colour at all, giving a true black-and-white sensor; or a random-but-known pattern as needed. I wonder if that is possible. After all, the 7D already has a transmissive LCD in the viewfinder for displaying focus points and levelling information.
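The three-shot mode is the simplest case to reason about: every pixel gets a true measurement in each channel, so there is no demosaicing step at all. A sketch of the merge, where the per-channel exposures are hypothetical 2D arrays of raw values:

```python
def combine_three_shot(shot_r, shot_g, shot_b):
    """Merge three monochrome exposures (taken through the LCD set to
    red, green, and blue in turn) into full-colour pixels.
    No interpolation, hence no moire from the colour reconstruction."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(shot_r, shot_g, shot_b)
    ]

# Tiny 2x2 example with made-up raw values:
image = combine_three_shot(
    [[10, 20], [30, 40]],   # red exposure
    [[11, 21], [31, 41]],   # green exposure
    [[12, 22], [32, 42]],   # blue exposure
)
print(image[0][0])  # (10, 11, 12)
```

The obvious catch, as with existing multi-shot medium format backs, is that the subject has to hold still across all three exposures, which is why the other LCD modes would matter for anything that moves.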
Who knows where this will lead.