Hi!

On 1/12/06, M. Adam Davis wrote:
> How the camera captures the image is relevant, but your explanation
> does not prove that one could not obtain sub-pixel resolution from
> multiple shots of the same subject.
>
> I don't see a reason why it's not possible.

Well, the problem is really much more complex, and it is related to filtering.

First, an example. You take a photo of a pattern of white vertical lines on a black background, and each pixel covers exactly one line plus the background around it. Every pixel then reads the same value (half the brightness of a line). If you shift the camera sideways, each pixel still covers exactly one line (one line enters from the left as another leaves at the right), so you always obtain the same image. How could you reconstruct the original pattern?

In detail, you can describe the process like this:

* The original image ("infinite resolution") goes through the lens system. The lens low-pass filters the image, convolving it with the diffraction spot of the lens.

* Then the image is sampled by the sensor, using rectangular pixels. This applies another filter (a convolution with a box filter) and then aliases the remaining high frequencies down into the lower bands.

Now, the aliasing can be reduced by taking new images (of the *same* scene) with the sensor at other (fractional-pixel) positions, and then equalizing the combined image with an optimal inverse filter. The problem is that *some* frequencies are strongly attenuated by the filters (some are even zeroed), so that part of the original information cannot be restored.

To work around this, you can move the camera perpendicular to the image plane, so the sampling frequency changes. But then the reconstruction process can be *very* difficult, and it only works if the scene is very far away, so that moving the camera doesn't change the scene being imaged.

Daniel.
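The line-pattern example above can be simulated numerically. This is a small sketch (my own construction, not from the original post): a scene with one white line per pixel pitch is box-averaged by the sensor at every possible sub-pixel offset, and every offset produces the identical image, so the shots carry no extra information.

```python
import numpy as np

P = 8               # high-res samples per sensor pixel (also the line period)
n_pixels = 16
# white line 2 samples wide, repeated once per pixel pitch, with some margin
scene = np.tile(np.r_[np.ones(2), np.zeros(P - 2)], n_pixels + 2)

def sample(scene, offset):
    # each sensor pixel box-averages exactly one period of the pattern
    return scene[offset:offset + n_pixels * P].reshape(n_pixels, P).mean(axis=1)

# one exposure per fractional-pixel shift
images = [sample(scene, off) for off in range(P)]

# integrating a periodic pattern over one full period is phase-independent,
# so every shifted shot is the same flat image (value = line duty cycle)
for img in images:
    assert np.allclose(img, images[0])
print(images[0][:4])
```

The key point is that each pixel integrates exactly one full period of the pattern, and the integral of a periodic function over one period does not depend on the phase.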
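The "zeroed frequencies" mentioned above come from the box filter of the pixel aperture: its frequency response is a sinc, which has exact nulls. A quick numerical check (again my own sketch, not from the post) shows that a sinusoid whose period equals the pixel width is annihilated by the box filter, so no inverse filter can ever bring it back:

```python
import numpy as np

P = 8
x = np.arange(1024)
sig = np.cos(2 * np.pi * x / P)     # frequency exactly at a null of the sinc
box = np.ones(P) / P                # pixel aperture (box) filter
filtered = np.convolve(sig, box, mode='valid')

# the sum of P consecutive samples of a period-P sinusoid is exactly zero,
# so the filtered signal vanishes (up to floating-point noise)
print(np.max(np.abs(filtered)))
```

Frequencies near (but not exactly at) a null are merely attenuated; an inverse filter can boost them back, at the cost of amplifying noise.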
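Finally, the multi-shot idea itself can be sketched. In this toy model (my own construction, under the same box-filter assumption), taking one exposure per fractional-pixel shift and interleaving the shots yields the box-filtered scene uniformly sampled at P times the pixel rate; an inverse (sinc-compensating) filter could then equalize it, except at the nulls:

```python
import numpy as np

P = 4
n_pixels = 32
x = np.arange(n_pixels * P + P)
scene = np.sin(2 * np.pi * x / 10.0)   # detail finer than a pixel, not at a null

def sample(scene, offset):
    # box-average blocks of P high-res samples, starting at a sub-pixel offset
    return scene[offset:offset + n_pixels * P].reshape(n_pixels, P).mean(axis=1)

# one exposure per fractional-pixel shift; unlike the line-pattern case,
# these shots now differ, so they carry new information
shots = np.stack([sample(scene, k) for k in range(P)])

# interleave the shots: the result is the box-filtered scene sampled at P
# times the pixel rate, ready for inverse filtering
highres = shots.T.reshape(-1)

# check against the directly box-filtered high-res scene
expected = np.convolve(scene, np.ones(P) / P, mode='valid')
assert np.allclose(highres, expected[:len(highres)])
```

The remaining equalization step (dividing by the box filter's sinc response in the frequency domain) is straightforward away from the nulls, which is exactly where the information loss described above occurs.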
-- http://www.piclist.com PIC/SX FAQ & list archive View/change your membership options at http://mailman.mit.edu/mailman/listinfo/piclist