I am working on an application that uses a handheld Unibrain FireWire camera capturing 30 frames per second. I need to detect the direction and (ideally) the speed with which the user is moving the camera relative to the surface the camera is looking at. Although there are detectable particles on the surface, there is no regular pattern across it. Furthermore, there may be SEVERAL particles that all look very similar within a given frame.
All processing is done in LabVIEW 8.2 (soon to be upgraded to 8.5) with IMAQ, NI Vision, etc.
The additional challenge is that, although the user does their best to hold the camera at a distance of around 0.5 in., there is of course no way to fix that distance, so the image scale (pixels per inch) varies from frame to frame.
Any ideas? I was told that the way to do this is to cross-correlate images from successive frames, but I have been unable to find an appropriate example. My rough attempt at the idea is sketched below.
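For reference, here is a minimal sketch of my current understanding of the cross-correlation approach. It's written in Python/NumPy only because a LabVIEW block diagram doesn't paste well into a post; the function name `estimate_shift`, the epsilon value, and the test sizes are just my own choices, not from any working example. It estimates the per-frame translation by phase correlation, i.e., cross-correlation computed via the FFT:

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the (dy, dx) translation that maps frame_a onto frame_b
    using phase correlation (cross-correlation via the FFT)."""
    fa = np.fft.fft2(frame_a)
    fb = np.fft.fft2(frame_b)
    # Normalized cross-power spectrum; the small epsilon guarding against
    # division by zero is my own addition
    cross = fb * np.conj(fa)
    cross /= np.abs(cross) + 1e-12
    # The inverse FFT of the cross-power spectrum peaks at the shift
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peak indices past the midpoint wrap around to negative shifts
    rows, cols = frame_a.shape
    if dy > rows // 2:
        dy -= rows
    if dx > cols // 2:
        dx -= cols
    return int(dy), int(dx)

# Quick self-test with a synthetic circular shift (real frames won't
# wrap around like this, so this only checks the bookkeeping)
a = np.random.rand(128, 128)
b = np.roll(a, shift=(3, -5), axis=(0, 1))
print(estimate_shift(a, b))  # should print (3, -5)
```

If that is the right idea, then at 30 frames per second the speed in pixels per second would simply be 30 times the per-frame shift, and converting that to real units is exactly where the uncertain 0.5 in. standoff hurts, since the pixels-per-inch scale changes with distance. Is there an IMAQ/NI Vision VI that does the equivalent of the above, and does this approach hold up when the frame contains many similar-looking particles?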
Thanks for your help!