07-02-2010 07:23 AM
Hello all!
As I didn't find a better board, I've posted this signal processing and optimization question to the general LabVIEW board.
I've been performing some experiments related to single-pixel imaging, following the work presented by the Rice Digital Signal Processing group (http://dsp.rice.edu/cscamera). I've used MATLAB together with the l1-magic toolbox (http://www.acm.caltech.edu/l1magic/) to acquire the measurements and to reconstruct my images via l1-norm minimization, and everything went fine.
But as I'm acquiring these measurements with an NI-DAQ, I'd like to take advantage of LabVIEW to perform these same experiments.
I've recently found a compressive sensing demo package (http://decibel.ni.com/content/docs/DOC-3769) implemented in LabVIEW and I've tried to use it to reconstruct images with my single pixel camera.
The problem is that this demo refines its reconstruction iteratively while already knowing the original signal it is trying to recover from the few measurements.
With my setup I don't know a priori the image I'm acquiring, and the problem is underdetermined, since I take fewer measurements than the image's total number of pixels. This is why optimization comes into the game.
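To illustrate the kind of solver being asked about, here is a minimal sketch of sparse recovery from an underdetermined random measurement matrix, using iterative soft-thresholding (ISTA) for the l1-regularized least-squares problem. This is only a toy NumPy illustration of the principle (all names and parameters here are made up for the example), not the l1-magic algorithm itself and not a LabVIEW VI:

```python
import numpy as np

def ista(A, y, lam=0.001, iters=2000):
    """Iterative shrinkage-thresholding for min 0.5*||A x - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L with L = ||A||_2^2 (Lipschitz const.)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - step * A.T @ (A @ x - y)        # gradient step on the quadratic term
        x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft-threshold
    return x

# Toy demo: recover a sparse "image" from fewer random measurements than pixels.
rng = np.random.default_rng(0)
n, m, k = 128, 64, 5                            # pixels, measurements (m < n), nonzeros
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)    # random measurement matrix
y = A @ x_true                                  # the m single-pixel measurements
x_hat = ista(A, y)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Because the true signal is sparse and the measurement matrix is random, the l1 penalty picks out (approximately) the right solution even though the linear system is underdetermined; a LabVIEW implementation would have to express this same iteration with array primitives.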
Has anyone ever implemented a convex optimization algorithm to minimize the l1-norm in LabVIEW? What I need is a LabVIEW VI that performs the same task as the l1-magic toolbox does in MATLAB.
Any help on this would be greatly appreciated.
Thank you very much in advance.
Best regards,
Filipe Magalhães
07-07-2010 09:24 AM
Hello.
I wasn't able to find any VI that does that directly.
But here's a link to the Vision module documentation that may help you find additional information about image processing.
Best regards