01-05-2005 07:49 AM
Thanks for the plug, ien_fr. You may want to check out the updated version of that PDF. Search for “Large Data in LabVIEW” on the main ni.com page, or try this link.
To get back to the subject at hand, neona is correct: it is more difficult to decimate a 2D array. Decimation also complicates zooming, since you need to re-decimate every time the zoom factor changes.
You have a couple of options for decimating a 2D array. First, you can implement a 2D version of the max-min decimation algorithm fairly easily. Your decimation chunks will have both X and Y extents determined by the pixel dimensions of your intensity plot area. I am not sure you will be happy with the results, but it should only take a couple of hours to try. Go to the link above and modify the decimation routine to use 2D arrays instead of 1D. You will need to replace the current single loop structure used for 1D with a double loop structure.
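To make the 2D version of the algorithm concrete, here is a minimal sketch in Python/NumPy (not LabVIEW code; the function name, the max-only reduction, and the evenly-divisible trim are my assumptions, and a full max-min version would keep both extremes per chunk):

```python
import numpy as np

def decimate_2d_max(data, out_rows, out_cols):
    """Block-wise max decimation of a 2D array.

    Each output cell is the maximum of one rectangular chunk of the
    input, so narrow peaks survive the size reduction. out_rows and
    out_cols would come from the pixel dimensions of the intensity
    graph's plot area.
    """
    rows, cols = data.shape
    r_step = rows // out_rows
    c_step = cols // out_cols
    # Trim so the array divides evenly into out_rows x out_cols chunks.
    trimmed = data[:r_step * out_rows, :c_step * out_cols]
    # Reshape to (out_rows, r_step, out_cols, c_step) and reduce the two
    # chunk axes -- this is the double loop over X and Y, vectorized.
    blocks = trimmed.reshape(out_rows, r_step, out_cols, c_step)
    return blocks.max(axis=(1, 3))
```

In LabVIEW the same reduction would be two stacked For Loops with an Array Max & Min inside the inner loop, one iteration per output pixel.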
Second, and the recommended approach, you can leverage the enormous amount of effort the image analysis folks have put into this problem. Your 2D data set is the equivalent of a grayscale image (a 2D array of values). Find a library of image manipulation routines (e.g. ImageMagick - http://www.imagemagick.org/), call it from LabVIEW to do the resize, then plot the result to the intensity graph (or picture control). You can make the call to the image manipulation routines using either the Call Library Function Node (for DLLs) or System Exec.vi (for command-line utilities).
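For the System Exec.vi route, the command line you would hand to ImageMagick looks like the one built below (sketched in Python rather than LabVIEW; the file names are placeholders, and running it of course assumes ImageMagick is installed):

```python
import subprocess

def build_resize_cmd(src, dst, width, height):
    """Build an ImageMagick 'convert' command that resizes an image.

    The '!' geometry flag forces the exact width x height, ignoring
    aspect ratio -- useful when the target is the fixed pixel size of
    the intensity graph's plot area.
    """
    return ["convert", src, "-resize", f"{width}x{height}!", dst]

# System Exec.vi would receive this same command line as one string;
# from Python you would run it with subprocess (commented out here
# because it needs ImageMagick and a real input file):
# subprocess.run(build_resize_cmd("data.pgm", "small.pgm", 640, 480),
#                check=True)
```

The resized grayscale file can then be read back into a 2D array and wired to the intensity graph.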
Good luck. If you find a good solution, please post it so we can all use it. This subject comes up every once in a while. Thanks!