Hi,
I encountered something that is, IMHO, a bug, or at least wrong behaviour. When displaying a large array of (very noisy) data in an intensity graph, it shows wrong apparent intensities. As an example, the cross visible in the top-left image is NOT a region of higher average intensity, although it appears to be so in the intensity graph; it only has a stronger noise component. When zooming in, the effect disappears.

The reason seems to be that when the intensity graph downsamples the data to fit on the screen, it keeps the local MAXIMUM value. I simulated this in the bottom-left image, where I downsample the image keeping the maximum value of each block of pixels. The IMAQ indicator behaves correctly, as does my own implementation of downsampling using decimation and randomizing (bottom right).

You can try it for yourself with this VI: if you generate a large array (say 1K x 1K) of random numbers between 0 and 1 and plot it in a relatively small intensity graph, you would expect to see gray, i.e. an equal mixture of black and white. Instead you get an even white, indicating an average value of 1 instead of the correct 0.5. Any opinions or solutions are greatly appreciated.
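For anyone who wants to check the numbers without the VI, here is a rough NumPy sketch of the same experiment (just my simulation of what the graph seems to do, not its actual code; the 1000x1000 array and 10x10 block size are arbitrary example values):

import numpy as np

rng = np.random.default_rng(0)
img = rng.random((1000, 1000))   # 1K x 1K uniform noise in [0, 1], true mean ~0.5

block = 10                       # shrink 1000x1000 down to 100x100 "pixels"
blocks = img.reshape(100, block, 100, block)

# Max-based downsampling (what the intensity graph seems to do): each
# 10x10 block holds 100 uniform samples, so its maximum is almost
# surely close to 1 and the whole display comes out nearly white.
max_ds = blocks.max(axis=(1, 3))
print(max_ds.mean())             # ~0.99 -> the "even white" I see

# Plain decimation (keep one pixel per block) preserves the statistics:
dec_ds = img[::block, ::block]
print(dec_ds.mean())             # ~0.5 -> the correct mid-gray

# Randomized decimation (pick a random pixel from each block), like my
# own implementation in the bottom-right panel; also ~0.5 on average:
i = rng.integers(0, block, size=(100, 100))
j = rng.integers(0, block, size=(100, 100))
rand_ds = blocks[np.arange(100)[:, None], i, np.arange(100)[None, :], j]
print(rand_ds.mean())            # ~0.5

The max-based result is biased toward 1 for any noisy data, which is exactly why the cross looks brighter even though its average intensity is not higher.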
Manu.
Certified LabVIEW Developer (CLD)