hi all,
I am reading waveforms from an oscilloscope. Once the waveform is flat, the computer asks a pulse generator to send out a pulse. To determine whether the waveform is flat, I use the standard deviation as an indicator: if the standard deviation exceeds 20% of the average, the computer treats the waveform as not flat.
I saw there seems to be an already-written VI for this, but I can't find it. I also wonder how long it would take to calculate the standard deviation of 500 data points.
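To make the criterion concrete, here is a rough sketch of the flatness test in Python rather than LabVIEW (the function name, the 20% threshold parameter, and the zero-mean handling are just my own placeholders):

```python
import statistics

def is_flat(samples, threshold=0.20):
    """Treat the waveform as flat when the standard deviation is
    no more than `threshold` (20%) of the average amplitude."""
    mean = statistics.fmean(samples)
    if mean == 0:
        # Placeholder choice: a zero average makes the ratio undefined,
        # so treat that case as not flat.
        return False
    stdev = statistics.pstdev(samples)
    return stdev <= threshold * abs(mean)

# A steady trace of 500 points around 1.0 should count as flat,
# while a trace swinging between -1 and +1 should not.
steady = [1.0 + 0.01 * ((i % 3) - 1) for i in range(500)]
swinging = [(-1.0) ** i for i in range(500)]
print(is_flat(steady), is_flat(swinging))
```

For only 500 points this is a single pass over the data, so on any modern PC the calculation itself should take well under a millisecond; the instrument I/O will dominate the loop time.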
thanks,
tao