I have just fallen into a hidden trap while using a point-by-point standard deviation VI that comes with LabVIEW. These VIs have an input terminal where you can specify the puffer size of the samples (sample length) that the VI calculates the SD from. The problem is that if we start with, for example, a puffer size of 5 and later want to change this value to 100, the VI does not update. It keeps using the value 5!
Once its internal queue is full, this VI does not accept any new value on the "sample length" input terminal. Of course, if you initialize the VI, it starts using the new puffer size. But only then.
Maybe I was just not aware of this; I thought changing the puffer size would be enough.
In my opinion, this hidden trap should be stated in the help for the VI, or the VI should be modified so that it behaves properly when the sample length value changes.
Did you try toggling the initialize input from T back to F along with the sample length change? By 'puffer' did you actually mean 'buffer'?
I guess you did not read my entire post. I do know that I can initialize the VI. But in my opinion, changing the sample length value should always update the inner state of this VI.
And yep, 'puffer' is my "English" level... 🙂
I've seen two ways people handle the problem of "a VI in vi.lib doesn't work how I want":
1. Change the VI in vi.lib. (Note that this changes the behavior for every project on that machine, and an upgrade or reinstall can overwrite your edit.)
2. Create your own version of the VI, and only use that one.
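Since LabVIEW code is graphical and can't be pasted here, here is a rough text-language sketch (Python) of what option 2 amounts to: a point-by-point standard deviation that re-reads the sample length input on every call, so a change takes effect immediately instead of being latched at initialization. The class name, the trimming behavior, and returning 0.0 while fewer than two samples are buffered are all my own assumptions for the sketch, not the stock VI's implementation.

```python
import statistics
from collections import deque

class PointByPointStdDev:
    """Sliding-window standard deviation, point by point.

    Unlike the behavior described above, sample_length is honored on
    every call: shrinking the window immediately drops the oldest
    samples, and growing it lets the buffer fill further.
    """

    def __init__(self):
        self._buf = deque()

    def __call__(self, x, sample_length):
        self._buf.append(x)
        # Trim to the *current* sample length, not the one seen at init.
        while len(self._buf) > sample_length:
            self._buf.popleft()
        if len(self._buf) < 2:
            return 0.0  # arbitrary choice for this sketch
        return statistics.stdev(self._buf)

    def initialize(self):
        """Equivalent of wiring T to the initialize input: clear state."""
        self._buf.clear()
```

For example, after feeding the values 1..5 with a sample length of 5, calling the object with a sample length of 3 immediately restricts the calculation to the newest three samples, which is exactly the "update on the fly" behavior being asked for.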