LabVIEW

separate thread to monitor PXI-6115

I am using a PXI-6115 multifunction analog I/O card to monitor a power supply to verify regulation. I am monitoring 4 different points at a resolution of 1000 S/s each. I would like to check each point and determine whether it is within an acceptable window of operation, both above and below the voltage it is supposed to be operating at. If it is out of regulation for a specified period of time, I want to write a chunk of the waveform to a spreadsheet file for analysis.

I have been able to successfully complete the majority of this. The only problem is that I want all of this to happen in the background so that I can also execute numerous other VIs to change various parameters, etc., in order to stress the supply. However, I run into problems because some of the voltage levels are customizable, so the thresholds must be changed in order to check the new voltage for regulation. When I change these thresholds, they change immediately, but the NI-DAQ AI Read VI is sometimes a couple of seconds behind. So I am comparing the old voltages against the new thresholds, and it tells me that my supply is out of regulation when it is actually functioning correctly; I am simply looking at old data.

My question is...
Is multithreading somehow the best solution to this problem? Should I create a separate thread for my supply-monitoring VI and give it a higher priority than my test thread, so that the AI Read VI is serviced often enough to keep the buffer caught up and always read current data? Or is there a non-buffered AI VI that will work better for this situation? Or am I simply asking too much to have it read 4000 S/s, make comparisons, and log data all in the background while doing other stuff?
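
For context, the structure I have in mind is basically a producer/consumer arrangement: a dedicated monitoring loop that empties the acquisition buffer once per second while the rest of the test code runs independently. A rough sketch of the idea, written in Python purely as pseudocode for the LabVIEW structure (the read call is simulated, and all names are made up):

    import threading
    import queue
    import random
    import time

    data_q = queue.Queue()   # out-of-regulation blocks waiting to be logged

    def read_samples(n):
        # Stand-in for the buffered AI Read: fabricates n scans around 5 V
        # so the sketch runs without hardware.
        return [5.0 + random.gauss(0, 0.05) for _ in range(n)]

    def monitor_loop(stop_event, low, high):
        # Dedicated monitor loop: pull a full second of data each pass so the
        # acquisition buffer is emptied and comparisons always see fresh data.
        while not stop_event.is_set():
            block = read_samples(1000)
            if any(not (low <= s <= high) for s in block):
                data_q.put(block)    # hand the suspect second to a logger

    stop = threading.Event()
    monitor = threading.Thread(target=monitor_loop, args=(stop, 4.75, 5.25), daemon=True)
    monitor.start()

    # Main "test" loop keeps changing parameters while monitoring runs behind it.
    time.sleep(5)
    stop.set()
    monitor.join()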
Message 1 of 5
It sounds like the comparison of the acquired data to the thresholds is not happening as efficiently as it could. The cause is most likely in the implementation of your comparison algorithm: something is keeping your acquisition loop from keeping up with the live input, so the DAQ Read function is pulling scans from deep in its buffer.

4 kS/s is relatively slow for DAQ applications. Multithreading is helpful, but it cannot protect you from non-optimal code. If you can post your VIs here, I'll take a look at them and see what might be causing the problems.

Daniel L. Press
Certified LabVIEW Developer
PrimeTest Corporation
www.primetest.com
Message 2 of 5
I am currently revising my comparison algorithm to try to make it a little more efficient, but I still think that devoting more processor time to this is the solution, since this is time-critical while controlling the different variables in the foreground is not necessarily a high priority. All I am doing is a simple greater-than or less-than comparison on each element returned from the read, together with a bounded count of whether or not it is in bounds. If it is out of bounds long enough for the bounded counter to go beyond the user-specified value, the input for that second is appended to an array that will later be written to a spreadsheet for analysis. It is nothing too elaborate; however, since it will potentially be executed up to 4000 times per second, it will require some attention from the processor. I am looking for any way to assure that my routine gets executed often enough, preferably emptying the buffer completely once per second.
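
In rough text form, one second's worth of that check looks something like this (Python used only as shorthand for the block diagram; the names, and resetting the counter when a sample comes back in bounds, are my own shorthand):

    def check_block(block, low, high, limit, log):
        # Per-sample comparison with a bounded out-of-range counter. 'block'
        # is one second of data (1000 samples) from one channel; 'log'
        # collects any seconds that violate regulation for the spreadsheet.
        count = 0
        for sample in block:
            if low <= sample <= high:
                count = 0              # back in bounds: reset the counter
            else:
                count += 1
                if count > limit:      # out of bounds for too long
                    log.append(block)  # keep this second for later analysis
                    return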
Message 3 of 5
For that comparison, I would use an "In Range and Coerce" function with the data ARRAY wired to the center terminal and the high and low bounds wired to the top and bottom terminals. The output will be a boolean array. Invert the boolean array by wiring in an Invert operator, then wire the inverted array to a "Boolean to (0, 1)" function. Wire the resulting integer array (of 0s and 1s) to an Add Array Elements function, which will output a count of how many points in the array were out of range.

If you're using a FOR loop for this, you're wasting processor cycles.
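
Written out in text form, that same chain of array operations looks like this (NumPy used here purely as an analogy for the wiring described above, not as LabVIEW code):

    import numpy as np

    def count_out_of_range(block, low, high):
        # Array-at-once equivalent of In Range and Coerce -> invert ->
        # Boolean to (0,1) -> Add Array Elements: no per-element loop.
        data = np.asarray(block)
        in_range = (data >= low) & (data <= high)   # boolean array
        return int(np.sum(~in_range))               # count of out-of-range points

    print(count_out_of_range([4.9, 5.0, 5.6, 4.2, 5.1], 4.75, 5.25))  # -> 2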

I hope that helps.

- Dan
Message 4 of 5
Thanks for the help...

The only problem with this approach is that the signals are fairly noisy, so the idea is to only consider it an error if a certain number of samples in succession are "out of bounds." In order to perform this check, a FOR loop is necessary to keep a count of successive out-of-bounds samples, correct?
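
To make the question concrete, the behavior I need is something like the moving-sum sketch below, where only n out-of-bounds samples in a row count as a violation; I just don't know whether that can be wired with array functions instead of a loop (Python as pseudocode only; the names are invented):

    import numpy as np

    def has_consecutive_violation(block, low, high, n):
        # Flag only when n samples IN A ROW are out of range, so isolated
        # noisy points don't trigger a false error.
        data = np.asarray(block)
        out = ((data < low) | (data > high)).astype(int)
        if len(out) < n:
            return False
        # A moving sum equal to n means n consecutive out-of-range samples.
        window_sums = np.convolve(out, np.ones(n, dtype=int), mode="valid")
        return bool(np.any(window_sums >= n))

    # One noisy spike (False) versus three in a row (True):
    print(has_consecutive_violation([5.0, 5.6, 5.0, 5.1, 5.0], 4.75, 5.25, 3))
    print(has_consecutive_violation([5.0, 5.6, 5.7, 5.8, 5.0], 4.75, 5.25, 3))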

Thanks for all the help

-Jake
Message 5 of 5