
USB-6001, maximizing Analog In/Out throughput

Hello all,

 

I'd like to use a USB-6001 to read an analog voltage, apply a gain and offset in LabVIEW, and output the new voltage as quickly as possible. This would be used to condition a position feedback signal for use in a real-time 1 kHz control loop. This will of course introduce some phase lag, but hopefully not a significant amount. Is this a reasonable idea? How can I best optimize the DAQmx VIs for speed? I'm running in Continuous Mode, but am unsure how best to set up the clocks and buffers. I also seem to be able to specify a max sample rate of only 5,000 S/s; I thought this device was good for 20,000 S/s?
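
To make the idea concrete, here's roughly what I have in mind, sketched with NI's nidaqmx Python API just because it's easier to post as text than G ("Dev1", the channel names, and the gain/offset values are all placeholders -- this is the simplest software-timed version, not necessarily what Continuous Mode would look like):

import nidaqmx

GAIN = 0.5     # placeholder gain
OFFSET = 5.0   # placeholder offset, volts

with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0", min_val=-10.0, max_val=10.0)
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0", min_val=0.0, max_val=10.0)
    while True:                      # hoping for ~1 kHz through here
        v = ai.read()                # single-sample, software-timed read
        ao.write(GAIN * v + OFFSET)  # single-sample, software-timed write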

 

Thanks,

Dan

 

Message 1 of 4

Sorry, but you really aren't going to get there from here with your simple USB board.  There are a lot of different reasons; here are some:

 

1. USB has too much latency for a 1 kHz control loop.  You'd need something more like a desktop PCI-express card to get your latency down.

 

2. Even then, Windows will not reliably run a 1 kHz control loop.  It may *often* meet the 1 msec loop period target, but you won't be able to count on it.

 

3. Continuous buffered input is an option, but your output needs to *not* be buffered.  Buffered output is another source of latency -- the time between calculating and writing a sample value to the buffer and when it's that sample's turn to be D/A converted.
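
If it helps, here's the shape of that in the DAQmx Python API (illustration only -- "Dev1" is a placeholder, and the equivalent calls exist as DAQmx VIs in LabVIEW):

import nidaqmx
from nidaqmx.constants import AcquisitionType

ai = nidaqmx.Task()
ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
# Hardware-clocked, continuously buffered input is fine:
ai.timing.cfg_samp_clk_timing(5000.0, sample_mode=AcquisitionType.CONTINUOUS)

ao = nidaqmx.Task()
ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
# Deliberately NO cfg_samp_clk_timing() on the output task.  Left
# software-timed, each write() goes straight to the D/A converter
# instead of waiting its turn in an output buffer.
ao.write(2.5)

ai.close()
ao.close()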

 

 

More info:  the 5000 S/s limit may be caused by having multiple AI channels in your task.  The 20000 S/s spec is an *aggregate* max that gets divided among the channels in the task.  I'm guessing you have 4 (or possibly 3) AI channels in your task (20000 / 4 = 5000).
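
You can ask the driver directly, too -- e.g. in the Python API (if memory serves, the same value is exposed as a DAQmx Timing property node in LabVIEW; "Dev1" is a placeholder):

import nidaqmx

with nidaqmx.Task() as task:
    for ch in ("ai0", "ai1", "ai2", "ai3"):
        task.ai_channels.add_ai_voltage_chan("Dev1/" + ch)
    # On a 20 kS/s aggregate device, 4 channels in the task
    # should report 20000 / 4 = 5000 S/s per channel:
    print(task.timing.samp_clk_max_rate)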

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 4

Thanks, Kevin.

 

Indeed, I just ran the setup using a 5 Hz sine wave as input, and the output had a varying 35-175 degrees of phase lag, mostly on the high end unfortunately. Sample rate and buffer size did affect the lag greatly.

 

More about my setup:

I'm running an external PID controller, and I'd like LabVIEW to process the position feedback voltage with a gain and offset. For example, the input could be a +/-10 V single-ended (SE) voltage, and I need the output to be a 0-10 V SE voltage.
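
(For the record, that particular mapping works out to a gain of 0.5 and an offset of 5 V -- a quick sanity check:)

# out = gain * in + offset, mapping [-10, +10] V onto [0, 10] V:
#   gain   = (10 - 0) / (10 - (-10)) = 0.5
#   offset = 0 - gain * (-10)        = 5.0
def scale(v_in):
    return 0.5 * v_in + 5.0

assert scale(-10.0) == 0.0
assert scale(10.0) == 10.0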

 

This was an attempt at an "easy" solution before I rig up some op-amp circuits on a breadboard. What DAQ hardware, if any, would make this feasible? Our lab could use such a gadget to process analog signals instead of designing and building custom circuits each time.

 

Thanks again,

Dan

Message 3 of 4

If I understand you right, you're running a slower outer control loop to compute a varying setpoint and then you've got an external PID controller that's running a faster loop to lock onto and track your setpoint.  Hopefully, such a setup is a little more forgiving of timing variability in your software loop.

 

As to DAQ hardware, I'd say offhand that one of the PCI-express X-series multifunction boards in a desktop PC would give you quite a lot of varied capability.  For better advice, have a conversation with the NI technical sales person assigned to your area.  They'll know more details about a broader range of options for you to consider.

 

With programming care, a desktop plugin DAQ board can probably do pretty well into the low 100's of Hz and kinda sorta ok as you approach 1 kHz.  Beyond that, you're increasingly at the mercy of the OS.  It always thinks it has lots of other things to do for you and was not designed to parcel out tiny little sub-msec chunks of processor time in a repeatable & reliable way.

 

It isn't that LabVIEW itself is slow to execute.  Simple math loops can run at speeds well into the MHz.  When you bring in hardware, the DAQmx driver calls will start costing maybe 10's of microsec, dropping you to a max loop rate in the 10's of kHz.  But you'll sometimes get starved out by the OS and find an iteration that takes as much as 10's of millisec.
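
Easy enough to see for yourself by timing iterations and keeping the worst case -- a plain Python sketch of the idea (in LabVIEW you'd do the same thing with a Tick Count or High Resolution Relative Seconds node):

import time

N = 10_000
worst = 0.0
t_prev = time.perf_counter()
for _ in range(N):
    # ... the DAQ read / math / write would go here ...
    t_now = time.perf_counter()
    worst = max(worst, t_now - t_prev)
    t_prev = t_now
print("worst iteration: %.3f ms" % (worst * 1e3))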

 

 

You'll also likely need some live display, data logging, and other things that will further chew into your realistic looping speeds.  In the LabVIEW world, the next step toward reliable control loop timing would be either LabVIEW Real-Time or LabVIEW FPGA.  Just realize that each has a distinct additional learning curve even after being proficient with regular LabVIEW.

 

 

-Kevin P

Message 4 of 4