
Force Sensor Data Collection Delay

Hello all,

 

I recently posted here: https://forums.ni.com/t5/LabVIEW/Sampling-rate-problem-with-USB-6211-DAQ-and-PID-VI/m-p/3961463?prof... about a sampling-rate problem I was having with my program; thank you to those who responded! Using those suggestions, the revised program (attached as Force Control.vi) can now sample at up to 50 Hz (the frequency of the servo).

 

However, this force control code is part of a larger program that uses a camera (a Microsoft LifeCam Studio webcam) to track the diameter of the object exerting force on the force sensor (a sample picture of the camera output is attached). When the camera code is added to the program alongside the force sensor code, the force sensor's response is delayed. For example, if an impulse is exerted on the force sensor, the program does not record it until about 1 second after it happens in real time. The delay grows as the sampling rate increases. Any ideas on why this could be happening?

 

I’ve attached the full code (Linear Force Control.vi) as well as the code for the camera readings (Camera Only.vi).

 

Thanks in advance!

 

P.S. I’ve inherited the camera code, so apologies if it contains any errors.

Message 1 of 2

Two main points:

1. You should have a separate loop for all the camera and vision stuff.  You aren't using any of the info for control so there's no need to slow down your DAQ & PID loop with it.
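LabVIEW loops are graphical, so there is no text code to show, but the parallel-loop idea can be sketched as a rough Python analogy. Everything below is illustrative (placeholder reads, made-up timings), not the poster's VI: the point is only that the fast DAQ/PID loop and the slow camera loop run independently, so slow vision processing cannot stall the acquisition rate.

```python
# Python analogy of LabVIEW's parallel-loop pattern: the DAQ/PID loop
# runs at its own rate while the slow camera loop runs independently.
# All names, rates, and sleeps here are illustrative placeholders.
import queue
import threading
import time

measurements = queue.Queue()   # DAQ loop -> any consumer that needs the data
stop = threading.Event()

def daq_pid_loop():
    # Fast loop: read a force sample, run PID, drive the servo.
    while not stop.is_set():
        sample = 0.0               # placeholder for a DAQmx-style read
        measurements.put(sample)
        time.sleep(0.02)           # ~50 Hz loop rate

def camera_loop():
    # Slow loop: grab a frame and measure the diameter.
    while not stop.is_set():
        time.sleep(0.5)            # webcam grab + vision processing is slow

t1 = threading.Thread(target=daq_pid_loop)
t2 = threading.Thread(target=camera_loop)
t1.start()
t2.start()
time.sleep(1.0)
stop.set()
t1.join()
t2.join()
print("collected", measurements.qsize(), "force samples")  # ~50, unaffected by the slow camera loop
```

In LabVIEW terms this is just two independent while loops on the same block diagram, passing data through a queue if the camera loop ever needs the force readings.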

 

2a. As you've found, when your loop rate can't keep up with your data acq rate, your choice to read 1 sample at a time can come back and bite you.  As your loop runs slower than the sample rate of the task, you build up a backlog of DAQ data and keep picking off the 1 single oldest sample.

    Because you've set your nominal loop rate equal to your sample rate via the use of "Wait (msec)", you're also preventing yourself from catching back up.  A better method when using "Wait (msec)" is to request *multiple* samples per loop iteration.  Specifically, wire in the special value -1 which means "read all available (a.k.a. backlogged) samples."

   Nominally, you can expect most iterations to give you 1 sample.  But if you do fall behind briefly, this approach will catch you back up.
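The backlog arithmetic is easy to see in a toy simulation (plain Python, no hardware, made-up rates): when the loop runs even slightly slower than the task's sample clock, reading 1 sample per iteration lets the buffer grow without bound, while a "read everything available" call (the -1 value) empties it each time.

```python
# Toy simulation of DAQ buffer backlog. samples_per_iter > 1 models a
# loop that runs slightly slower than the task's sample rate.
def simulate(read_all, n_iters=100, samples_per_iter=1.25):
    backlog = 0.0
    for _ in range(n_iters):
        backlog += samples_per_iter                 # task keeps acquiring
        backlog -= backlog if read_all else 1       # -1 empties the buffer; 1-sample reads don't
    return backlog

print("backlog after 100 iters, 1 sample/read :", simulate(False))  # 25.0 -- grows every iteration
print("backlog after 100 iters, read-all (-1) :", simulate(True))   # 0.0  -- always caught up
```

That growing backlog is exactly the "impulse shows up 1 second late" symptom: the sample you read each iteration is the oldest one in the buffer, not the newest.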

 

2b. Alternately, you could remove the "Wait" entirely and let the 1-sample call to DAQmx Read drive your loop timing. 
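The same idea in sketch form, again as a hedged Python analogy rather than LabVIEW code: with no separate wait, the blocking read itself paces the loop, because each call returns only when the hardware-timed task has the next sample ready. `read_one_sample` below is a hypothetical stand-in for that blocking read.

```python
# Letting a blocking read pace the loop (no separate "Wait" call).
# read_one_sample() is a hypothetical stand-in for a hardware-clocked
# read that blocks until the next sample is available.
import time

def read_one_sample():
    time.sleep(0.02)    # stand-in for waiting on a 50 Hz sample clock
    return 0.0

t0 = time.monotonic()
samples = [read_one_sample() for _ in range(10)]
elapsed = time.monotonic() - t0
print(f"10 reads took {elapsed:.2f} s")  # ~0.2 s: the read sets the pace
```

Because the read blocks until data is ready, the loop can never outrun the task, and there is no backlog to pick through one stale sample at a time.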

 

Either timing approach could work fine with a desktop card.  I'm less sure whether they'll be similarly reliable over USB.  USB does pretty well at bulk streaming, less well at low-latency frequent transmission of small packets of data.  It's not a good choice for real-time control, so realize ahead of time that you may hit a wall due to the choice of USB.

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 2