A (should be) simple DAQmx question - take finite samples in software timed loop

Solved!

Usually I use a separate DAQmx loop dedicated exclusively to the acquisition task and send the data via a Queue. In such cases I simply use continuous sampling; a very simple solution.

However, I wonder what the "official" procedure is when we want to take finite samples iteratively in a software-timed loop that also contains other tasks (RS232, etc.). Simply put, I want to sample something like 100 values per channel per iteration and keep the loop iterating at a certain rate (for example 1000 msec; 1-10 msec of jitter is OK in this case). I know that the other tasks will not take more than 400-500 msec to execute, so it is convenient to keep a 1000 msec iteration period.

Please have a look at the (unfinished) code snippet below; is it OK to set it up this way? I keep realizing my DAQmx knowledge is just way too limited 🙂

Thanks!

 

[Attached snippet: daqmxq.png]
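For reference, here is a rough text-language equivalent of the pattern being asked about, sketched with NI's nidaqmx Python package (the device name "Dev1" and the channel count are assumptions, not taken from the snippet):

import time
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")   # hypothetical channels
    task.timing.cfg_samp_clk_timing(
        rate=1000.0,                         # 1 kHz hardware sample clock
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=100,                  # 100 samples per iteration
    )
    while True:                              # software-timed main loop
        t0 = time.monotonic()
        task.start()
        data = task.read(number_of_samples_per_channel=100)  # blocks ~100 ms
        task.stop()                          # re-arm for the next finite burst
        # ... RS232 and other work here (known to take < 400-500 ms) ...
        time.sleep(max(0.0, 1.0 - (time.monotonic() - t0)))  # ~1000 msec period

The explicit Start/Stop pair mirrors the snippet; the replies below discuss whether it is actually needed.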

Message 1 of 16

You don't actually need the Start Task and Stop Task. For a finite acquisition, DAQmx Read will auto-start the task, and the task completes on its own once all of the samples have been read. Otherwise, I don't see anything actually wrong.


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 2 of 16

I usually set the Commit state with DAQmx Control Task.vi before the loop. It moves some of the task initialization steps (resource reservation and hardware programming) outside of the loop, as sketched below.
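A hedged sketch of the same idea in nidaqmx Python (device and channel names are assumed): committing once up front means each Start only has to move the task from Committed to Running, instead of re-reserving and re-programming the hardware every iteration.

import nidaqmx
from nidaqmx.constants import AcquisitionType, TaskMode

task = nidaqmx.Task()
task.ai_channels.add_ai_voltage_chan("Dev1/ai0")         # hypothetical channel
task.timing.cfg_samp_clk_timing(1000.0,
                                sample_mode=AcquisitionType.FINITE,
                                samps_per_chan=100)
task.control(TaskMode.TASK_COMMIT)   # reserve + program hardware once, before the loop

for _ in range(10):
    task.start()                     # cheap: Committed -> Running
    data = task.read(number_of_samples_per_channel=100)
    task.stop()                      # falls back to the Committed state

task.close()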

Message 3 of 16

I'd definitely get rid of the wait inside the loop. The DAQmx Read waits until the samples to be read have been acquired and throttles the loop perfectly. Adding a second clock source is just asking for trouble. The words of Ben Franklin come to mind: "A man with one watch knows what time it is; a man with two is never quite sure."
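As a minimal sketch of "let the read throttle the loop" (nidaqmx Python, names assumed): with continuous sampling at 100 S/s, reading 100 samples per iteration blocks until they have been acquired, so the loop averages one iteration per second with no Wait function at all.

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")     # hypothetical channel
    task.timing.cfg_samp_clk_timing(100.0, sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    while True:
        data = task.read(number_of_samples_per_channel=100)  # blocks until 100 new samples
        # ... other work here; samples keep buffering meanwhile, so the
        # average loop period stays locked to the 1 s hardware pace ...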

 

You know you should just do it my way. 😄


"Should be" isn't "Is" -Jay
Message 4 of 16
Solution
Accepted by topic author Blokk

I tend to go with a different approach, which is a little more flexible in a particular way. It can be really useful sometimes, but can also be unimportant or even detrimental. It just depends.

 

When I need snapshots of discontinuous data, I configure for continuous sampling and for allowing overwrite.  I also configure to read relative to the sample being taken *right now* instead of relative to wherever my last read left off.  When set up this way, I can immediately retrieve a fixed # of samples representing the most recent stuff that was happening at the sensor.  And during the times I'm not peeking at the data, the task will happily overwrite the buffer in the background without throwing errors.

 

I've never had a problem with this approach being a noticeable detriment, but can imagine how continuous streaming could waste resources in apps where data peeks are relatively rare.

 

Below is a code snippet to illustrate how to do what I'm talking about:

 

 

-Kevin P

  

[Attached snippet: Config AI to read recent.png]
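For those following along without LabVIEW, here is a text-language sketch of the same configuration in nidaqmx Python (channel name assumed): continuous sampling, overwriting allowed, and reads taken relative to the most recent sample.

import nidaqmx
from nidaqmx.constants import AcquisitionType, OverwriteMode, ReadRelativeTo

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")     # hypothetical channel
    task.timing.cfg_samp_clk_timing(1000.0, sample_mode=AcquisitionType.CONTINUOUS)

    # Let the circular buffer overwrite unread data without raising errors.
    task.in_stream.overwrite = OverwriteMode.OVERWRITE_UNREAD_SAMPLES
    # Read relative to the newest sample, backed up 100 samples.
    task.in_stream.relative_to = ReadRelativeTo.MOST_RECENT_SAMPLE
    task.in_stream.offset = -100

    task.start()
    while True:
        # ... other loop work; the task streams in the background meanwhile ...
        latest = task.read(number_of_samples_per_channel=100)  # newest 100 samples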

 

 

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 5 of 16

@JÞB wrote:

I'd definitely get rid of the wait inside the loop. The DAQmx Read waits until the samples to be read have been acquired and throttles the loop perfectly. Adding a second clock source is just asking for trouble. The words of Ben Franklin come to mind: "A man with one watch knows what time it is; a man with two is never quite sure."

 

You know you should just do it my way. 😄