

Loop triggering based on buffer fill

I am using USB-6009s in a relatively simple DAQ configuration. I want to grab, say, 5000 samples at a given time, write those samples to a file, and ideally begin collecting another 5000 samples while I am writing the previous set to a file. I'd like the datastream to be as continuous as possible; really, I'd like the file write to be triggered once the buffer is filled. So far I have a timed loop with the usual DAQmx suspects: the loop kicks off the data collection, and the file write occurs once the data collection is complete. Of course there is no link between loop timing and buffer filling other than basic math to confirm that sample rate, sample count, and timed loop period all work out to match. This method seems like it can easily drop samples or otherwise lose datastream timing. I'm sure there is a better way, but I can't find a good example that works with this hardware - could easily be looking in the wrong place though. Any help would be greatly appreciated.

 

-Chris

Message 1 of 12
(3,753 Views)

Do a Web Search for "Learn 10 NI-DAQmx Functions" and you'll find a great White Paper on DAQmx.  It's part of a series NI has written about DAQ.  Among other things, it explains that if you set up your DAQ AI Task to collect 5000 samples at Frequency F, and do it continuously, you'll achieve exactly what you describe, a continuous sampling with the "DAQ Read" function coming up for air after every 5000 samples are ready to write.

 

With this, you want to "export" the data and not try to handle it in the same loop you are using to acquire the data -- you want to take full advantage of the "free time" while waiting for those 5000 points to be acquired.  One way to do this is with a Design Pattern called the Producer/Consumer (Data) pattern.  To see how this works, open a new VI's Block Diagram, go to File, New ... (the dots are important), and look at the Templates, especially "Producer/Consumer Design Pattern (Data)".
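Since the template itself is a block diagram, here is the same idea sketched as a Python analogue (simulated data stands in for DAQmx Read, and all names are illustrative): the producer loop hands each 5000-sample chunk to a queue, and a separate consumer loop dequeues and logs it, so acquisition never waits on the file write.

```python
import queue
import threading

CHUNK = 5000   # samples per read, as in the original post
N_CHUNKS = 4   # keep the demo short

def producer(q):
    # Stand-in for the DAQ loop: each iteration represents one
    # "DAQmx Read" returning CHUNK samples from the hardware buffer.
    for i in range(N_CHUNKS):
        data = [i] * CHUNK          # simulated samples
        q.put(data)                 # hand off to the consumer
    q.put(None)                     # sentinel: acquisition finished

def consumer(q, out):
    # Stand-in for the logging loop: blocks until a chunk arrives,
    # then "writes" it (here, appends to a list instead of a file).
    while True:
        data = q.get()
        if data is None:
            break
        out.append(len(data))

q = queue.Queue()                   # unbounded, like a LabVIEW queue by default
written = []
t = threading.Thread(target=producer, args=(q,))
t.start()
consumer(q, written)
t.join()
print(written)                      # [5000, 5000, 5000, 5000]
```

The LabVIEW template works the same way: the DAQmx Read loop only enqueues, the logging loop only dequeues, and the queue absorbs any jitter in file-write timing.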

 

Bob Schor

Message 2 of 12
(3,724 Views)

Bob-

 

Thanks, good feedback. I was referencing that article, but I was seeing logged data timestamps between sampling intervals that looked more like the loop was iterating as quickly as possible and just pulling the most recent x samples from the buffer. Not sure if this is because I was doing more in the loop (the file write) or something else - or is a loop even needed if the DAQ AI task is set up to be continuous? I'll revisit and check out producer/consumer design patterns, thanks.

 

-Chris

Message 3 of 12
(3,721 Views)

Re-reading the section on DAQmx Read (7), it isn't obvious to me that the example they've shown has a loop that is triggered by a buffer filling - is that what is occurring? Like I said above, I had that configuration and the timestamps were not as expected.

Message 4 of 12
(3,717 Views)

Chris,

     It is difficult to make suggestions about "invisible code".  Attach the VI that is misbehaving, and surely one of us can help.  Be sure to attach the actual VI (extension .vi), as we need to be able to modify and run the code.

 

Bob Schor

Message 5 of 12
(3,672 Views)

Bob-

 

     Thanks, code attached. I think I have the producer/consumer functionality set up and working - at least the timestamps appear to check out in my output file. My biggest concern right now is synchronizing the 3x USB-6009s that I am using for 24x AIs - a compromised hardware solution for sure, but the best we can do for the time being. Right now I am assuming that they are synced, and if you dig you can see that I am throwing away the timestamps from two of the three DAQs, but I'm guessing there is a way to guarantee that they are synced. You will also see that I am pulling in 4x USB-TC01s that are running at their max 4 Hz data rate. I am much less concerned about time-syncing to these (as you can see), but I'd be curious to know if there are better solutions than what I've shown.

 

     Any help appreciated - or general feedback would be good as well. I've been using LabVIEW sporadically for a long time, but I am probably still less than a beginner.

 

-Chris

Message 6 of 12
(3,663 Views)

Only time for a couple quick reactions:

 

1. Your 3 main AI tasks aren't synced at the hardware level because they are not driven by any shared timing signals.  At a quick glance, it appears your devices may only support a shared start trigger.  It'd be a good step forward to use that, but it won't be a complete solution for syncing.  It can get the tasks started at the same time, but over a long period of time you can expect the sample clocks to show a little bit of skew.  Every so often, one of the tasks will get one sample ahead of or behind the others due to inherent tolerances in their clocks.  Just an FYI.
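For a rough sense of scale, here is a back-of-envelope calculation of that skew. The tolerance figure is an assumed example for illustration, not a USB-6009 spec:

```python
# Back-of-envelope: how fast two free-running sample clocks drift apart.
# The tolerance below is purely illustrative, not taken from a datasheet.
rate_hz = 1000          # nominal sample rate
tolerance_ppm = 100     # assumed worst-case relative clock error

# At 100 ppm relative error, the clocks accumulate 0.1 samples of
# offset per second, i.e. a full sample of skew roughly every 10 s.
samples_per_second_skew = rate_hz * tolerance_ppm / 1e6
seconds_per_sample_skew = 1 / samples_per_second_skew
print(seconds_per_sample_skew)   # 10.0
```

The point is that even with a perfectly shared start trigger, independent sample clocks will walk apart over a long acquisition.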

 

2. I see a lot of 2-input OR nodes feeding into other 2-input OR nodes.  Check out the "Compound Arithmetic" node for an expandable N-input OR operation.

 

3. Your logging loop would be improved if you opened the file once before the loop to get a file refnum, then use the file refnum to do writes while in the loop, and finally close it after the loop.  Very analogous to the structure you already have in place for your DAQ tasks.  The file write function you're using will open, write, and close the file repeatedly for every chunk of data.  All those extra opens and closes are wasteful.
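In text form, the difference is the same as in this Python sketch (an illustrative stand-in for the LabVIEW open-refnum/write/close pattern, with small dummy chunks in place of 5000-sample reads):

```python
import os
import tempfile

chunks = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]   # stand-ins for DAQ reads

path = os.path.join(tempfile.mkdtemp(), "log.csv")

# Open once before the loop (the "get a refnum" step), write each chunk
# inside the loop, and close once after -- rather than letting every
# iteration open, write, and close the file again.
with open(path, "w") as f:
    for chunk in chunks:
        f.write(",".join(str(x) for x in chunk) + "\n")

# Verify what landed on disk.
with open(path) as f:
    lines = f.read().splitlines()
print(len(lines))   # 3
```

One open and one close total, no matter how many chunks are logged, which is exactly the structure you already use for your DAQ task references.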

 

 

-Kevin P

 

 

P.S. Your DAQ reading loop and the producer/consumer queue arrangement is essentially sound.  You can expect a continuous data stream from those tasks without losing samples.  The loop timing is governed only by DAQmx Read, all the tasks are running at the same (nominal) rate, you're always requesting the same # samples from each, and the queue will grow as needed if the file writes lag behind.  All those things help make sure you won't get gaps in your data.

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 7 of 12
(3,659 Views)

Kevin-

 

    Thanks, any help is appreciated. Is there a good example demonstrating how to set up a shared (software?) start trigger? I tried using another example but I ran into limitations with this hardware. 

    I'll have to look into this file refnum thing - I thought that what I was already doing avoided multiple opens/closes. 

Message 8 of 12
(3,652 Views)

The start triggers will need to be shared via physical wiring and screwdriver work.  I would do it by generating a pulse on a DO pin right after starting the AI tasks.  I think there's only one specific pin that can be configured as an AI Start Trigger; you'll have to connect all those to the DO pin.

 

As to the file writing, the 'append' input makes it *seem* like it won't keep re-opening and closing the file, but drill down into it and you'll see that it does.

 

 

-Kevin P

Message 9 of 12
(3,646 Views)

Your example seems reasonable, except for a few remarks: the Write To Spreadsheet function, I think, opens and closes the file at every call, which is not very performant. Also, the 3 devices don't share a common timebase, so they might get out of sync and you might get sample overwrites. Why don't you create 3 separate loops for the devices and 3 separate queues (they'll each have their own timestamp anyway)?

 

 

Message 10 of 12
(3,635 Views)