
Saving data in 30- or 40-second intervals with millisecond accuracy

Solved!

Dear all, 

 

For a rewrite project my lab is working on, we are trying to take in analog voltage inputs at 15 kHz. We would then like to save this data for a specified amount of time, e.g. 40.00 seconds, then stop recording for a specified inter-trial interval, e.g. 20 seconds, and then start again. I have looked around the forum for ways to accomplish this, but it seems like most of the discussed ideas give precision only down to about 1 second. For our lab, it's crucial that we have time precision down to 1 ms.

My initial idea was that since DAQmx Read uses hardware timing (I think?) to pace the 15 kHz acquisition, it may be a good idea to program the following logic: we will record for 40 seconds, so let's say 15k*40 samples, then rest for 20 seconds, so let's throw away the next 15k*20 samples, and so on.
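
A rough Python sketch of that counting idea, just to make the arithmetic concrete (the chunk size and the read_chunk() placeholder are illustrative assumptions, not actual DAQmx behaviour):

SAMPLE_RATE = 15_000                      # 15 kHz
RECORD_S, DISCARD_S = 40, 20              # save 40 s, then rest 20 s
CHUNK = 1_000                             # samples returned per read (assumed)

samples_to_save = SAMPLE_RATE * RECORD_S      # 600,000 samples per trial
samples_to_discard = SAMPLE_RATE * DISCARD_S  # 300,000 samples per rest

def read_chunk():
    """Placeholder for one hardware-timed read of CHUNK samples."""
    return [0.0] * CHUNK

saved = discarded = 0
trial_data = []
while saved < samples_to_save or discarded < samples_to_discard:
    chunk = read_chunk()
    if saved < samples_to_save:
        # Keep only as many samples as the trial still needs, so the trial
        # length is exact even if CHUNK does not divide it evenly.
        take = min(len(chunk), samples_to_save - saved)
        trial_data.extend(chunk[:take])
        saved += take
        discarded += len(chunk) - take    # any overflow counts toward the rest
    else:
        discarded += len(chunk)

print(len(trial_data))                    # 600000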

The three problems I can think of off the top of my head are:

1) DAQmx's Read function doesn't put data into the buffer one sample at a time, so it may be difficult to always save the exact number of samples. I.e., if DAQmx pulls data from the buffer 1000 samples per iteration, it may be hard to save exactly 212,415,500 samples, for instance.

 

2) I am not sure if LabVIEW has a built-in save VI that allows me to specify the number of samples I want to save. Besides, these "samples" are aggregated into a waveform data type (well, a 1D array of waveforms, since we are acquiring more than one channel at a time).

 

3) The program also needs to do timed Digital Output. If we save data using this "counting by the sample" mechanism, timing the DO logic may have to be handled by separate logic(?).

 

Please tell me what you think of the idea. If it's no good, it would be great if someone could steer me towards a better approach.

 

Oh, and this is my first time coding in LabVIEW, so I am open to all kinds of style and programming suggestions/feedback.

 

As always, thanks for your help!

Message 1 of 14
(5,729 Views)

Sorry, there was one more subVI that I couldn't attach. Here it is:

Message 2 of 14
(5,728 Views)

You asked for comments.

  • Don't use Notifiers for a Producer/Consumer design -- they don't guarantee that elements are not missed.
  • I don't understand the idea of the double-switches.
  • I don't understand why you are using so many references instead of the controls/indicators themselves.
  • I don't understand why you build an array of booleans and then OR the array instead of using the "Compound Arithmetic" function from the Boolean palette to just OR the booleans directly.
  • I don't understand why your Notifier/Queue uses a Variant instead of using an Array of Booleans directly.  "Being Honest" saves time and promotes clarity.

Now for the question at hand.  You may know that although LabVIEW has Timing Functions, and Time is something LabVIEW treats directly, when running under Windows you cannot depend on LabVIEW's timing functions themselves for millisecond precision.  However, dedicated acquisition hardware (including "PC-like" devices such as PXI controllers, cRIOs, etc. that have real-time OSes) can clock things to the millisecond.  So you could, in principle, use your DAQ device (what is it, by the way?) as a "clock".

 

Suppose you set up a Producer/Consumer loop properly (with a Queue) and had the Producer "driven" by, say, a DAQ device producing 1000 samples at 1 kHz.  It would be functioning as an accurate 1-second "clock".  To take 40 seconds of data, wait 20 seconds, and take 40 more seconds of data, you could do the following:

  1. Set the Producer up to take 1000 samples at 1 kHz, becoming a 1-second "clock".
  2. Make the Consumer a State Machine that stays in the "Save this to disk" State for 40 "ticks", transitions to the "throw the data away" State for 20 ticks, then goes back to the "Save Data to Disk" State.  You'll need to open a uniquely-identified File for the first of the 40 Save States, and close the file at the start of the 20 "Discard Data" States.

It may be a good idea to think a bit more about this State Machine.  The more I think about it (without benefit of paper and pencil, so I'm only using a fraction of my feeble brain), the more I see the possibility of more States, including "Wait for Producer", "Open New File", "Save To Disk", "Close File", "Discard Data".
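
To make those States concrete, here is a minimal Python sketch of such a Consumer state machine.  Python is only a stand-in here (the real thing would be a LabVIEW case structure driven by a shift register), and the queue contents, file names, and tick counts are illustrative assumptions:

import queue

SAVE_TICKS, DISCARD_TICKS = 40, 20        # one queue element arrives per second

def consumer(data_queue: queue.Queue):
    state, tick, trial, f = "OPEN_FILE", 0, 0, None
    while True:
        if state == "OPEN_FILE":
            f = open(f"trial_{trial:03d}.txt", "w")     # uniquely-named file
            state = "SAVE"
        elif state == "SAVE":
            block = data_queue.get()                    # blocks ~1 s per element
            if block is None:                           # Sentinel (see below)
                break
            f.write(f"{block}\n")
            tick += 1
            if tick == SAVE_TICKS:
                state = "CLOSE_FILE"
        elif state == "CLOSE_FILE":
            f.close()
            tick, trial = 0, trial + 1
            state = "DISCARD"
        elif state == "DISCARD":
            block = data_queue.get()                    # still "clocked" by the Producer
            if block is None:
                break
            tick += 1
            if tick == DISCARD_TICKS:
                tick, state = 0, "OPEN_FILE"
    if f is not None and not f.closed:
        f.close()

Note that the "Wait for Producer" State is hidden inside the blocking queue read in this sketch.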

 

Have you thought about how you stop the Producer and Consumer pair?  I, personally, am fond of using a Sentinel, a recognizable "signal" that the Producer (who knows when it is time to stop "producing") sends to the Consumer.  When the Producer sends the Sentinel, it can exit.  Meanwhile, the Consumer just goes ahead "consuming" until it sees the Sentinel, at which time it exits and, since the Producer has already sent the last (Sentinel) item and has exited, it Releases the Queue, no longer needed.  Both loops stop, no errors are generated.
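
Here is a small Python illustration of that Sentinel handshake, with threads and a queue.Queue standing in for the two LabVIEW loops; the block contents are placeholders:

import queue
import threading

q = queue.Queue()
SENTINEL = None

def producer(n_blocks):
    for i in range(n_blocks):
        q.put(f"data block {i}")      # stands in for one second of samples
    q.put(SENTINEL)                   # "I'm done" signal, then the Producer exits

def consumer():
    while True:
        item = q.get()
        if item is SENTINEL:          # last item seen: exit cleanly, no errors
            break
        print("consumed", item)
    # In LabVIEW, the Consumer would now Release the Queue, which the
    # already-stopped Producer no longer needs.

p = threading.Thread(target=producer, args=(5,))
c = threading.Thread(target=consumer)
p.start(); c.start()
p.join(); c.join()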

 

Bob Schor

Message 3 of 14
(5,672 Views)

Hi,

Appreciate the detailed response and comments.

The DAQ device we are using is an NI PCIe-6535.

I chose to use notifiers because I specifically only wanted the latest element, namely the newest DISPLAY/SAVE setting.

I had to use the references because the Boolean button binding logic subVI works on the buttons and then alters the state of the buttons. The logic is: if SAVE goes from OFF to ON and DISPLAY is OFF, DISPLAY will be turned ON as well; and if DISPLAY goes from ON to OFF and SAVE is ON, SAVE will be turned OFF as well. Basically, there should never be a situation where SAVE is ON without DISPLAY being ON.
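
In Python terms, the rule is roughly the following (plain booleans stand in for the control references; the function name and example values are just for illustration):

def bind_buttons(save_prev, save_now, display_prev, display_now):
    """Enforce the invariant: SAVE may never be ON while DISPLAY is OFF."""
    if (not save_prev) and save_now and (not display_now):
        display_now = True            # SAVE went OFF -> ON: force DISPLAY ON
    if display_prev and (not display_now) and save_now:
        save_now = False              # DISPLAY went ON -> OFF: force SAVE OFF
    return save_now, display_now

# SAVE switched on while DISPLAY was off: DISPLAY follows it on.
print(bind_buttons(False, True, False, False))   # (True, True)
# DISPLAY switched off while SAVE was on: SAVE is forced off.
print(bind_buttons(True, True, True, False))     # (False, False)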

As for the Variant thing, I based the Producer/Consumer off of an example, so I can definitely go in and fix the redundancy.

I really like the idea of using the PCIe-6535 to time the loops. Can you tell me which subVIs are involved in using the DAQ device as my timing device?

Thank you for mentioning the Sentinel; I was actually also worried about stopping the process properly before we restart another experiment.

 

 

 

Message 4 of 14
(5,662 Views)

To use the DAQ device as a Clock, proceed as follows:

  1. Set the DAQ device to acquire at some sampling rate F (in Hz, e.g. F = 1000 for 1 kHz), set the Sample Size also to F, and set the mode to Continuous.
  2. Start the DAQ and enter the While Loop.
  3. Do a DAQ Read of F Samples (do not specify -1!).  This will cause the While loop to "block" for a second, until all of the Samples have been acquired.
  4. Put the Samples into the Producer Queue (do not use a Notifier, use a Queue).  Note that the DAQ Loop is functioning as a Producer, driving the Consumer that "does stuff" with the data.
  5. At this point, you are finished with the Producer loop.  When the While "loops", it will again wait a second, regular as clockwork.
  6. I already discussed a Consumer State Machine, with a Wait for Data State, which (when entered) will be "clocked" at 1-second intervals by the Producer.  So if the Wait increments a Shift Register, it can count the seconds for you, deciding whether the next State should be Open File (if the Clock is 0, 60, 120, etc.), Save (if the Clock is 1..40, 61..100, etc.), or Discard (if the Clock is 41..59, 101..119).  Note that Open File is immediately followed by Save, Save is followed by Wait (unless the Clock is 40, 100, etc., in which case it is followed by Close, which is followed by Wait), and Discard is followed by Wait.

I think that's right -- use Pencil and Paper and convince yourself.
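
For reference, here is roughly the same Producer/"clock" recipe sketched with NI's nidaqmx Python package.  It is only a sketch of the idea: the device/channel name "Dev1/ai0", the buffer size, and the 70-tick run length are assumptions, and error handling is omitted; in LabVIEW you would use the corresponding DAQmx Create Channel, Timing, and Read VIs.

import queue
import nidaqmx
from nidaqmx.constants import AcquisitionType

F = 1000                                     # sampling rate (Hz) and samples per read
data_queue = queue.Queue()

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")      # assumed device/channel
    task.timing.cfg_samp_clk_timing(
        rate=F,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=10 * F,               # buffer size hint, not the read size
    )
    task.start()
    for _ in range(70):                      # 70 "ticks" = 70 seconds of acquisition
        # Read exactly F samples; the call blocks until the hardware has
        # produced them, so each loop iteration lasts one second.
        block = task.read(number_of_samples_per_channel=F)
        data_queue.put(block)
    data_queue.put(None)                     # Sentinel for the Consumer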

 

Bob Schor

Message 5 of 14
(5,657 Views)

I have to confess I don't understand, at all, what those buttons are doing, but if it makes sense to you, OK.

 

BS

Message 6 of 14
(5,656 Views)

Hi,


Thanks. My understanding is that by doing what you described, I would be able to time my operations (saving, reading, digital output) by the second, with precision better than a millisecond.

What if I wanted to be able to manipulate the outputs to the millisecond, though? E.g., wait for 40.001 seconds. If my sampling rate has to be 15 kHz, doesn't that mean I need to set the samples to acquire to 15? But then I am counting on the fact that each iteration of the producer loop would take less than 1 ms, which seems like a big IF.

 

Message 7 of 14
(5,651 Views)

@RaymondLo wrote:


Thanks. My understanding is that by doing what you described, I would be able to time my operations (saving, reading, digital output) by the second, with precision better than a millisecond.

What if I wanted to be able to manipulate the outputs to the millisecond, though? E.g., wait for 40.001 seconds. If my sampling rate has to be 15 kHz, doesn't that mean I need to set the samples to acquire to 15? But then I am counting on the fact that each iteration of the producer loop would take less than 1 ms, which seems like a big IF.

 


Are you kidding me?  The DAQ card is plugged into your PC, so it takes microseconds, at most, to transfer the 15 samples.  Maybe it takes another microsecond (on a really, really slow PC) to enqueue them.  Let's see: 1000 microseconds minus a few microseconds leaves about 99% of the time in the Producer Loop free, with nothing to do but loop around and wait for the next tick.
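
If you want to convince yourself, that overhead is easy to measure.  In Python terms the check looks something like the sketch below (on a LabVIEW block diagram, the equivalent would be subtracting two Tick Count readings around the enqueue); the iteration count and data are illustrative:

import queue
import time

q = queue.Queue()
samples = [0.0] * 15                 # one millisecond of data at 15 kHz

N = 10_000
t0 = time.perf_counter()
for _ in range(N):
    q.put(samples)                   # the only non-blocking work per iteration
elapsed_us = (time.perf_counter() - t0) / N * 1e6
print(f"average enqueue time: {elapsed_us:.2f} microseconds")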

 

I've got a Real-Time data acquisition/control routine that runs on just such a millisecond Clock.  I'm actually running on a PXI, and I "invert" the clock by having it run a hardware-timed loop at 1 msec, and when it "ticks", I do Analog and Digital samples, plus I "synthesize" a Clock from the Loop Index (checking to be sure I don't miss any clock ticks) that gets saved with the data stream.
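
Sketching just the bookkeeping part of that in Python: a loop/tick index becomes the synthesized clock, and a comparison catches any missed ticks.  The 1 ms pacing itself is assumed to come from the hardware-timed loop, and the function and names are hypothetical:

expected_tick = 0

def on_hardware_tick(hw_tick_count, sample):
    """Called once per 1 ms hardware tick with the device's own tick count."""
    global expected_tick
    if hw_tick_count != expected_tick:
        print(f"missed {hw_tick_count - expected_tick} tick(s)!")
        expected_tick = hw_tick_count
    clock_ms = expected_tick          # synthesized millisecond clock
    record = (clock_ms, sample)       # the clock gets saved with the data stream
    expected_tick += 1
    return record

# Example: ticks 0, 1, 2, then a jump to 5 simulates two missed ticks.
for t in (0, 1, 2, 5):
    print(on_hardware_tick(t, sample=0.0))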

 

Note that you might not want to run a disk-writing routine at 1 kHz (once a second makes more sense here), but there's nothing to stop you from cascading Producer/Consumer loops.  You can have your DAQ Producer sending at 1 kHz to an intermediate sample-processing loop that accumulates data-to-be-saved, then enqueues it to a Writing Consumer.  A bit more complicated, but then so is your task ...
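
A hedged sketch of that cascade, again with Python queues and threads standing in for the LabVIEW loops (block sizes, file name, and the 3-second run are illustrative assumptions):

import queue
import threading

raw_q = queue.Queue()                # DAQ Producer -> accumulator
file_q = queue.Queue()               # accumulator -> disk-writing Consumer

def accumulator():
    """Collect 1 ms blocks and forward them once per second."""
    batch = []
    while True:
        block = raw_q.get()
        if block is None:            # pass the Sentinel downstream and exit
            if batch:
                file_q.put(batch)
            file_q.put(None)
            break
        batch.append(block)
        if len(batch) == 1000:       # 1000 x 1 ms = 1 second of data
            file_q.put(batch)
            batch = []

def writer():
    with open("acquisition.txt", "w") as f:
        while True:
            batch = file_q.get()
            if batch is None:
                break
            f.write(f"wrote a batch of {len(batch)} blocks\n")   # placeholder

threads = [threading.Thread(target=accumulator), threading.Thread(target=writer)]
for t in threads:
    t.start()
for _ in range(3000):                # simulate 3 seconds of 1 ms blocks
    raw_q.put([0.0] * 15)
raw_q.put(None)
for t in threads:
    t.join()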

 

BS

Message 8 of 14
(5,644 Views)
I see. But then, in this same while loop, I will have to check on each tick whether it's time for digital output.

I guess I just thought it takes much more than 1 ms per iteration because I had an indicator showing the iteration count, and it seemed to take a long time for each iteration. Maybe the incrementing and displaying I did dramatically slowed it down, though... I remember that System.out.print in a while(true) loop dramatically slowed down a Java program, so this could be the same thing.

At any rate, I will give this a go. Thanks!
Message 9 of 14
(5,638 Views)

Hi,

 

I tried implementing the idea of using the DAQ read loop (and hence the PCIe device's timer) to time the other loops. I am running into the problem where, if I set the samples to acquire to 15 with a 15 kHz sampling rate (1 ms ticks), the scan backlog sometimes exceeds 15 (around 40 or 50). Doesn't this mean that for a few ticks the while loops were too slow? Also, with 1 ms ticks and the data logging placed in the consumer loop (using a queue structure), the updating of the waveform chart is way too slow.

 

So I then tried setting F to 150 (so 10 ms ticks now). For the most part, the scan backlog doesn't exceed 150, but there was indeed one instance where the backlog went up to about 200 when I kept the DAQ running and simultaneously opened up another LabVIEW VI.

 

Also, what are some good ways to check whether I am reaching the desired level of timing accuracy?

 

Thanks again!

Message 10 of 14
(5,591 Views)