06-23-2016 10:39 AM - edited 06-23-2016 10:41 AM
Dear all,
For a rewrite project my lab is working on, we are trying to acquire analog voltage inputs at 15 kHz. We would then like to save this data for a specified amount of time, e.g. 40.00 seconds, stop recording for a specified inter-trial interval, e.g. 20 seconds, and then start again. I have looked around the forum for ways to accomplish this, but most of the ideas discussed only give precision down to about 1 second. For our lab, it is crucial that we have timing precision down to 1 ms.
My initial idea was that since DAQmx Read uses hardware timing (I think?) to pace the 15 kHz acquisition, it may be a good idea to program the following logic: we will record for 40 seconds, so keep 15k * 40 samples, then rest for 20 seconds, so throw away the next 15k * 20 samples, and so on.
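In rough Python-style pseudocode (just to illustrate the sample-counting idea; read_chunk() and save_chunk() are placeholders for whatever DAQmx Read hands us each iteration and for our file-writing code), the logic would be:

    FS = 15000                       # sampling rate, 15 kHz
    RECORD_SAMPLES = FS * 40         # 40.00 s worth of samples to keep
    REST_SAMPLES = FS * 20           # 20.00 s worth of samples to discard

    def acquire(read_chunk, save_chunk):
        """Alternate between keeping RECORD_SAMPLES and dropping REST_SAMPLES,
        splitting a chunk at the boundary so the counts come out exact."""
        kept = dropped = 0
        recording = True
        while True:
            chunk = read_chunk()             # e.g. 1000 samples per iteration
            i = 0
            while i < len(chunk):
                if recording:
                    take = min(len(chunk) - i, RECORD_SAMPLES - kept)
                    save_chunk(chunk[i:i + take])
                    kept += take
                    if kept == RECORD_SAMPLES:
                        recording, kept = False, 0
                else:
                    take = min(len(chunk) - i, REST_SAMPLES - dropped)
                    dropped += take
                    if dropped == REST_SAMPLES:
                        recording, dropped = True, 0
                i += take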
The three problems I can think of off the top of my head are:
1) DAQmx Read doesn't pull data out of the buffer one sample at a time, so it may be difficult to always save the exact number of samples. For instance, if DAQmx removes data from the buffer 1000 samples per iteration, it may be hard to save exactly 212,415,500 samples.
2) I am not sure whether LabVIEW has a built-in save VI that lets me specify the number of samples to save. Besides, these "samples" are aggregated into a waveform data type (well, a 1D array of waveforms, since we are acquiring more than one channel at a time).
3) The program also needs to do timed digital output. If we save data using this "counting by the sample" mechanism, the DO timing may need its own separate logic(?).
Please tell me what you think of the idea. If it's no good, it would be great if someone could steer me towards a better approach.
Oh, and this is my first time coding in LabVIEW, so I am open to all kinds of style/programming suggestions and feedback.
As always, thanks for your help!
06-23-2016 10:40 AM
Sorry, there was one more subVI that I couldn't attach. Here it is:
06-23-2016 08:14 PM
You asked for comments.
Now for the question at hand. You may know that although LabVIEW has Timing Functions, and Time is something that LabVIEW treats directly, when running in Windows you cannot depend on LabVIEW's timing functions (themselves) for millisecond precision. However, dedicated acquisition hardware (including "PC-like" devices such as PXI controllers, cRIOs, etc. that run a Real-Time OS) can clock things to the millisecond. So you could, in principle, use your DAQ device (what is it, by the way?) as a "clock".
Suppose you set up a Producer/Consumer loop properly (with a Queue) and had the Producer "driven" by, say, a DAQ device producing 1000 samples at 1 kHz. It would be functioning like an accurate 1-second "clock". To take 40 seconds of data, wait 20 seconds, and take 40 more seconds of data, you could do the following:
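In rough pseudocode (Python-flavored, just to sketch the idea; the queue and the two routines stand in for the LabVIEW Producer/Consumer loops, daq_read() stands in for DAQmx Read, and stop_flag would be a threading.Event):

    import queue

    data_q = queue.Queue()

    def producer(daq_read, stop_flag):
        # A DAQmx Read of 1000 samples at 1 kHz blocks until the samples are
        # there, so each iteration is an accurate, hardware-timed 1 s "tick".
        while not stop_flag.is_set():
            data_q.put(daq_read(1000))

    def consumer(save_block, stop_flag):
        state, ticks = "ACQUIRE", 0
        while not stop_flag.is_set():
            block = data_q.get()             # one second of data per dequeue
            if state == "ACQUIRE":
                save_block(block)            # 40 ticks of this = 40 s saved
            ticks += 1
            if state == "ACQUIRE" and ticks == 40:
                state, ticks = "DISCARD", 0  # now throw data away for 20 s
            elif state == "DISCARD" and ticks == 20:
                state, ticks = "ACQUIRE", 0

(Stopping cleanly is discussed below.)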
It may be a good idea to think a bit more about this State Machine. The more I think about it (without benefit of paper and pencil, so I'm only using a fraction of my feeble brain), the more I see the possibility of more States, including "Wait for Producer", "Open New File", "Save To Disk", "Close File", "Discard Data".
Have you thought about how you stop the Producer and Consumer pair? I, personally, am fond of using a Sentinel, a recognizable "signal" that the Producer (who knows when it is time to stop "producing") sends to the Consumer. When the Producer sends the Sentinel, it can exit. Meanwhile, the Consumer just goes ahead "consuming" until it sees the Sentinel, at which time it exits and, since the Producer has already sent the last (Sentinel) item and has exited, it Releases the Queue, no longer needed. Both loops stop, no errors are generated.
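In the same pseudocode, the Sentinel hand-off is just:

    import queue

    data_q = queue.Queue()
    SENTINEL = None                          # any value the data can never be

    def producer(daq_read, n_ticks):
        for _ in range(n_ticks):             # the Producer knows when to stop
            data_q.put(daq_read(1000))
        data_q.put(SENTINEL)                 # send the last item, then exit

    def consumer(save_block):
        while True:
            block = data_q.get()
            if block is SENTINEL:            # Producer has already exited, so
                break                        # it is safe to release the queue
            save_block(block)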
Bob Schor
06-23-2016 08:39 PM
Hi,
Appreciate the detailed response and comments.
The DAQ device we are using is an NI PCIe-6535.
I chose to use notifiers because I specifically only wanted the latest element, namely the newest DISPLAY/SAVE setting.
I had to use the references because the Boolean button binding logic subVI works on the buttons and then alters their state. The logic is: if SAVE goes from OFF to ON while DISPLAY is OFF, DISPLAY is turned ON as well; if DISPLAY goes from ON to OFF while SAVE is ON, SAVE is turned OFF as well. Basically, there should never be a situation where SAVE is ON without DISPLAY being ON.
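Written out in Python just to restate the rule (the real subVI operates on the button references):

    def apply_binding(save_old, save_new, display_old, display_new):
        """Keep the invariant: SAVE should never be ON while DISPLAY is OFF."""
        if save_new and not save_old and not display_new:
            display_new = True    # SAVE went OFF->ON with DISPLAY OFF
        if display_old and not display_new and save_new:
            save_new = False      # DISPLAY went ON->OFF with SAVE ON
        return save_new, display_new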
As for the Variant thing, I based the Producer/Consumer off of an example, so I can definitely go in and fix the redundancy.
I really like the idea of using the PCIe-6535 to time the loops. Can you tell me which subVIs are involved in using the DAQ device as my timing source?
Thank you for mentioning the Sentinel, I was actually also worried about stopping the process properly before we restart another experiment.
06-23-2016 09:09 PM
To use the DAQ device as a Clock, proceed as follows:
I think that's right -- use Pencil and Paper and convince yourself.
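Sketched with NI's nidaqmx Python API purely for illustration (the real thing is, of course, DAQmx VIs on your block diagram, and the device/channel names here are placeholders):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    FS = 15000                               # the board's 15 kHz sample clock

    def run_for(n_seconds, handle_one_second):
        with nidaqmx.Task() as task:
            task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")   # placeholder
            task.timing.cfg_samp_clk_timing(
                rate=FS,
                sample_mode=AcquisitionType.CONTINUOUS,
                samps_per_chan=FS * 10)      # generous buffer: 10 s deep
            task.start()
            for _ in range(n_seconds):
                # Blocks until FS samples per channel are available, i.e. it
                # returns once per second, paced by the hardware sample clock.
                data = task.read(number_of_samples_per_channel=FS)
                handle_one_second(data)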
Bob Schor
06-23-2016 09:10 PM
I have to confess I don't understand, at all, what those buttons are doing, but if it makes sense to you, OK.
BS
06-23-2016 09:19 PM
Hi,
Thanks. My understanding is that by doing what you described, I would be able to time my operations (saving, reading, digital output) by the second, with great precision, better than a millisecond.
What if I wanted to manipulate the outputs to the millisecond, though? E.g. wait for 40.001 seconds. If my sampling rate has to be 15 kHz, doesn't that mean I need to set the samples to acquire to 15? But then I am counting on each iteration of the producer loop taking less than 1 ms, which seems like a big IF.
06-23-2016 09:41 PM
@RaymondLo wrote:
Thanks. My understanding is that by doing what you described, I would be able to time my operations (saving, reading, digital output) by the second, with great precision, better than a millisecond. What if I wanted to manipulate the outputs to the millisecond, though? E.g. wait for 40.001 seconds. If my sampling rate has to be 15 kHz, doesn't that mean I need to set the samples to acquire to 15? But then I am counting on each iteration of the producer loop taking less than 1 ms, which seems like a big IF.
Are you kidding me? The DAQ card is plugged into your PC, so it takes microseconds, at most, to transfer the 15 samples. Maybe it takes another microsecond (on a really, really slow PC) to enqueue them. Let's see: 1000 microseconds minus a few microseconds leaves about 99% of the time in the Producer Loop free, with nothing to do but loop around and wait for the next tick.
I've got a Real-Time data acquisition/control routine that runs on just such a millisecond Clock. I'm actually running on a PXI, and I "invert" the clock by having it run a hardware-timed loop at 1 msec, and when it "ticks", I do Analog and Digital samples, plus I "synthesize" a Clock from the Loop Index (checking to be sure I don't miss any clock ticks) that gets saved with the data stream.
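In pseudocode, the synthesized clock is nothing more than the following (Python here, with time.monotonic() only standing in for the hardware-timed loop's late-finish check):

    import time

    TICK = 0.001                             # 1 ms loop period

    def timed_loop(read_samples, write_record, n_ticks):
        t0 = time.monotonic()
        for i in range(n_ticks):
            samples = read_samples()         # AI + DIO for this tick
            timestamp = i * TICK             # clock synthesized from the index
            # Sanity check that no ticks were missed; on the PXI this is the
            # hardware-timed loop reporting that it finished late.
            if time.monotonic() - t0 > (i + 2) * TICK:
                print(f"warning: tick {i} ran late")
            write_record(timestamp, samples)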
Note that you might not want to run a disk-writing routine at 1 kHz (once a second makes more sense here), but there's nothing to stop you from cascading Producer/Consumer loops. You can have your DAQ Producer sending at 1 kHz to an intermediate sample-processing loop that accumulates the data-to-be-saved, then enqueues it to a Writing Consumer. A bit more complicated, but then so is your task ...
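In pseudocode, the cascade is just two queues (again, Python standing in for the LabVIEW loops):

    import queue

    raw_q = queue.Queue()        # 1 kHz: small blocks from the DAQ Producer
    file_q = queue.Queue()       # ~1 Hz: big chunks for the disk writer

    def accumulator(blocks_per_chunk=1000):
        chunk = []
        while True:
            block = raw_q.get()
            if block is None:                # pass the Sentinel downstream
                file_q.put(None)
                break
            chunk.append(block)
            if len(chunk) == blocks_per_chunk:
                file_q.put(chunk)            # hand off ~1 s of data at a time
                chunk = []

    def writer(write_to_disk):
        while True:
            chunk = file_q.get()
            if chunk is None:
                break
            write_to_disk(chunk)             # slow file I/O only happens here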
BS
06-23-2016 09:58 PM
06-24-2016 11:35 AM
Hi,
I tried implementing the idea of using the DAQ read loop (and hence the PCIe device's timer) to time the other loops. I am running into a problem: if I set the samples to acquire to 15 with a 15 kHz sampling rate (1 ms ticks), the scan backlog sometimes exceeds 15 (around 40 or 50). Doesn't this mean that for a few ticks the while loops were too slow? Also, with 1 ms ticks and the data logging placed in the consumer loop (using a queue structure), the updating of the waveform chart is way too slow.
So I then tried setting F to 150 (10 ms ticks now). For the most part the scan backlog doesn't exceed 150, but there was one instance where the backlog went up to about 200, when I kept the DAQ running and simultaneously opened another LabVIEW VI.
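For reference, the check I'm doing is roughly this (written as Python/nidaqmx pseudocode for the LabVIEW read loop; avail_samp_per_chan is my stand-in for the Scan Backlog indicator):

    import nidaqmx

    def read_and_watch_backlog(task: nidaqmx.Task, n_reads, samples_per_read=150):
        """Read 10 ms chunks at 15 kHz and count iterations that fell behind."""
        slow = 0
        for _ in range(n_reads):
            task.read(number_of_samples_per_channel=samples_per_read)
            backlog = task.in_stream.avail_samp_per_chan
            if backlog > samples_per_read:
                slow += 1    # this (or an earlier) iteration overran its tick
        return slow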
Also, what are some good ways to check whether I am reaching the desired level of timing accuracy?
Thanks again!