04-24-2016 08:25 PM
I'm running LabVIEW 2013 on a Windows 7 PC, and I'm using a PCI-6251 DAQ board with a BNC-2110.
For my application, I need to generate TTL triggers (for example, with a frequency of 1 Hz). At the same time, I need to run an AI data acquisition that is triggered by the same TTL signal.
So far, I have managed to set up the TTL square wave signal correctly with a counter output task - I can see the triggers on a scope. I have also created an AI input task, and included a digital trigger. It's partly working, and I have a few questions:
Thank you very much for your help.
Solved! Go to Solution.
04-25-2016 08:27 AM
Can you describe the A/D task and how it relates to the TTL trigger? In particular, are you trying to collect analog samples continuously or in separated bursts? If your sampling rate were, say, 1 kHz, would you want to take 1000 samples every second, or only 100? In the former case, you would be sampling continuously, since when you took Sample 1000, the next trigger would arrive and you'd take Sample 1 of the second set, whereas the second example has 0.1 seconds of sampling followed by 0.9 "idle" seconds.
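As a back-of-envelope check of that arithmetic, here's a sketch (mine, not from this thread; the function name and numbers are just illustrative) in Python:

```python
# Continuous vs. burst sampling with a 1 Hz trigger: how much of each
# trigger period is spent acquiring vs. idle.

def coverage(sample_rate_hz, samples_per_trigger, trigger_period_s=1.0):
    """Return (active_seconds, idle_seconds) per trigger period."""
    active = samples_per_trigger / sample_rate_hz
    return active, trigger_period_s - active

# 1000 samples at 1 kHz fill the whole 1 s period: continuous sampling.
print(coverage(1000, 1000))   # (1.0, 0.0)

# 100 samples at 1 kHz: 0.1 s of sampling, then 0.9 s idle until the next trigger.
print(coverage(1000, 100))    # (0.1, 0.9)
```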
If, in fact, your goal is continuous sampling synchronized to your 1 Hz TTL triggers, I would strongly urge you to "invert the logic". You don't say how you are generating the 1 Hz TTL triggers, but if you are not using a Hardware Clock, and are depending on LabVIEW's Timing VIs to generate the pulses, the clock on the 6251 is much more accurate. If you set it to generate 1000 points at 1 kHz, you can expect it to send you the points at a more precise interval than LabVIEW's clock can provide.
[Hmm -- that's a pretty strong claim. I always advise people to "play Scientist, and Do the Experiment", but realize I haven't done this, myself. I'm going to post this, now, then take my own advice, and report back with my results ...].
Bob Schor
04-25-2016 09:14 AM
Well, that's embarrassing. I just ran the following code, with a USB-6009 set for 1000 samples at 1 kHz as my DAQ device, using LabVIEW's High Resolution Timer (which reads, I believe, the CPU's system clock) for timing info. I expected the second pass, which used LabVIEW's Wait function running off Windows' millisecond timer (and which has to contend with whatever Windows might be doing to "steal cycles"), to do less well. Here's the code:
and here are the results (expressed as mean ± S.D.): USB-6009, 0.999916 ± 0.000887; Windows/LabVIEW, 0.999940 ± 0.000240. To my great surprise, (with a single test), LabVIEW's clock "won" (was closer to 1.00000 and had a smaller S.D.).
Well, what do you do if the Experiment Proves You Wrong? Why, you do it again. Just a second (or rather, just 200 seconds). Nope, same general finding, 0.999921 ± 0.000818 (USB) vs 0.999948 ± 0.000243. I'll note that my PC was relatively "idle" during this time, so the CPU wasn't doing too much multitasking (as might be the case if this was embedded in a big LabVIEW program), but still, I did expect Hardware to win ...
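For readers without LabVIEW handy, a rough Python analog of the same measurement (my sketch, not Bob's VI; the 10 ms wait and pass count are arbitrary stand-ins for the 1 s / 100-pass test described above) is to time a software wait with the high-resolution performance counter and report mean ± S.D. of the measured intervals:

```python
# Time a software wait with the high-resolution perf counter and report
# mean and S.D. of the measured intervals. (10 ms x 20 passes here,
# rather than 1 s x 100, just to keep the run short.)
import statistics
import time

def measure_wait(passes=20, wait_s=0.010):
    intervals = []
    for _ in range(passes):
        t0 = time.perf_counter()
        time.sleep(wait_s)                     # software-timed wait (OS timer)
        intervals.append(time.perf_counter() - t0)
    return statistics.mean(intervals), statistics.stdev(intervals)

mean, sd = measure_wait()
print(f"{mean:.6f} ± {sd:.6f} s")              # cf. 0.999940 ± 0.000240 above
```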
Bob "Often Wrong, but Never in Doubt" Schor
04-25-2016 09:56 AM
Ok, I think I've got it (partly) figured out. I'm attaching a screenshot of a VI that does what I need.
The alternation between rising and falling edges doesn't happen anymore - the waveform on the graph (I wired the TTL output to the AI input for testing purposes) no longer flips sign.
The only thing I don't like right now: I'm specifying ctr0 for the TTL output and PFI12 for the trigger input separately, even though they are the same terminal on the BNC-2110. This is kind of error-prone; it would be good to tie both to the same constant/control, but I haven't figured that out yet.
In terms of timing, I've looked at the periodicity of the TTL signal on a scope; it doesn't jitter and seems to be good enough for my purposes.
Thanks a lot for your reply!
04-25-2016 03:44 PM
hmalissa:
You can query programmatically for the pulse terminal of a counter output task using a DAQmx Channel property node. Here's a snippet. Just save the image and drag the file into a LabVIEW block diagram and it turns into code. You don't have to use this as a subVI; the controls & indicators are just there to identify which task is which.
Bob: interesting experiment. But to keep future readers from drawing the wrong conclusion, I'd just emphasize that a hardware-timed task still *does* produce much more repeatable sample timing than a software-timed task in a software-timed loop. Bob's example isn't addressing the regularity of the individual sample timing intervals, just comparing whether the DAQmx driver or the Windows msec timer is more responsive (and repeatable) at marking the end of a 1000 msec interval.
-Kevin P
04-25-2016 04:23 PM
Kevin,
Thanks for the comment. I am a fan of "testing my assumptions", but might not always draw the best conclusions. I, too, would always favor hardware timing -- even though the software timer "seemed better" in this situation, when the machine is busy doing lots of things I wouldn't trust it as much as the little crystal ticking away in the DAQ device, with nothing to do but the one task of "Give him the data NOW".
Bob Schor
04-26-2016 12:17 PM
Yeah, the only little nitpick I'd reemphasize, mainly for future readers, is that the DAQmx Read loop *also* has an element of software timing in it. The samples themselves are hardware timed, accurate and precise to within the specs of the board's oscillator. But the software call to DAQmx Read requesting 1000 samples has some non-hardware aspects. DAQmx Read will wait until it becomes aware that 1000 samples are available before returning. I don't know the exact mechanism for that awareness, but there'll be some element of software execution going on.
Put another way, the slight timing variability across the 100 iterations of the For Loops only directly addresses how much timing variability there is in running the code contained inside each loop. In the DAQmx Read loop, most of the timing is governed by the hardware sample clock and is extremely consistent, but there remains a small and variable contribution from software processing.
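A toy illustration of that decomposition (mine, not DAQmx; the latency bound is made up): model a blocking read whose duration is a fixed hardware acquisition time plus a small, variable software-latency term. The loop-to-loop variability then comes entirely from the software term, even though the samples themselves are perfectly regular.

```python
# Toy model: blocking read = fixed hardware time + variable software latency.
import random

HW_TIME_S = 1.0            # 1000 samples at 1 kHz, clocked in hardware
MAX_SW_LATENCY_S = 0.002   # made-up bound on driver/OS response time

def blocking_read():
    # Samples are acquired for exactly HW_TIME_S by the hardware clock;
    # the software then notices they're available after some small delay.
    return HW_TIME_S + random.uniform(0.0, MAX_SW_LATENCY_S)

durations = [blocking_read() for _ in range(100)]
print(min(durations), max(durations))   # all within [1.0, 1.002]
```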
All that being said, none of the measured timing variability means that the DAQmx sample timing itself is variable. The sample intervals are likely gonna be consistent to within a few nanoseconds, having absolute accuracy within a few dozen. (Typical specs are 50 ppm accuracy, with most temporal drift caused by temperature change).
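To put that 50 ppm spec in concrete terms (my arithmetic, using the 1 kHz rate from earlier in the thread):

```python
# Worst-case timebase error at 50 ppm, per sample interval and per second.
PPM = 50
SAMPLE_RATE_HZ = 1000.0
interval_s = 1.0 / SAMPLE_RATE_HZ

err_per_interval_s = interval_s * PPM * 1e-6   # about 50 ns per 1 ms interval
err_per_second_s = 1.0 * PPM * 1e-6            # about 50 us of drift per second
print(err_per_interval_s, err_per_second_s)
```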
-Kevin Price
04-26-2016 07:53 PM
Great, thanks, that does the trick!