DAQmx generating TTL triggers and triggered AI read


I'm running LabVIEW 2013 on a Windows 7 PC, and I'm using a PCI-6251 DAQ board with a BNC-2110.

For my application, I need to generate TTL triggers (for example, with a frequency of 1Hz). At the same time, I need to run an AI data acquisition that is triggered by the same TTL signal.

So far, I have managed to set up the TTL square wave signal correctly with a counter output task - I can see the triggers on a scope. I have also created an AI input task and included a digital trigger. It's partly working, and I have a few questions (a rough sketch of the setup follows the list):

  • The TTL trigger physical channel is set to ctr0. This seems to be associated with PFI12. I'd rather specify the terminals directly in my program - is that possible? ctr0 is not directly labeled on the BNC-2110.
  • The AI task is triggered from PFI0. I'm connecting a cable from PFI12 to PFI0. Is that really necessary? Can the AI task be triggered internally from the same counter? My external hardware needs to be triggered with the same signal as the acquisition. So far, my solution seems to be the only way I can get it to work.
  • The digital trigger for the AI task is configured for a 'rising edge' trigger. However, when I run the AI task continuously in a loop, it seems to be triggered alternately on rising and falling edges. I have verified this by connecting the counter output on PFI12 directly to the input channel for the AI task, and I can observe that the square wave signal periodically changes sign. Why is that? This is a problem for my application - I need to always trigger from the same edge.
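
For readers trying to follow along outside LabVIEW, my current setup amounts to roughly the following sketch in the nidaqmx Python API (the device name "Dev1" and the channel/rate values are placeholders rather than my actual configuration; my real program is a LabVIEW VI):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

# Counter output task: continuous 1 Hz TTL pulse train on ctr0
# (on the PCI-6251 this appears on the PFI12 terminal).
co_task = nidaqmx.Task()
co_task.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=1.0, duty_cycle=0.5)
co_task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)

# AI task: hardware-clocked acquisition started by a digital edge on PFI0.
# Right now PFI12 is cabled externally to PFI0 on the BNC-2110.
ai_task = nidaqmx.Task()
ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
ai_task.timing.cfg_samp_clk_timing(rate=1000.0,
                                   sample_mode=AcquisitionType.FINITE,
                                   samps_per_chan=100)
ai_task.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/PFI0",
                                                       trigger_edge=Edge.RISING)

ai_task.start()   # arm the AI task so it waits for the trigger
co_task.start()   # start generating the TTL pulse train
data = ai_task.read(number_of_samples_per_channel=100, timeout=10.0)

ai_task.close()
co_task.close()
```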

Thank you very much for your help.

Message 1 of 8

Can you describe the A/D task and how it relates to the TTL trigger?  In particular, are you trying to collect analog samples continuously or in separated bursts?  If your sampling rate were, say, 1 kHz, would you want to take 1000 samples every second or only 100?  In the former case you would be sampling continuously, since just as you took sample 1000 the next trigger would arrive and you'd take sample 1 of the second set, whereas the latter case has 0.1 seconds of sampling followed by 0.9 "idle" seconds.

 

If, in fact, your goal is continuous sampling synchronized to your 1 Hz TTL triggers, I would strongly urge you to "invert the logic".  You don't say how you are generating the 1 Hz TTL triggers, but if you are not using a hardware clock and are depending on LabVIEW's Timing VIs to generate the pulses, the clock on the 6251 is much more accurate than that software timing.  If you set it to acquire 1000 points at 1 kHz, you can expect it to deliver the points at a more precise interval than LabVIEW's clock can provide.
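
As a rough illustration of what I mean by letting the board's clock do the work (sketched here in the nidaqmx Python API rather than LabVIEW; the device name "Dev1" and the channel are assumptions):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    # The board's own sample clock paces the acquisition: 1000 S/s, continuous.
    ai_task.timing.cfg_samp_clk_timing(rate=1000.0,
                                       sample_mode=AcquisitionType.CONTINUOUS)
    ai_task.start()
    for _ in range(10):
        # Each read returns once 1000 hardware-clocked samples (one second of
        # data) are available, so the loop cycles at 1 Hz without any software
        # timer in the picture.
        block = ai_task.read(number_of_samples_per_channel=1000, timeout=5.0)
```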

 

[Hmm -- that's a pretty strong claim.  I always advise people to "play Scientist, and Do the Experiment", but realize I haven't done this, myself.  I'm going to post this, now, then take my own advice, and report back with my results ...].

 

Bob Schor

Message 2 of 8

Well, that's embarrassing.  I just ran the following code, with a USB-6009 set for 1000 samples at 1 kHz as my DAQ device, using LabVIEW's High Resolution Timer (which reads, I believe, the CPU's system clock) for timing info.  I expected the second pass, which used LabVIEW's Wait function running off Windows' millisecond timer (and so has to contend with whatever Windows might be doing to "steal cycles"), to do less well.  Here is the code:

Timing Test.png

and here are the results (expressed as mean ± S.D., in seconds):  USB-6009, 0.999916 ± 0.000887; Windows/LabVIEW, 0.999940 ± 0.000240.  To my great surprise (with a single test), LabVIEW's clock "won" (was closer to 1.000000 and had a smaller S.D.).

 

Well, what do you do if the Experiment Proves You Wrong?  Why, you do it again.  Just a second (or rather, just 200 seconds).  Nope, same general finding, 0.999921 ± 0.000818 (USB) vs 0.999948 ± 0.000243.  I'll note that my PC was relatively "idle" during this time, so the CPU wasn't doing too much multitasking (as might be the case if this was embedded in a big LabVIEW program), but still, I did expect Hardware to win ...
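
For anyone who can't open the VI snapshot above, the experiment amounts to something like this sketch (in the nidaqmx Python API rather than LabVIEW; the device name "Dev1" and the helper function are my own illustration, not the exact VI):

```python
import time
import statistics
import nidaqmx
from nidaqmx.constants import AcquisitionType

def timed_intervals(step, n=100):
    """Run step() n times, measuring each interval with the high-resolution timer."""
    intervals, prev = [], time.perf_counter()
    for _ in range(n):
        step()
        now = time.perf_counter()
        intervals.append(now - prev)
        prev = now
    return statistics.mean(intervals), statistics.stdev(intervals)

# Pass 1: interval between successive 1000-sample reads at 1 kHz (hardware clock).
with nidaqmx.Task() as ai:
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai.timing.cfg_samp_clk_timing(rate=1000.0,
                                  sample_mode=AcquisitionType.CONTINUOUS)
    ai.start()
    hw = timed_intervals(lambda: ai.read(number_of_samples_per_channel=1000,
                                         timeout=5.0))

# Pass 2: interval produced by a plain 1-second software wait.
sw = timed_intervals(lambda: time.sleep(1.0))

print("DAQ read interval:  %.6f +/- %.6f s" % hw)
print("Software wait:      %.6f +/- %.6f s" % sw)
```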

 

Bob "Often Wrong, but Never in Doubt" Schor

Message 3 of 8

Ok, I think I've got it (partly) figured out. I'm attaching a screenshot of a VI that does what I need.

The alternating triggering on rising and falling edges does not happen anymore - the waveform on the graph (I wired the TTL output to the AI for testing purposes) does not flip sign.

The only thing I don't like right now: I'm separately specifying ctr0 for the TTL output and PFI12 for the trigger input, even though they correspond to the same terminal on the BNC-2110. This is somewhat error-prone; it would be good to tie both to the same constant/control, but I haven't figured that out yet.

In terms of timing, I've looked at the periodicity of the TTL signal on a scope; it doesn't jitter and seems good enough for my purposes.

Thanks a lot for your reply!

Message 4 of 8
Solution
Accepted by topic author hmalissa

hmalissa:

  You can query programmatically for the pulse terminal of a counter output task using a DAQmx Channel property node.  Here's a snippet.  Just save the image & drag the file into a LabVIEW block diagram and it turns into code.  You don't have to use this as a subVI; the controls & indicators are just there to identify which task is which.

 

ctr output as trigger.png
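
For completeness, here's roughly the same idea outside LabVIEW, in the nidaqmx Python API (the device name "Dev1", channels, and rates are placeholders, and co_pulse_term is, as far as I know, the Python counterpart of that DAQmx Channel property):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

co_task = nidaqmx.Task()
co_chan = co_task.co_channels.add_co_pulse_chan_freq("Dev1/ctr0",
                                                     freq=1.0, duty_cycle=0.5)
co_task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)

# Ask the driver which terminal the counter pulses actually come out on
# (for ctr0 on a 6251 this is normally /Dev1/PFI12).
pulse_term = co_chan.co_pulse_term

ai_task = nidaqmx.Task()
ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
ai_task.timing.cfg_samp_clk_timing(rate=1000.0,
                                   sample_mode=AcquisitionType.FINITE,
                                   samps_per_chan=100)
# Use the queried terminal directly as the trigger source, so the counter
# output and the trigger can never be specified inconsistently.
ai_task.triggers.start_trigger.cfg_dig_edge_start_trig(pulse_term,
                                                       trigger_edge=Edge.RISING)
```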

 

 

Bob: interesting experiment.  But to keep future readers from drawing the wrong conclusion, I'd just emphasize that a hardware-timed task still *does* produce much more repeatable sample timing than a software-timed task in a software-timed loop.  Bob's example isn't addressing the regularity of the individual sample timing intervals, just comparing whether the DAQmx driver or the Windows msec timer is more responsive (and repeatable) at marking the end of a 1000 msec interval.

 

 

-Kevin P

Message 5 of 8

Kevin,

 

     Thanks for the comment.  I am a fan of "testing my assumptions", but might not always draw the best conclusions.  I, too, would always favor hardware timing: even though the software timer "seemed better" in this situation, when the machine is busy doing lots of things I wouldn't want to trust it as much as the little crystal ticking in the DAQ device with nothing to do but the one task of "Give him the data NOW".

 

Bob Schor

Message 6 of 8

 

Yeah, the only little nitpick I'd reemphasize, mainly for future readers, is that the DAQmx Read loop *also* has an element of software timing in it.  The samples themselves are hardware timed, accurate and precise to within the specs of the board's oscillator.  But the software call to DAQmx Read requesting 1000 samples has some non-hardware aspects.  DAQmx Read will wait until it becomes aware that 1000 samples are available before returning.  I don't know the exact mechanism for that awareness, but there'll be some element of software execution going on.

 

Put another way, the slight timing variability across the 100 iterations of the For Loops only directly reflects how long it takes to run the code contained inside each loop.  In the DAQmx Read loop, most of the timing is governed by the hardware sample clock and is extremely consistent, but there remains a small and variable contribution from software processing.
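
As a rough illustration of where that software element lives (a nidaqmx Python sketch; the device name "Dev1" is assumed, and avail_samp_per_chan is, to my knowledge, the property that exposes how many samples the driver has available):

```python
import time
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai_task.timing.cfg_samp_clk_timing(rate=1000.0,
                                       sample_mode=AcquisitionType.CONTINUOUS)
    ai_task.start()
    # The board's clock fills the buffer on its own; the software side just
    # waits until enough samples have arrived, and that waiting/polling is
    # where the extra, non-hardware timing variability comes from.
    while ai_task.in_stream.avail_samp_per_chan < 1000:
        time.sleep(0.01)
    data = ai_task.read(number_of_samples_per_channel=1000)
```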

 

All that being said, none of the measured timing variability means that the DAQmx sample timing itself is variable.  The sample intervals are likely going to be consistent to within a few nanoseconds, with absolute accuracy within a few dozen nanoseconds.  (Typical specs are 50 ppm timing accuracy, i.e. about 50 ns on a 1 ms sample interval, with most temporal drift caused by temperature change.)

 

 

-Kevin Price

Message 7 of 8

Great, thanks, that does the trick!

Message 8 of 8