07-21-2021 12:21 AM - edited 07-21-2021 12:25 AM
Dear forum users,
I want to do DAQ with a TTL trigger using a PCIe-6351 card and a BNC-2110 front end. The idea is to use a 5 kHz TTL trigger source (connected to APFI0) and collect 100 data points on a single analog-input channel at a 1 MHz sampling rate. Each collection window is therefore about 0.1 ms, and the DAQ rate should be set mainly by the 5 kHz trigger, i.e. 0.2 ms per cycle. However, when I run the VI (see attachments), the while-loop time is always > 250 ms. Is there something wrong in my VI that makes triggering so slow? Any suggestions would be much appreciated, thanks!
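For reference, the timing budget implied by these numbers can be checked with a quick back-of-the-envelope calculation (plain Python, no DAQ hardware or DAQmx calls involved; the variable names are just illustrative):

```python
# Timing budget for the described acquisition.
samples_per_window = 100
sample_rate_hz = 1_000_000   # 1 MHz AI sample clock
trigger_rate_hz = 5_000      # 5 kHz TTL trigger source

window_s = samples_per_window / sample_rate_hz   # 0.1 ms acquisition window
trigger_period_s = 1 / trigger_rate_hz           # 0.2 ms between triggers

print(window_s * 1e3)         # ms per acquisition window
print(trigger_period_s * 1e3) # ms per trigger cycle
```

So each trigger cycle leaves roughly 0.1 ms of idle time after the window completes, and a loop time of >250 ms is more than three orders of magnitude slower than the trigger period.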
07-21-2021 12:23 AM
The LabVIEW version is 19.0f2.
07-21-2021 01:16 AM
Hi haominw,
@haominw wrote:
The idea is to use a 5 kHz TTL trigger source (connected to the APFI0)
Why do you use an analog input trigger with a "digital" TTL signal? Why not use a digital input trigger?
@haominw wrote:
However, when I ran the vi (see attachments), the time for the while loop is always > 250 ms. Is there something wrong in my vi that makes trigger so slow?
Starting the task also takes some time…
07-21-2021 10:41 AM - edited 07-21-2021 10:42 AM
I'm pretty sure you're going to find that you need to do this indirectly. Although you *could* configure the finite AI task to be retriggerable, you'd also need to retrieve all your samples from the task buffer in the ~100 microsec after one finite acquisition completes and before the next triggering signal. I wouldn't count on that working out reliably.
A different approach is to configure a counter task to generate a retriggerable finite pulse train that acts as the sample clock for the AI task. The counter can be set to generate 100 pulses at 1 MHz in reaction to each trigger and the AI task can use counter output as its sample clock. (The AI task should be started *before* starting the counter task.)
You could then further declare the AI task to be continuous so you'll be free to retrieve the data at a more leisurely pace. Maybe retrieve 50k samples at a time so your Read loop can iterate 10 times a second? Then you can rearrange them into 500 separate subsets of 100 each if needed.
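The buffering arithmetic behind this suggestion can be sketched in plain Python (numbers taken from the thread; the list-slicing reshape stands in for whatever rearrangement you do in LabVIEW, and no DAQmx calls are involved):

```python
# Buffering math for the counter-clocked, continuous-AI approach.
trigger_rate_hz = 5_000      # retriggerable counter fires 5000 times per second
samples_per_trigger = 100    # 100-pulse finite train per trigger

# The AI task sees a steady effective sample rate:
effective_rate = trigger_rate_hz * samples_per_trigger   # samples per second

read_size = 50_000           # samples fetched per Read call
reads_per_second = effective_rate / read_size            # Read-loop iteration rate

# Rearranging one read into per-trigger windows (synthetic data):
raw = list(range(read_size))
windows = [raw[i:i + samples_per_trigger]
           for i in range(0, read_size, samples_per_trigger)]
print(len(windows), len(windows[0]))
```

With these numbers the Read loop runs about 10 times a second and each read splits cleanly into 500 windows of 100 samples, which matches the "500 separate subsets of 100" above.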
-Kevin P
07-28-2021 06:14 PM
Hi GerdW,
I actually used an analog input trigger; the signal is not strictly TTL. I know starting the task takes some time, but it shouldn't take that long (~250 ms). I also tried self-triggering the DAQ, and that works properly with a loop time of only 1–2 ms. So I think the delay is mainly caused by the trigger function.
Thank you for the help!
07-28-2021 06:16 PM
Hello Kevin,
Thanks for your suggestions! I tried something similar and it finally works.