LabVIEW


DAQ with trigger is very slow with PCIe-6351

Solved!

Dear forum users,

 

I want to do DAQ with a TTL trigger using the PCIe-6351 card and a BNC-2110 connector block. The idea is to use a 5 kHz TTL trigger source (connected to APFI0) and collect 100 single-channel data points (analog input) at a 1 MHz sampling rate per trigger. Each collection window is therefore about 0.1 ms, and the DAQ speed should be determined mainly by the 5 kHz trigger, i.e. 0.2 ms per cycle. However, when I run the VI (see attachments), the while-loop iteration time is always > 250 ms. Is there something wrong in my VI that makes the trigger so slow? Any suggestions would be much appreciated, thanks!
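
For reference, here is roughly what the VI does, written out as a sketch in the nidaqmx Python API (the VI itself is graphical). The device name Dev1, channel ai0, and the 2.5 V threshold are placeholders, not my actual settings:

import nidaqmx
from nidaqmx.constants import AcquisitionType, Slope

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    # 100 samples at 1 MHz -> 0.1 ms acquisition window per trigger
    task.timing.cfg_samp_clk_timing(
        rate=1e6,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=100,
    )
    # Analog rising-edge start trigger on APFI0 (the 5 kHz TTL source)
    task.triggers.start_trigger.cfg_anlg_edge_start_trig(
        "/Dev1/APFI0", trigger_slope=Slope.RISING, trigger_level=2.5
    )
    for _ in range(1000):
        # Each read() auto-starts the finite task, waits for a trigger,
        # and stops it again -- so every iteration pays the software
        # start/rearm overhead instead of running at the trigger rate
        data = task.read(number_of_samples_per_channel=100)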

 

 

Message 1 of 6

The LabVIEW version is 19.0f2.

Message 2 of 6

Hi haominw,

 


@haominw wrote:

The idea is to use a 5 kHz TTL trigger source (connected to APFI0)


Why do you use an analog input trigger with a "digital" TTL signal? Why not use a digital input trigger?
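
In DAQmx terms that would mean routing the TTL line to a PFI terminal on the BNC-2110 and replacing the analog trigger call. Continuing the Python sketch from the first post ("/Dev1/PFI0" is an assumed routing):

from nidaqmx.constants import Edge

# Digital edge start trigger on a PFI terminal instead of the
# analog edge trigger on APFI0
task.triggers.start_trigger.cfg_dig_edge_start_trig(
    "/Dev1/PFI0", trigger_edge=Edge.RISING
)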

 


@haominw wrote:

However, when I run the VI (see attachments), the while-loop iteration time is always > 250 ms. Is there something wrong in my VI that makes the trigger so slow?


Starting the task also takes some time…

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 3 of 6
Solution
Accepted by topic author haominw

I'm pretty sure you're going to find that you need to do this indirectly.  Although you *could* configure the finite AI task to be retriggerable, you'd also need to retrieve all your samples from the task buffer in the ~100 microsec after one finite acquisition completes and before the next triggering signal.  I wouldn't count on that working out reliably.
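
In nidaqmx Python terms (assuming an AI task configured as in the first post, with PFI0 as the trigger terminal), that direct route would look roughly like this:

# Direct route (not recommended): make the finite AI task itself
# retriggerable. The hardware re-arms automatically, but the host
# must drain the 100-sample buffer in the ~100 microsec between the
# end of one acquisition and the next 5 kHz trigger.
ai_task.timing.cfg_samp_clk_timing(
    rate=1e6, sample_mode=AcquisitionType.FINITE, samps_per_chan=100
)
ai_task.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/PFI0")
ai_task.triggers.start_trigger.retriggerable = True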

 

A different approach is to configure a counter task to generate a retriggerable finite pulse train that acts as the sample clock for the AI task.  The counter can be set to generate 100 pulses at 1 MHz in reaction to each trigger, and the AI task can use the counter's output as its sample clock.  (The AI task should be started *before* the counter task, so no clock edges are missed.)
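
A sketch of that counter task, again in nidaqmx Python (ctr0 and PFI0 are assumed resources):

import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

# Counter task: retriggerable finite pulse train, 100 pulses at
# 1 MHz per trigger, to be used as the AI sample clock
co_task = nidaqmx.Task()
co_task.co_channels.add_co_pulse_chan_freq(
    "Dev1/ctr0", freq=1e6, duty_cycle=0.5
)
co_task.timing.cfg_implicit_timing(
    sample_mode=AcquisitionType.FINITE, samps_per_chan=100
)
co_task.triggers.start_trigger.cfg_dig_edge_start_trig(
    "/Dev1/PFI0", trigger_edge=Edge.RISING
)
co_task.triggers.start_trigger.retriggerable = True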

 

You could then further declare the AI task to be continuous so you'll be free to retrieve the data at a more leisurely pace.  Maybe retrieve 50k samples at a time so your Read loop can iterate 10 times a second?  Then you can rearrange them into 500 separate subsets of 100 each if needed.
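
And the matching continuous AI task plus Read loop, as a sketch (Dev1/ai0 assumed; numpy is used only for the regrouping):

import numpy as np

# Continuous AI task clocked by the counter's internal output: the
# clock only ticks while the counter is pulsing, so samples arrive
# in bursts of 100 per trigger
ai_task = nidaqmx.Task()
ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
ai_task.timing.cfg_samp_clk_timing(
    rate=1e6,
    source="/Dev1/Ctr0InternalOutput",
    sample_mode=AcquisitionType.CONTINUOUS,
    samps_per_chan=1_000_000,  # host buffer size, ~2 s of data
)

ai_task.start()  # start AI *before* the counter so no edges are missed
co_task.start()

for _ in range(10):
    # 5 kHz triggers x 100 samples = 500 kS/s, so 50k samples take
    # ~0.1 s; regroup each block into 500 records of 100 samples
    block = ai_task.read(number_of_samples_per_channel=50000)
    records = np.asarray(block).reshape(500, 100)

co_task.stop(); ai_task.stop()
co_task.close(); ai_task.close()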

 

 

-Kevin P

Message 4 of 6

Hi GerdW,

 

I actually used an analog input trigger because the signal is not strictly TTL. I know starting the task takes some time, but it shouldn't take that long (~250 ms). I also tried self-triggering the DAQ, and that works properly with a loop time of only 1-2 ms. So I think the delay is mainly caused by the trigger function.

 

Thank you for the help!

Message 5 of 6

Hello Kevin,

 

Thanks for your suggestions! I tried something similar and it finally works.

Message 6 of 6