
LabVIEW


Sync DAQmx AO with AI

Solved!

Hello Community,

 

I'm generating a control signal and I would like to write this signal to an AO while simultaneously reading the response on an AI. All the AI and AO channels are on the same PXI board. One of the AI channels is a loopback from the control signal, so I can verify that everything is properly written.

 

If my AIs and the AO were truly in sync, I would expect the loopback signal within my acquired data to always look (very, very nearly) the same. Instead, the AO slightly shifts its position, by as much as about 1-2 ms, so I guess the AIs and the AO are not properly synchronized.

 

I'm pretty sure I have just messed up something in my code. Would you mind checking it and letting me know what's wrong?

Thanks.

 

 

 

 

daq.png

Message 1 of 5
Solution
Accepted by 1984

Yours is a common case where only two things are needed to guarantee hardware sync between AO and AI.

 

1.  Designate one task as the "master" for timing purposes.  Configure DAQmx Timing for the "slave" task to use the master task's sample clock.

 

2. Enforce the run-time sequencing of the calls to DAQmx Start to make sure the master starts *last*.

 

You already have sequencing in place that makes AI start last.  So let's call AI the master.  All that's left is to wire a different terminal name into the 'source' input of the AO task's call to DAQmx Timing.  It'll look something like "/Dev1/ai/SampleClock".
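For readers without the block diagram in front of them, here's a minimal sketch of that two-step recipe using the nidaqmx Python API instead of LabVIEW (the device name "Dev1", channel numbers, rate, and sample count are all assumptions; adjust them to your PXI board):

```python
# Sketch only: requires NI-DAQmx hardware/driver, so it won't run standalone.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 10_000       # samples/s -- assumed
N_SAMPLES = 1_000   # assumed

with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    # Step 1a: AI is the "master" -- it owns the sample clock.
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")
    ai.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=N_SAMPLES)

    # Step 1b: AO is the "slave" -- it borrows the AI task's sample clock.
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(RATE, source="/Dev1/ai/SampleClock",
                                  sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=N_SAMPLES)
    ao.write(np.sin(np.linspace(0, 2 * np.pi, N_SAMPLES)), auto_start=False)

    # Step 2: slave starts first (armed, waiting for clock edges);
    # master starts last, so its clock drives both tasks together.
    ao.start()
    ai.start()
    data = ai.read(number_of_samples_per_channel=N_SAMPLES)
```

The same two ideas map one-to-one onto the LabVIEW diagram: the `source` input of AO's DAQmx Timing node gets the AI clock terminal, and error-wire sequencing forces AO's DAQmx Start to execute before AI's.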

 

My usual preference would be a little different from this, but try the minimal change above before tackling one of the mods below.

 

I like to try to offset the AI sample clock to be just a little after the AO clock, somewhere between 50-95% of a sample interval.   One way to do that is to set the AO task as master and force it to start last while configuring AI as the slave which uses "/Dev1/ao/SampleClock" and *also* sets the polarity to the trailing edge of the AO sample clock (probably the falling edge, but you may need to double-check this).
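A sketch of that offset variant, again in nidaqmx Python (device and channel names assumed; verify on your board which edge is actually the trailing one):

```python
# Sketch only: requires NI-DAQmx hardware/driver.
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

RATE = 10_000  # samples/s -- assumed

with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
    # AO is now the master and owns the sample clock.
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS)

    # AI borrows the AO clock but samples on its falling (trailing) edge,
    # so each AI sample lands mid-interval, after the AO output has settled.
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")
    ai.timing.cfg_samp_clk_timing(RATE, source="/Dev1/ao/SampleClock",
                                  active_edge=Edge.FALLING,
                                  sample_mode=AcquisitionType.CONTINUOUS)

    ai.start()   # slave first
    ao.start()   # master last
```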

 

Another way that gives more control is to let a third counter pulsetrain task be the master and both AO and AI use its output as their sample clock.  This gives you more flexibility about defining the exact time offset between AO generation and AI samples.
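A sketch of the counter-as-master arrangement (same nidaqmx Python caveats; "Dev1/ctr0" and its internal-output terminal name are assumptions that depend on your board):

```python
# Sketch only: requires NI-DAQmx hardware/driver.
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 10_000  # samples/s -- assumed

with nidaqmx.Task() as clk, nidaqmx.Task() as ao, nidaqmx.Task() as ai:
    # A counter pulse train acts as the shared sample clock.  Tweaking its
    # duty cycle or initial delay sets the exact AO-to-AI time offset.
    clk.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=RATE,
                                           duty_cycle=0.5)
    clk.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)

    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(RATE, source="/Dev1/Ctr0InternalOutput",
                                  sample_mode=AcquisitionType.CONTINUOUS)

    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")
    ai.timing.cfg_samp_clk_timing(RATE, source="/Dev1/Ctr0InternalOutput",
                                  sample_mode=AcquisitionType.CONTINUOUS)

    ao.start()   # both slaves first, armed and waiting
    ai.start()
    clk.start()  # master clock last: AO and AI now step in lockstep
```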

 

 

-Kevin P

 

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 5

I could swear I had tried that and it didn't do what I expected. But apparently not, because it just works 🙂

 

Thanks a lot man! This is solved now.

Message 3 of 5

I think I may be in a similar position. I've built this off the examples included in LabVIEW 2019, following the "master-slave" guidelines you mentioned in your solution, and the "slave" is started before the master. However, my read node times out at runtime.

Timed_out.PNG

I'm positive it has to do with the way I'm triggering (or rather, not triggering) the AI. Any insights would be appreciated.

Thank you for your time,

     Jacob D

Message 4 of 5

I have never used the DAQmx Trigger properties you set for SyncType= "Master" or "Slave".  I don't know for sure what they do or what devices support them.  I sure see why you'd think that's what I had been talking about though.  Anyway, my advice is to remove that part of the config, at least for now.  I doubt it's needed and it might be hurting things.

 

When possible, I prefer syncing tasks by sharing a sample clock, but I can't tell from your pic whether you intend to have the same sample rate for your AI and AO tasks.

 

I do see that your AI task uses an analog start trigger.  So samples won't start being collected immediately after the call to DAQmx Start.  They'll start after the trigger condition is met.  Consequently, the AI task should play the role of "Master" here.

 

The AO task (as "Slave") should be started first.  I'd try to have it "borrow" the AI task's sample clock, which won't actually start running until after the analog trigger condition is met.  One way to do that is to explicitly configure DAQmx Timing for the AO task.  The sample clock 'source' input should be something like "/Dev1/ai/SampleClock".

 

Another alternative is to configure the AO for a digital start trigger.  Then configure the trigger source to be something like "/Dev1/ai/StartTrigger".  This is an internal signal that will assert when the AI task's analog trigger condition is met.
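Both alternatives side by side, sketched with the nidaqmx Python API (device names, channels, the analog trigger source "APFI0", and its level are all assumptions; the original poster's setup is a LabVIEW diagram):

```python
# Sketch only: requires NI-DAQmx hardware/driver.
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 10_000  # samples/s -- assumed

with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    # AI keeps its analog start trigger, as in the original code.
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS)
    ai.triggers.start_trigger.cfg_anlg_edge_start_trig("APFI0",
                                                       trigger_level=1.0)

    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS)
    # Alternative 1 would replace the line above with
    #   source="/Dev1/ai/SampleClock"
    # so AO borrows the AI clock outright.  Alternative 2, shown here,
    # gives AO a digital start trigger on the AI task's internal
    # StartTrigger signal, which asserts when the analog condition is met.
    ao.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/ai/StartTrigger")

    ao.start()   # armed, waiting for the trigger
    ai.start()   # fires both tasks once the analog condition is met
```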

 

Do some more searches on the site about AI AO sync.  There's a lot of stuff, some of it better than others, including some useful working examples.  Keep an eye out for the ones I posted or approved of.  😉

 

 

-Kevin P

Message 5 of 5