Real-Time jitter spikes and how to profile

Hello,

 

I am writing software to control an instrument, which requires changing the digital output of an NI DAQ card at a kHz rate. The actual frequency matters and should be met as closely as possible.

 

Basically, it is just a Timed Loop that runs every X us, calls "DAQmx Write.vi", and increments/resets a counter to step to the next element of the value array (see attached image).

 

Overall, one iteration of the Timed Loop takes about 12 us to complete (according to the loop's "Iteration Duration" terminal), but sometimes the duration spikes to 100 us or 200 us. I am trying to run the loop every 50 us, which consistently fails after a few seconds; failing here means the requested iteration period was exceeded. The total CPU load on the real-time target is ~20%, and most of that time is spent in the Timed Loop.
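
 

For reference, here is roughly what the loop does, sketched in the Python nidaqmx API since I can't paste the block diagram as text. The device/line names, the example pattern, and the busy-wait timing are placeholders, not my actual code:

import time
import nidaqmx
from nidaqmx.constants import LineGrouping

pattern = [0x01, 0x02, 0x04, 0x08]   # placeholder value array, one port value per step
period_s = 50e-6                     # target loop period (50 us)

with nidaqmx.Task() as task:
    task.do_channels.add_do_chan(
        "Dev1/port0/line0:7", line_grouping=LineGrouping.CHAN_FOR_ALL_LINES)

    i = 0
    next_deadline = time.perf_counter()
    while True:
        # One software-timed DAQmx write per iteration; every call crosses into
        # the driver, so any OS/driver delay shows up directly as loop jitter.
        task.write(pattern[i])
        i = (i + 1) % len(pattern)   # increment / reset the counter into the value array

        next_deadline += period_s
        while time.perf_counter() < next_deadline:
            pass                     # spin until the next 50 us tick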

 

So how can I avoid these jitter spikes, or at least figure out what causes them? So far I haven't found a good way to debug jitter on a real-time target. Note that there are some other VIs running on the target as well, which might cause some interference, but shouldn't setting the Timed Loop's priority take care of that?

 

Thanks!

Message 1 of 3

What digital output hardware are you using? It might be easier to work around the problem than to fix it directly. If your hardware supports hardware-timed digital waveforms, that will be a much more reliable way to change the output at a consistent rate, since the pattern you want to output appears to be known in advance.
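
 

As an illustration only (again in the Python nidaqmx API rather than LabVIEW), a hardware-timed version would look roughly like this; the device/line names and the 20 kHz rate (one update every 50 us) are assumptions, and it requires a board whose digital lines accept a sample clock:

import nidaqmx
from nidaqmx.constants import AcquisitionType, LineGrouping

pattern = [0x01, 0x02, 0x04, 0x08]   # the pattern known in advance, one port value per sample

with nidaqmx.Task() as task:
    task.do_channels.add_do_chan(
        "Dev1/port0/line0:7", line_grouping=LineGrouping.CHAN_FOR_ALL_LINES)
    # The board's sample clock paces the updates, so software jitter no longer
    # affects the output timing; 20 kHz corresponds to one update every 50 us.
    task.timing.cfg_samp_clk_timing(
        rate=20_000.0,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=len(pattern))
    task.write(pattern, auto_start=False)  # preload the whole waveform into the buffer
    task.start()                           # DAQmx regenerates the pattern continuously
    input("Outputting pattern, press Enter to stop...")

With this approach the whole pattern is preloaded and clocked out by the hardware, so loop jitter in software no longer matters.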

Message 2 of 3

I am using a PXI-6509, which cannot output hardware-timed waveforms. That is why I went with the Timed Loop approach in the first place.

Message 3 of 3