Multifunction DAQ


Delay on Do Not Allow Regeneration mode

Hello Jerry,

 

I confess that I didn't try adding the delay at first because in your post you didn't seem very convinced that it would work, and I also realize now that I had confused delay with latency in your post.

 

Today, when I added a delay of 0.8 sec, the latency was initially (VI running for less than 10 sec) near 2 sec, but after the VI had run for more than 60 sec the latency went back up to 8 sec again. I changed the delay to 1.0 sec and got the error below:

Error -200292 occurred at delay non regeneration v2.vi

Possible reason(s):

Some or all of the samples to write could not be written to the buffer yet. More space will free up as samples currently in the buffer are generated.

To wait for more space to become available, use a longer write timeout. To make the space available sooner, increase the sample rate.

Property: RelativeTo

Corresponding Value: Current Write Position

Property: Offset

Corresponding Value: 0

Task Name: _unnamedTask<B866>

 

Well, since my waveform has a duration of 1 sec, I decided to set a delay slightly lower than that: 0.999 sec. The VI has now been running for more than 40 minutes and the latency is still under 2 sec. I reduced the sinusoid frequency to 2 Hz in order to watch for glitching, but so far, no glitching at all.
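
(For anyone following along in text form: here is roughly what my VI does, sketched in Python with NI's nidaqmx package. The device name Dev1, the channel, and the exact numbers are placeholders for my setup, not a definitive implementation.)

    # Rough Python/nidaqmx equivalent of the VI described above; "Dev1/ao0"
    # and the constants are placeholders, not my exact hardware or values.
    import time
    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType, RegenerationMode

    RATE = 1000        # samples/sec
    N_SAMPLES = 1000   # one write = 1 sec of data (the "signal duration")
    DELAY = 0.999      # software delay, slightly shorter than one write

    t = np.arange(N_SAMPLES) / RATE
    waveform = np.sin(2 * np.pi * 2.0 * t)   # 2 Hz sinusoid, as in my glitch test

    with nidaqmx.Task() as task:
        task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
        task.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS)
        # Do Not Allow Regeneration: the device must never reuse old buffer data
        task.out_stream.regen_mode = RegenerationMode.DONT_ALLOW_REGENERATION

        task.write(waveform)        # pre-fill the buffer before starting
        task.start()
        while True:
            time.sleep(DELAY)       # wait just under one write's duration...
            task.write(waveform)    # ...then top the buffer up with fresh data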

 

Do you think the idea that the delay must be slightly shorter than the signal duration is reasonable, or am I misunderstanding this process again? Maybe I should work with DAQmx events, like Kevin suggested in his last post.

 

Thank you!

Andrea

Message 21 of 24

Hi Kevin,

 

I just answered Jerry; please check my new findings there.

 

Yeah, before I added the delay that Jerry suggested, I tried increasing the sampling rate per your previous suggestion. Considering 1 kHz and 200 samples, that produces a signal with a duration of 0.2 sec, right? In this case I observed a latency between 1.4 and 1.5 sec: again, almost 8 times the signal duration. Looking at the latency/signal-duration ratio, I had the same issue.

 

When I used a sampling rate of 1 kHz with 200 samples and added a delay of 0.199 sec (Jerry's suggestion), the latency was not noticeable; I can't even measure it because it's so small. So far, no glitching at all.
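
(Spelled out, the delay is just the write duration minus a small margin:)

    rate = 1000                          # Hz
    n_samples = 200
    signal_duration = n_samples / rate   # 0.2 sec per write
    delay = signal_duration - 0.001      # 0.199 sec, just under one write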

 

As I asked Jerry, do you think the idea that the delay must be slightly shorter than the signal duration is reasonable? Maybe I should work with DAQmx events; to answer your question, I've never experimented with them before, it's all new to me.

 

Thank you!

Andrea

Message 22 of 24

As long as data is being written to the AO channel in time, your application should not show any glitches. If no data is being written, the DAQmx task should throw an underflow error. I believe this is a reasonable workaround, but it's not ironclad. Just beware that if you have other VIs or other applications running in your environment, you might have to adjust the delay value. And a tip on programming practice: the delay should be inserted using a sequence structure so that it is part of the data flow. If you just drop a delay onto the diagram without connecting it to the data flow, when the delay executes is somewhat random.

 

You can probably build a similar workaround with the DAQmx Every N Samples event. The caveat is that for output on USB, the Every N Samples event fires when the samples have been transferred from the software buffer to the USB buffer on the device, not when the samples have actually been generated at the analog front end. So you have to add some delay to account for the data propagating from the USB buffer to the analog front end.
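
(In text form, the event route would look something like this nidaqmx/Python sketch. Dev1, the channel, and the chunk size are placeholders, and per the caveat above, on USB the callback timing runs early relative to the analog output.)

    # Sketch of the Every N Samples (transferred-from-buffer) event approach.
    # NB: on USB devices this event fires when samples leave the software
    # buffer for the device's USB buffer, NOT when they reach the analog
    # front end, so the callback runs early relative to the actual output.
    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType, RegenerationMode

    RATE = 1000
    N = 200                       # fire after every 200 samples transferred
    t = np.arange(N) / RATE
    waveform = np.sin(2 * np.pi * 2.0 * t)

    task = nidaqmx.Task()
    task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    task.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS)
    task.out_stream.regen_mode = RegenerationMode.DONT_ALLOW_REGENERATION

    def refill(task_handle, event_type, num_samples, callback_data):
        task.write(waveform)      # push the next chunk of fresh data
        return 0                  # nidaqmx callbacks must return an int

    task.register_every_n_samples_transferred_from_buffer_event(N, refill)
    task.write(waveform)          # pre-fill the buffer, then start
    task.start()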

Message 23 of 24

How long will this control loop app need to run?   Thanks to Jerry_X's helpful explanation that the DAQmx events fire when data moves from the task buffer to the USB transfer buffer, it doesn't sound like they'll be much help here.

 

And so I'm pretty sure you'll be stuck with an imperfect solution; you'll just have to pick one that's less imperfect than the others.

 

In my opinion, a method based on a fixed software delay (such as 0.999 seconds) isn't as good as a method that gets "triggered" at a fixed software *interval*.  The fixed delay depends on an assumption that the rest of the code (other than the delay) will keep requiring a fixed amount of execution time, namely 0.001 seconds.  

 

The simplest version is to use the "Wait Until Next ms Multiple" timing function with a 1000 msec input value.  Once every second (as the application tracks time), you write a second's worth of samples (as the device tracks time via its sample clock) to the task buffer.

    This remains imperfect, as the PC's software time and the device's sampling time will disagree by some tiny fraction of a percent.  Over a very long run, that tiny error eventually becomes significant enough to have an effect.
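
(There's no "Wait Until Next ms Multiple" outside LabVIEW, but the same idea in Python would be to sleep until the next whole-second boundary, so the write cadence follows an absolute schedule instead of a fixed delay. 'task' and 'waveform' are assumed set up as in the earlier sketch.)

    # Interval-triggered writes: sleep until the next 1000 ms boundary, then
    # write one second's worth of samples. Unlike a fixed delay, the cadence
    # doesn't drift when the rest of the loop takes a variable amount of time.
    import time

    INTERVAL = 1.0   # sec; must match the duration of each write

    while True:
        now = time.monotonic()
        time.sleep(INTERVAL - (now % INTERVAL))   # wait for the boundary
        task.write(waveform)    # 'task' and 'waveform' from the earlier sketch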

 

If I were doing this, I'd:

- use a faster sample rate as one part of the strategy for reducing the absolute latency time (even if the relative latency is still 8x the signal duration)

- use the "msec multiple" method I described to "trigger" my calls to DAQmx Write at a regular, periodic rate.  (Note that the correct msec multiple will depend on the duration of the signal being written, which in turn depends on the number of samples and the task's sample rate.)

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 24 of 24