LabVIEW

time stamps not linear

Hi,

I have a DAQ USB-6251 driven by LabVIEW.

The time stamps in the text file have backward and forward jumps in them.  See the attached picture.
They increase linearly for about 12,000 samples, then jump backward about 3 seconds, increase linearly for about 1,000 samples, then jump forward about 6 seconds.  This cycle then repeats: the stamps increase nice and linearly for about 12,000 samples, jump back, and so on.

I suspect there is a buffer somewhere that fills up and then acts strangely when it tries to reset.
Any explanations?  Any solutions?


Here's some further info about my setup:

Producer loop:
I generate a continuous analog output signal, which I feed back into one of the analog input pins and continuously acquire.
The generation and acquisition both run at 4096 samples/sec, and 1024 samples are acquired/generated per loop iteration.
The recorded input is enqueued.

Consumer loop:
Dequeues data and writes it to a text file.
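(For anyone unfamiliar with the pattern: the setup above is a standard producer/consumer architecture. Here is a minimal Python sketch of the same data flow, purely to show the idea; the block size matches my setup, but the data itself is made up and there is obviously no real DAQ read behind it.)

```python
import queue
import threading

SAMPLES_PER_READ = 1024  # samples acquired per loop iteration
N_ITERATIONS = 5         # keep the demo short

q = queue.Queue()

def producer():
    # Stand-in for the DAQ read loop: each iteration yields one block.
    for i in range(N_ITERATIONS):
        block = list(range(i * SAMPLES_PER_READ, (i + 1) * SAMPLES_PER_READ))
        q.put(block)
    q.put(None)  # sentinel: tell the consumer to stop

def consumer(out):
    # Stand-in for the file-writing loop: dequeue and "log" each block.
    while True:
        block = q.get()
        if block is None:
            break
        out.append(len(block))

written = []
t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer, args=(written,))
t1.start(); t2.start()
t1.join(); t2.join()
print(written)  # every 1024-sample block reaches the consumer in order
```

The point of the queue is that the file write never blocks the acquisition loop.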

Regards,
Jamie
Using LabVIEW version 8.0
Message 1 of 21
Hi Jamie,

This is very strange behavior.  I will be looking into this and hopefully will find an explanation soon.

What version of LabVIEW do you have?

Thanks,
Erik
Message 2 of 21
Hi Erik,

Thanks for looking into it further for me.
I have LabVIEW version 8.0.
I still don't know what the problem is, but here are some clues from further investigation.

1) My periodic signal appears to jump at some of the points where the time stamps jump, but not at others.  Whether this is due to the generation, the acquisition, or both I don't know, but if the time stamps jump, that indicates an acquisition problem at least.

2) I doubled the generation window size to 2048, leaving the acquisition window size at 1024 (fs = 4096 samples/sec for both generation and acquisition; other parameters unchanged).
The time stamps now jump down twice as early, i.e., at 6*1024 samples (6 acquisition windows, or 3 generation windows), but then jump up again after just one 1024-sample window.
(see pic)

3) Then I tried the reverse, i.e., generation window = 1024 samples and acquisition window = 2048 samples.
The problem fixed itself: the time stamps are now linear, and the recorded signal has no jumps in it. (see pic)

Why? I don't know.  I can't avoid the problem in the future if I don't know what DAQ/LabVIEW limitation I crossed.

Jamie
Using LabVIEW version 8.0
Message 3 of 21
Hi Jamie,

Thank you for the new information!  From it, I was able to figure out what is going on!

Fundamentally, it is because your generation and acquisition are not synchronized.

Explanation:
If you set the number of samples to generate to 1024, DAQmx sets up a circular buffer of 1024 samples.  The acquisition does not start at the same instant as the generation, but slightly afterwards.  This is best explained with numbers.  Suppose you generate the numbers 1-2-3-4-5 in sequence, and those numbers travel over a wire to the analog input.  Now suppose the acquisition starts after the generation, so it begins acquiring at the number 2.  The acquisition will acquire 2-3-4-5..., then return to the beginning of the buffer and read 1, which will carry a timestamp in the past.

If you increase your acquisition buffer size, then the samples in the next waveform will continue to be acquired seamlessly because there is room in the acquisition buffer.
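(The wraparound is easy to see in a toy model.  This is Python rather than LabVIEW, purely to illustrate the 1-2-3-4-5 example; the 1-second-per-sample "clock" is made up for readability and has nothing to do with real DAQmx timing.)

```python
# Toy model: the generation fills a circular buffer, and a late-starting
# acquisition reads it back with wraparound.
BUF_SIZE = 5

# Each slot holds (value, time the value was generated); with a toy 1 Hz
# clock, value v is generated at t = v seconds.
buf = [(v, float(v)) for v in range(1, BUF_SIZE + 1)]

# The acquisition starts one sample late, so it begins at index 1 (value 2).
start = 1
read = [buf[(start + i) % BUF_SIZE] for i in range(BUF_SIZE)]

for value, t in read:
    print(value, t)
# Values come back as 2, 3, 4, 5, 1: the last read wraps around to the
# oldest slot, so its timestamp (t = 1.0) jumps backward, just like the log.
```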

This is hard to explain in words, so please see the attached image, and let me know if you have any other questions or if there is anything else I can help you with.

Thanks!
Erik
Message 4 of 21

I'm very curious about this behavior, but don't really understand the mechanism of what's going on.  How is it that a late-starting acquisition which sees the sequence 2-3-4-5-1 would decide that the "1" sample should get a timestamp earlier than the 2-3-4-5 samples, i.e., "in the past?"

Is this in any way related to or dependent on the use of Waveform datatypes for AO or AI?  If so, then I guess I'm glad I've never made the habit of using them, preferring the raw array-based buffers.

On a side note, this is one little nit I have to pick.  When LabVIEW pushes a new standard (Waveforms vs. raw arrays, DAQmx vs. Traditional DAQ, LabVIEW projects vs. not) toward us, it's usually difficult to find in-depth info about the trade-offs of the new standard.  There's mostly just rosy promotional material touting the advantages, but no simple way to find out what the sacrifices are.  In the case of Waveforms, I've always figured I can get more trustworthy results handling my own timing info (for example, the waveform's t0 resolution seems limited to system clock resolution, i.e., msecs) and more efficient performance moving / crunching data as arrays rather than as clusters.  Is there a clear presentation of both pros and cons anywhere?

-Kevin P.

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 5 of 21

Hi Kevin,

I have been using the waveform data types since LV 6.0 and the t0 time-stamps have been rock-solid.

The exception was when I synchronized multiple boards in a PXI chassis such that the primary clocked the other boards.  In that case only the primary had valid t0's, and the externally clocked waveforms had null t0's.  But this makes sense.

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 6 of 21
Ben,
 
Maybe this is a good chance for me to learn (be convinced of) something then.  I'd try this out for myself on hardware, but I've got tests running now.
 
Let's suppose I have both an AI and an AO task, both set up to start off the same sample clock.  However, the AO outputs on the leading edge of a clock while the AI samples on the trailing edge of the clock.  Let's further stipulate that the clock is generated by an on-board counter at 5 kHz with 90% duty cycle.  So the AO update occurs 180 microseconds before the AI.  How do waveforms handle this offset in t0?  Will t0 simply be set to 0 because of the use of an "external" sampling clock?  Or will the two t0 values be equal and non-zero?  Or will they be sometimes equal and sometimes different, depending on the "phase" of the system clock -- either the 1 msec one or the 16 msec one used for system timestamps?
 
Now, concerning triggering:  Does t0 represent the time of the trigger?  Or of the first sample / output *after* the trigger?  Or is it the time you call DAQmx Start prior to receiving a trigger signal?
 
Other concern: even when not *strictly* necessary, I try to make a habit of making code that runs pretty efficiently, unless that puts an undue burden on development / maintenance effort.  My experience with processing large arrays vs. processing clusters containing large arrays has suggested that pure arrays are typically significantly more efficient to manage.  (I'm sure it depends on sizes and kinds of processing too.)  Aren't waveforms essentially cluster-like?
 
Well, enough of the blah, blah, blah.  I really *am* interested.  I know many of the analysis functions prefer (if not require) waveform inputs rather than raw arrays these days, so there are some clear code simplicity advantages to waveforms IF I can be convinced that I'm fully informed of the downsides and gotchas.  (Another example of worry: when integrating a waveform, how does the floating point roundoff accumulate from the 'dt' value?  Will results late in a long array contain more cumulative roundoff error?)
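(To make that last worry concrete, here's a quick plain-Python sketch — nothing waveform-specific — of why repeated accumulation of 'dt' drifts, while computing each time as index * dt incurs only a single rounding.  The dt = 0.1 and n = 1000 values are arbitrary.)

```python
dt = 0.1   # not exactly representable in binary floating point
n = 1000

# Accumulating t += dt compounds the representation error at every step...
t_acc = 0.0
for _ in range(n):
    t_acc += dt

# ...whereas t = n * dt incurs only a single rounding.
t_mul = n * dt

print(abs(t_acc - 100.0))  # drift on the order of 1e-12
print(abs(t_mul - 100.0))  # at most one rounding error, ~1e-14 or less
```

So if an integration routine builds timestamps by repeated addition of dt, samples late in a long array do carry more cumulative roundoff than ones computed from the sample index.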
 
-Kevin P.
Message 7 of 21

Excellent Q's Kevin.

I cannot answer any of those follow-up questions.  I'll see if I can get some help.

Ben

Message 8 of 21
Hi Erik,

Using your example of generating 1-2-3-4-5 in a circular loop:
I wouldn't expect the initial 1 to be counted, because the acquisition never recorded it; it only started after that sample was generated.
What's more, though it is not clear from my graph, when the time stamps initially jumped backwards they jumped to negative time, i.e., below 0.  Surely the acquisition thread starts its timestamps at t = 0 when it starts acquiring?


The 625x spec sheet, under Analog Input, says the input FIFO size is 4095 samples.
So I thought the DAQ FIFO would continuously acquire data, fill the whole buffer, and then fill again from the beginning.  Once my designated 1024 samples were available,
I thought the DAQ might send an interrupt to LabVIEW to say 1024 samples are available, and then transfer that window of 1024 samples from the DAQ FIFO over the USB cable to LabVIEW.  Hence I thought it would be a good idea to set the acquisition window size well below 4095, so that USB and LabVIEW had enough time to transfer data out of the FIFO.  I figured that if I set the window size close to 4095, the FIFO would fill up and overwrite itself before it had time to transfer data to LabVIEW.

I think you are saying, though, that DAQmx has the potential to create a buffer of 4095 samples, but my choice limits it to 1024 samples.  What's more, I guess data is transferred between the DAQ and LabVIEW in smallish, arbitrary chunks at intermittent times whenever possible, not in 1024-sample chunks; LabVIEW then assembles the collected data into 1024-sample blocks.
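(My overflow worry, as a toy Python simulation — the `simulate` helper, its one-sample-per-tick device and the drain rates are all made up for illustration, and this is not how DAQmx actually schedules transfers; it just shows the arithmetic of why a host that drains too rarely loses data.)

```python
FIFO_SIZE = 4095  # per the 625x Analog Input spec

def simulate(samples_in, host_reads_every, read_chunk):
    """Toy model: the device pushes one sample per tick; the host drains
    read_chunk samples every host_reads_every ticks.  Returns True if
    the FIFO ever overflows (oldest data would be overwritten)."""
    fill = 0
    for tick in range(1, samples_in + 1):
        fill += 1
        if fill > FIFO_SIZE:
            return True  # overflow before the host could drain
        if tick % host_reads_every == 0:
            fill = max(0, fill - read_chunk)
    return False

# Host keeps up: drains 1024 samples every 1024 ticks -> no overflow.
print(simulate(100_000, 1024, 1024))   # False
# Host drains too rarely: 1024 samples every 5000 ticks -> overflow.
print(simulate(100_000, 5000, 1024))   # True
```

In other words, what matters is the average drain rate keeping up with the fill rate, not the absolute window size.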

Is my understanding correct?  I am also trying to understand why, if I have a DAQ Assistant programmed to continuously acquire data, I 1) need to put it in a while loop and 2) need to specify a value called "number of samples".

Regards, Jamie
Using Labview version 8.0
Message 9 of 21
I am also wondering whether it was a good idea to put both my DAQ Assistant input and output blocks in the same while loop.
I put them in the same loop because I wanted them to start at the same time and acquire/generate data in sync with each other.  I understand one may start before the other, but if the sample rate is the same, will they stay in sync?

I am wondering, if I change the parameters (which I want to do) so that one has a larger buffer size or samples faster than the other, whether the faster one will wait for the slower one after each loop iteration, or whether they will work independently, each continuously repeating its actions at its own rate until the while loop condition is no longer satisfied.

I guess I'm asking: is it better to keep the DAQ Assistant input and output blocks in the same while loop, or to place them in separate while loops?

Jamie
Using LabVIEW version 8.0
Message 10 of 21