11-14-2006 07:13 AM
I'm very curious about this behavior, but don't really understand the mechanism of what's going on. How is it that a late-starting acquisition which sees the sequence 2-3-4-5-1 would decide that the "1" sample should get a timestamp earlier than the 2-3-4-5 samples, i.e., "in the past?"
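One way the "in the past" stamping can happen (a conceptual sketch only, not NI driver code, with made-up t0/dt values): if each sample's timestamp is derived from its absolute acquisition index rather than from the order it was read out of the wrapped circular buffer, the late-read "1" sample naturally lands earlier on the time axis than samples 2 through 5.

```python
# Hypothetical illustration of index-based timestamping.
# t0 and dt are assumed values, not anything queried from hardware.
t0 = 0.0      # acquisition start time in seconds (assumed)
dt = 0.001    # sample period in seconds, i.e. 1 kHz (assumed)

# A late-starting reader pulls samples out of the wrapped buffer
# in this order (1-based sample numbers from the discussion above):
read_order = [2, 3, 4, 5, 1]

# Stamp each sample from its acquisition index, not its read position:
stamps = [(n, t0 + (n - 1) * dt) for n in read_order]
for n, t in stamps:
    print(f"sample {n}: t = {t:.3f} s")

# Sample 1 is read last but stamped t = 0.000 s, which is earlier
# than the stamps on samples 2-5 -- "in the past" relative to them.
```

If, instead, timestamps were assigned by read position, all five samples would come out monotonically increasing and the oddity would disappear; the observed behavior suggests index-based stamping.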
Is this in any way related to or dependent on the use of Waveform datatypes for AO or AI? If so, then I guess I'm glad I've never made the habit of using them, preferring the raw array-based buffers.
On a side note, this is one little nit I have to pick. When LabVIEW pushes a new standard (Waveforms vs. raw arrays, DAQmx vs. traditional, LV projects vs. not) toward us, it's usually difficult to find in-depth info about the trade-offs of the new standard. There's mostly just rosy promotional material touting the advantages, but no simple way to find out what the sacrifices are. In the case of Waveforms, I've always figured I can get more trustworthy results handling my own timing info (for example, the waveform's t0 resolution seems limited to system clock resolution, i.e., msecs) and more efficient performance moving / crunching data as arrays rather than as clusters. Is there a clear presentation of both pros and cons anywhere?
-Kevin P.
11-14-2006 07:59 AM
Hi Kevin,
I have been using the waveform data types since LV 6.0 and the t0 time-stamps have been rock-solid.
The exception was when I synchronized multiple boards in a PXI chassis such that the primary clocked the other boards. In that case only the primary had valid t0's, and the externally clocked waveforms had null t0's. But this makes sense.
Ben
11-14-2006 10:28 AM
Excellent Q's Kevin.
I can't answer any of those follow-up questions. I'll see if I can get some help.
Ben