DAQmx and X-Net Waveform Timestamps

Hi all,

 

First, some context:

 

I'm putting the finishing touches on a headless data acquisition system for mixed-mode measurements. I'll be acquiring analog signals and counters with DAQmx, CAN bus signals with X-Net, and raw binary streams from an inertial navigation system with NI-VISA. I'm using a cRIO-9045 with a GPS module (SEA 9405, very similar to the NI-9467) interfacing with the FPGA timekeeper. The DAQmx, VISA, and X-Net acquisitions all run in parallel loops in an expanded version of the Queued Message Handler project template.

 

I will be updating the system clock at the beginning of each acquisition to match GPS-derived time ported out of the FPGA VI. As I understand it, X-Net and DAQmx take a similar approach to generating waveform timestamps: they query the system clock when first invoked and increment the timestamp with their respective onboard clocks thereafter.
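
To illustrate the model I have in mind, here's a rough Python sketch (not driver code): the driver grabs t0 from the system clock once, and every sample after that is just t0 plus n times dt from the hardware sample clock, so later changes to the OS clock don't propagate into an already-running waveform.

```python
# Rough sketch (not NI driver code) of how a waveform timestamp behaves:
# t0 is sampled from the system clock once, when the task/session starts;
# every later sample time is derived from the hardware sample clock as
# t0 + n*dt, so subsequent OS-clock updates never reach the waveform.
from datetime import datetime, timedelta
from typing import List, Optional

def waveform_timestamps(n_samples: int, sample_rate_hz: float,
                        t0: Optional[datetime] = None) -> List[datetime]:
    if t0 is None:
        t0 = datetime.now()                 # one-time query of the system clock
    dt = timedelta(seconds=1.0 / sample_rate_hz)
    return [t0 + i * dt for i in range(n_samples)]

# Example: 5 samples at 1 kHz share a single t0 and are spaced 1 ms apart.
print(waveform_timestamps(5, 1000.0))
```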

 

Now, my question:

I get the gist of how the respective timestamps are generated, but more specifically, how do I "reset" that initial timestamp query? For X-Net, I believe the initial timestamp is captured when the session is created, so closing and reopening the session between acquisitions appears necessary. What about DAQmx? Is stopping the task sufficient, or do I need to clear the task and create it again?
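
For what it's worth, here's a hedged sketch in the nidaqmx Python API (the text-based counterpart to the DAQmx VIs) of the conservative option: clearing and recreating the task between acquisitions so that whatever initial system-clock query the driver performs happens again. The channel name is hypothetical, and whether stop/start alone would also re-query t0 is exactly my open question.

```python
# Hedged sketch using the nidaqmx Python API. Assumptions: the channel name
# "Mod1/ai0" is hypothetical, and I'm assuming (not confirmed) that building
# the task fresh each time forces the driver's initial system-clock query to
# happen again; whether Stop/Start alone does the same is the open question.
import nidaqmx
from nidaqmx.constants import AcquisitionType

def one_acquisition(samples: int = 1000, rate_hz: float = 1000.0):
    with nidaqmx.Task() as task:            # task is cleared when the block exits
        task.ai_channels.add_ai_voltage_chan("Mod1/ai0")
        task.timing.cfg_samp_clk_timing(rate_hz,
                                        sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=samples)
        return task.read(number_of_samples_per_channel=samples)

data = one_acquisition()                    # new task, new initial timestamp
```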

 

My objective is to minimize drift between the different acquisition types. Analog input is precisely metered by the sample clock. CAN is asynchronous. My serial data (once the binary stream is parsed) contains its own GPS timestamps from the INS. My reasoning is that if I update the OS clock from GPS time whenever I send a software trigger, all of my waveforms and my serial data should share an accurate common time base over the course of a short (~10-minute) acquisition.
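
As a sanity check on the drift budget (purely illustrative numbers; the ppm figures are assumptions, not specs for any of these devices):

```python
# Illustrative drift budget: accumulated error = (ppm * 1e-6) * duration.
# The ppm values are assumed worst cases, not specs for any NI hardware.
def accumulated_drift_ms(ppm: float, duration_s: float) -> float:
    return ppm * 1e-6 * duration_s * 1e3

for ppm in (1, 10, 50):
    print(f"{ppm:>3} ppm over 600 s -> {accumulated_drift_ms(ppm, 600):.1f} ms")
# 1 ppm -> 0.6 ms, 10 ppm -> 6.0 ms, 50 ppm -> 30.0 ms
```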

 

Message 1 of 2

Some further reading has turned up potential issues with my plan. Namely, the OS clock has to be updated through the "Set Time" VI. That VI appears to require a system reboot in certain cases (it's unclear when), and there is no information about its determinism. Can I count on the VI to update the OS clock quickly enough to remain within 10 ms of my high-accuracy (FPGA) clock after execution?
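
If nothing in the docs pins this down, I suppose I could measure it empirically, something like the sketch below, where read_fpga_time() and set_system_time() are placeholders for however the FPGA timekeeper time is read on the host and for the RT Set Time call:

```python
# Empirical check of the "within 10 ms" requirement. read_fpga_time() and
# set_system_time() are placeholders (assumptions), standing in for reading
# GPS-disciplined time from the FPGA timekeeper and for the RT Set Time call.
import time

def read_fpga_time() -> float:
    """Placeholder: GPS-disciplined time from the FPGA, seconds since epoch."""
    raise NotImplementedError

def set_system_time(t: float) -> None:
    """Placeholder: set the OS clock to t (e.g. via the RT Set Time VI)."""
    raise NotImplementedError

def clock_set_ok(max_offset_s: float = 0.010) -> bool:
    set_system_time(read_fpga_time())       # discipline the OS clock
    residual = abs(time.time() - read_fpga_time())
    return residual <= max_offset_s         # True if within 10 ms afterwards
```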

 

More to the point, is there a better way I should be approaching this? It occurred to me that instead of updating the OS clock, I could manually pull a timestamp from an FPGA I/O node at the beginning of each acquisition (in a sequence structure with DAQmx Start Task or X-Net Create Session) and use it to replace the t0 attribute of the resulting waveforms. Any thoughts?
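
Roughly what I have in mind, sketched in Python with my own minimal waveform container (not the LabVIEW waveform type) and a hypothetical grab_fpga_timestamp() helper:

```python
# Sketch of the "override t0" idea: keep the samples and dt the driver
# returns, but restamp the start time with GPS time captured right next to
# DAQmx Start Task / X-Net session creation. Waveform here is my own minimal
# container (not the LabVIEW waveform type); grab_fpga_timestamp() is
# hypothetical.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Waveform:
    t0: datetime          # absolute start time
    dt: float             # seconds between samples
    y: List[float]        # sample values

def grab_fpga_timestamp() -> datetime:
    """Placeholder: read GPS-derived time from an FPGA I/O node."""
    raise NotImplementedError

def restamp(wf: Waveform, gps_t0: datetime) -> Waveform:
    # Only the start time changes; the hardware-clock sample spacing is kept.
    return Waveform(t0=gps_t0, dt=wf.dt, y=wf.y)

# Usage (pseudo): capture GPS time immediately before starting the task,
# then apply it to the waveform the read returns.
# gps_t0 = grab_fpga_timestamp()
# wf = restamp(wf, gps_t0)
```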

Message 2 of 2