What relationship (if any) is there between an XNET timestamp and the Windows OS time? I am using the "NI-XNET API for C" library, not LabVIEW.
I am currently translating the timestamps received from a Session.FramesToSignals call by passing the values through two kernel32 API calls (FileTimeToLocalFileTime and FileTimeToSystemTime), and I get timestamps that seem reasonable, but that all depends on the relationship between the two timebases.
I have read "How Accurate is the Timestamp of the Waveform Returned by my NI-DAQmx Device?" (digital.ni.com/public.nsf/allkb/5D42CCB17A70A06686256DBA007C5EEA), but I'm not sure if the same type of relationship applies to XNET.
Regardless of which API is used, XNET will query the host OS for the current system time when an interface is started. At that point, the current time is placed into an interface-specific timestamp register on the XNET device. The timestamp register then increments with every tick of the master timebase clock. When a CAN frame is acknowledged, a copy of the timestamp register is attached to the frame and placed into the receive buffer to be read by the XNET API.
Note that there can be significant jitter in the initial query to the host OS, especially on Windows systems. If a high degree of accuracy is required, we can use the Interface: Start Trigger Frames to Input Stream property to place a special frame into an input stream when the interface is started. The timestamp attached to the start frame holds the initial timestamp value that was placed into the timestamp register. If we subtract the start trigger time from the timestamp of a received frame, we remove the jitter from the OS and are left with the absolute amount of time that has elapsed since T0.
I'm attaching a modified version of the Generic Synchronization shipping example that demonstrates how to calculate the absolute time for reference.
There is a simpler way to get the Start Trigger timestamp. Instead of using XNET Read Frame to get the timestamp of the Start Trigger Frame, one may use XNET Read State.Time.Start.
Although I was told that the returned values are not the same, the actual values I'm reading match exactly.
Unfortunately, synchronization is always necessary, which makes using a common timescale quite difficult in more complicated applications (multiple CAN channels, and especially multiple DUTs, each with multiple CAN channels). All CAN interfaces must be synchronized before the interfaces start.
Without synchronization, you may see discrepancies in the 1-2 ms range between CAN interface timestamps, even if those interfaces are on the same card or in the same PXI chassis. This means you may see the response on one CAN channel before you send the stimulus on another.
Without synchronization, do not subtract the Start Trigger timestamp. Interfaces may be started at very different times, and your relative timescales (from different Start Triggers) will not match. Use the frame timestamps "as is" instead. Discrepancies between channels should be at most in the 1-2 ms range.
Let me know if you see discrepancies higher than 1-2 ms.
I have set up a cyclic, single-point output session on my XNET hardware which writes a frame every 100 ms, and a single-point input session to read the times at which the interface received the frames.
However, I would like to calculate how long it took from the time the frame was sent out on the hardware interface until the time it was received by the hardware interface.
Can you explain how I would achieve this using synchronization? Or can I use the XNET Read (State Time Comm).vi as the time the XNET hardware sent the first frame out and use this as my T0?
I would recommend making a new forum post to get more attention to your question. This post is over a year old and will get less visibility.
Best of luck,