07-23-2012 04:11 PM
I'm trying to compare a timestamp obtained on a PXI RT target with the Get Date/Time In Seconds function against an XNET timestamp. I convert the XNET timestamp with a VI I found in the XNET logging examples. I also sample a DAQ AI channel at 500 Hz and call Get Date/Time In Seconds when the measurement goes above 5 V. I'm expecting to see about 500 ms between the two events, but I'm seeing about 300 ms. So I'm curious where this 200 ms discrepancy is coming from, and whether I need to use the RTSI synchronization features. Being 200 ms off is too much; 1 ms accuracy is what I'm looking for.
07-24-2012 12:18 PM
Hi kevin.key,
Have you looked at the Synchronize PXI-CAN with DAQmx Analog Input example? You can find it in the Example Finder under: Hardware Input and Output>>CAN>>NI-XNET>>Synchronization
If you want to get 1ms accuracy you will need to synchronize the AI start trigger and sample clocks.
As to the delay, I'm not sure I understand: you are seeing 300 ms, expected 500 ms, but want 1 ms accuracy?
DylanC
07-25-2012 09:49 AM
How are you currently getting the timestamp? Are you comparing the reading to >5 V and, if true, calling Get Date/Time In Seconds? That could be where the delay is coming from. I would instead derive the timestamp from the waveform itself: find the element whose Y value is >5 V and compute its time from the waveform's t0 and dt.
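The waveform-timestamp approach above can be sketched in pseudocode (shown here in Python rather than LabVIEW; the function name and the example t0/dt/sample values are hypothetical, but the arithmetic mirrors what the waveform's t0 and dt components give you on the diagram):

```python
def event_timestamp(t0, dt, samples, threshold=5.0):
    """Return the time at which the signal first exceeds `threshold`,
    computed from the waveform's own t0/dt rather than the wall clock
    at the moment the comparison happens to execute."""
    for i, y in enumerate(samples):
        if y > threshold:
            return t0 + i * dt  # time of the sample, not time of detection
    return None  # threshold never crossed

# Example: a 500 Hz acquisition (dt = 2 ms) starting at t0 = 100.0 s.
# The signal first exceeds 5 V at sample index 3, so the event time is
# 100.0 + 3 * 0.002 = 100.006 s, regardless of when the code runs.
t = event_timestamp(100.0, 1.0 / 500, [0.0, 1.0, 4.9, 5.2, 5.3])
```

Because the timestamp comes from the sample's position in the acquired buffer, any software delay between acquisition and processing (buffering, loop timing, the comparison itself) no longer shifts the measured event time.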
DylanC
07-25-2012 10:49 AM
Hi kevin.key,
A delay that large wouldn't be coming from the DAQ hardware itself. The latency varies depending on which DAQ device you are using (PCI, USB, or PXI), but none of them come close to 200 ms (http://www.ni.com/white-paper/3509/en). I believe the delay is introduced between reading the +5 V sample and the call to Get Date/Time In Seconds. Try using the waveform's timestamp instead; there is more information on delay with waveform timestamps in the link below.
http://digital.ni.com/public.nsf/allkb/5D42CCB17A70A06686256DBA007C5EEA
DylanC