08-01-2022 12:38 PM
Hi all,
I am trying to figure out how to time the latency of writing to an AO of an NI 9263 DAQ paired with an NI USB-9162 chassis. That is, I am interested in the time from the moment the write command is executed in the code up to the moment the waveform actually begins to play. I am working with a 100 kHz sample clock on the DAQ, and the waveforms I am trying to send are digital and rather short (a few tens of samples, each either 0 or 1).
First of all, I should say that I have not yet come up with a direct way of measuring this latency, so any suggestions would be appreciated. In any case, from a theoretical perspective, what would be the lower limit (how many milliseconds) for such communication over USB? Can it reach a few milliseconds from writing to playing? I would also appreciate any general suggestions on how to speed this up.
What I have done up to now is try to time simple programs that write to an AO. I have used both Python and C# .NET, and their performance is similar. The following is the Python code I am using:
import nidaqmx
from nidaqmx.constants import AcquisitionType
import numpy as np
import time

if __name__ == '__main__':
    with nidaqmx.Task() as task:
        task.ao_channels.add_ao_voltage_chan('Dev1/ao0')
        # Finite generation: 7 samples at a 100 kHz sample clock
        task.timing.cfg_samp_clk_timing(
            100e3, sample_mode=AcquisitionType.FINITE, samps_per_chan=7)
        samples = [0, 1, 0, 1, 0, 1, 0]
        t0 = time.time()
        task.write(samples, auto_start=True)
        t1 = time.time()
        task.wait_until_done()
        t2 = time.time()
        task.stop()
        print('Write command took', np.round(1e3 * (t1 - t0)), 'msec')
        print('Write + wait commands took', np.round(1e3 * (t2 - t0)), 'msec')
The result of this code is that the task.write command alone takes t1-t0 = 10 msec, whereas together with task.wait_until_done it takes t2-t0 = 60 msec. The numbers, as I said, are the same for the C# .NET program. What I am trying to understand is what exactly I am timing. Obviously t0 is where the timing should begin. As for t1 and t2, neither of them marks the exact point in time I am looking for. At t1 Python has finished the task.write command, but that is not the time at which the DAQ started playing the waveform (they differ essentially by the latency). On the other hand, t2 is the time at which Python is notified by the DAQ that the latter is done, which means it also includes the time it takes the DAQ to communicate this information back to the computer. Therefore, the latency should be somewhere between these two values (10 msec and 60 msec). The question is: is it closer to 10 msec or to 60 msec? That probably depends on the implementation of the nidaqmx library, which I don't know.
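One more sanity check on these numbers: the waveform itself is far too short to matter here, since 7 samples at 100 kHz last only 0.07 msec. So essentially the whole 10-60 msec window is USB/driver overhead rather than playback time, and t2-t0 gives only an upper bound on the start latency. A quick back-of-the-envelope calculation using the values quoted above:

```python
# Values measured above (milliseconds)
t_write = 10.0            # task.write() alone (t1 - t0)
t_write_plus_wait = 60.0  # write + wait_until_done() (t2 - t0)

# Duration of the waveform itself: 7 samples at a 100 kHz sample clock
waveform_ms = 7 / 100e3 * 1e3  # = 0.07 ms, negligible

# t2 - t0 contains the start latency, the (tiny) playback time, and the
# done-notification round trip, so it only bounds the latency from above:
upper_bound = t_write_plus_wait - waveform_ms
print(f"waveform duration: {waveform_ms:.2f} ms")
print(f"start-latency upper bound: {upper_bound:.2f} ms")
```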
Any help/corrections/information would be appreciated.
Thanks in advance,
Eran
08-01-2022 01:09 PM
In general, the latency is not guaranteed and is typically not measurable on Windows as the software execution is non-deterministic.
This is a similar topic, and the answer would be the same - please refer to https://forums.ni.com/t5/LabVIEW/How-Can-I-tell-what-is-the-latency-of-NI-USB-9263-using-python/m-p/...
08-01-2022 01:39 PM - edited 08-01-2022 01:57 PM
Hi,
Thanks for your reply!
While I do acknowledge that this is software, isn't there still something we can say about the timing? That is, suppose we can accept variations (of <10 msec? more?) in the starting time of the waveform, and that we can tolerate some, let's call them "failed attempts", when Windows locks up for a few seconds (we run at a low repetition rate). Could we then say something about the average latency (and measure it, if possible)? And about this software jitter?
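For instance, one way I imagine measuring the software-side average and jitter is to repeat the write many times with time.perf_counter (which has much finer resolution than time.time on Windows) and look at the resulting distribution. A rough sketch of the bookkeeping, with the actual DAQ write stubbed out by a placeholder sleep so the statistics part is self-contained:

```python
import time
import numpy as np

def timed_repeats(fn, n_reps=200):
    """Call fn() n_reps times and return per-call durations in milliseconds."""
    durations = np.empty(n_reps)
    for i in range(n_reps):
        t0 = time.perf_counter()
        fn()  # in the real test this would be task.write(samples, auto_start=True)
        t1 = time.perf_counter()
        durations[i] = 1e3 * (t1 - t0)
    return durations

# Placeholder workload standing in for the DAQ write:
d = timed_repeats(lambda: time.sleep(0.001))
print(f"mean = {d.mean():.2f} ms, std (jitter) = {d.std():.2f} ms, "
      f"95th percentile = {np.percentile(d, 95):.2f} ms")
```

The mean of the distribution would estimate the average software overhead of the call, and the standard deviation (or a high percentile) would characterize the jitter.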
Thanks again,
Eran