Multifunction DAQ


USB-6009 sample timing

I am trying to use a USB-6009 to determine the response time of an LCD monitor. To do this I have a MATLAB script that toggles the screen between black and white, and I note the time at which the flip command on the GPU completes. At the same time I have a photodiode attached to a 6009 analog input, pressed against the screen. I do continuous background sampling at 1 kHz for the 20-second trial, and so far all is good. But when I try to synchronize the two time series, flip time and sample time, I run into trouble. I get an absolute system time from the MATLAB Data Acquisition Toolbox for the first sample (TriggerTime), but this time seems off. What I have noticed is that it seems to get better if I lower the threshold for calling the callback function that receives and stores my data.
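
For reference, the setup looks roughly like this (session-based Data Acquisition Toolbox; 'Dev1'/'ai0' are placeholders for my actual device, and storeSamples stands in for my logging function):

    s = daq.createSession('ni');
    addAnalogInputChannel(s, 'Dev1', 'ai0', 'Voltage');  % photodiode input
    s.Rate = 1000;                                       % 1 kHz sampling
    s.IsContinuous = true;
    s.NotifyWhenDataAvailableExceeds = 100;              % callback threshold n
    lh = addlistener(s, 'DataAvailable', @(src, evt) ...
        storeSamples(evt.TriggerTime, evt.TimeStamps, evt.Data));
    startBackground(s);                                  % 20 s trial runs here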

 

So my first question is how accurate this TriggerTime actually is. Is it set on the computer, and not in the DAQ itself? And if that is the case, how is one supposed to sync two time series like I am trying to do?

Message 1 of 7

Hello RuneL,

 

The time is not stored in the DAQ device. When the acquisition starts, the driver takes a system time from the CPU at the task control/start command; that becomes t0, and then you have dt, which is determined by your sample rate. The time your system reports can be close to the real sampling time, but it is not deterministic, since USB communication is non-deterministic; with PCI/PCIe or PXI/PXIe it would be more reliable.

 

If you really want the two to be synchronized, you should use shared clocks, or trigger the analog input on a digital signal when you switch from black to white on your monitor, for example. Then the start time you get might still be off, but it won't matter, because your finite acquisition started at the moment the GPU switched from black to white. How exactly you would generate that signal, I am not sure.
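
As a rough sketch only (session-based MATLAB interface; 'Dev1' and the terminal name are assumptions, though the USB-6009 does expose PFI0 as a digital start-trigger input, and I have not tried this on a 6009 myself):

    s = daq.createSession('ni');
    addAnalogInputChannel(s, 'Dev1', 'ai0', 'Voltage');
    s.Rate = 1000;
    s.DurationInSeconds = 20;
    % Arm the acquisition to start on a digital edge at PFI0; route the
    % signal that marks the black-to-white switch into that terminal.
    addTriggerConnection(s, 'external', 'Dev1/PFI0', 'StartTrigger');
    data = startForeground(s);  % sample 1 is then aligned with the edge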

 

 

Best Regards
Jonas Mäki
Systems Developer
Novator Solutions
Message 2 of 7

Another interesting question would be: how much does it seem to be off? I would expect the communication to add a couple of ms, depending on the CPU load.

Best Regards
Jonas Mäki
Systems Developer
Novator Solutions
Message 3 of 7

So I read about how the time is set by the driver when the data is read from the DAQ, and that it is adjusted by dt and the number of samples. But my question then is whether this calculated time somehow corrects for the time it takes to transmit the data across the USB connection. And what average offset of the sample time should I expect with a USB DAQ: 1 ms, 10 ms, 20 ms?
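
My understanding (an assumption on my part, not confirmed anywhere) is that the k-th sample is simply stamped t0 + k*dt, with no term for the USB transit:

    rate = 1000;                                 % Hz
    dt   = 1 / rate;                             % seconds between samples
    % TriggerTime is a datenum in days, so convert dt before adding:
    tk   = t0 + ((0:nSamples-1) * dt) / 86400;   % no USB-latency correction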

 

As I wrote in my initial post, I am using MATLAB and communicating with the DAQ through their Data Acquisition Toolbox, so I don't know how they call the DAQmx driver. But the strange thing is that the length of the delay seems to depend on how often the toolbox reads data from the DAQ. In my application I sample at 1 kHz, and I set up a callback function that gets called when n samples are ready. The delay seems to depend on the value of n: smaller values of n give me a shorter delay, and larger values a larger delay. For example, with n = 100 I get delays on the order of 50-60 ms; with n = 10 the delay SEEMS to be down to 5-10 ms. These values are hard to verify, but I have used a CRT screen to get an idea, since that kind of screen has very little input lag. Now, n = 10 seems to be a lower limit for my system; any lower and the overhead of running the callback slows other things down (screen rendering), so going lower is not really an option.
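
Roughly, my check reconstructs absolute sample times from TriggerTime and looks for the photodiode edge (variable names here are just illustrative; v is the photodiode vector and tFlip the datenum of the flip):

    t0       = evt.TriggerTime;                      % datenum, in days
    rate     = 1000;
    tSamples = t0 + ((0:numel(v)-1).' / rate) / 86400;
    edgeIdx  = find(v > 0.5 * max(v), 1, 'first');   % first bright sample
    delay_ms = (tSamples(edgeIdx) - tFlip) * 86400 * 1000;  % apparent lag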

 

Basically, a couple of ms of offset is not a big deal, but more than 5 ms starts to hurt...

 

 

Message 4 of 7

The read function doesn't actually have anything to do with the t0 time stored by the driver on the computer. The different times you see are simply due to the fact that it takes that amount of time to sample the data. For example, at a sampling rate of 1 kS/s, reading 1,000 samples gives you a one-second wait in the read function.

 

t0 is set when you send the start-task command to the DAQmx driver. And not exactly when you call it, either: it is stored when the DAQ responds that the acquisition did start. In a test I did with a USB device, the total time from calling start task to the value of t0 was about 15-20 ms, and I didn't try to overload my USB bus. Somewhere in there is your time delay. I'm not sure exactly where in that gap the time is collected and stored, or how long it takes from the actual start of the acquisition until the driver gets the response and stores the value.

 

Exactly how MathWorks uses our DAQmx driver I cannot say; I don't know. But they have to do the configuration, start the task, and then read it. The time is still taken in the driver, so it shouldn't be a problem.

 

 

Best Regards
Jonas Mäki
Systems Developer
Novator Solutions
Message 5 of 7

I am not sure you understood me correctly: the delay I am talking about is not a delay in the read function itself. It is a delay in the timestamp returned as the time the first sample was taken, what you refer to as t0. So the t0 time is not correct, and the length of the delay changes with the number of samples I mentioned. The callback I set up in MATLAB is called every time n samples are available from the DAQ, and in addition to the sample data the callback also receives the TriggerTime, or t0 as you call it, which is supposed to be the absolute time when the first sample was taken. But this time is off.

 

But without knowing how MathWorks calls the DAQmx driver, I understand that it is difficult for you to tell me anything more specific. I have put the question to the MathWorks forum, but still no replies...

 

But thanks anyway for the info.

 

br

Rune

Message 6 of 7

Hi again Rune!

 

I found some information about the timestamp and its accuracy. It is actually taken in the driver after the read command. You see different times because of the delay between your start and your read: if that delay is longer than the actual read, you will get larger offsets. Also, checking the samples available and then reading them out is not the best approach for getting the most accurate value. Call the read function as soon as possible after you have started the task; the read function waits efficiently until all the samples are read. Use the "DAQRead1ChanNSampWfm" or "DAQReadNChanNSamp1DWfm" function for best results, and configure the task to read #Samples at the specified rate before you do your start task.

http://digital.ni.com/public.nsf/allkb/5D42CCB17A70A06686256DBA007C5EEA?OpenDocument
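
I don't know the MathWorks toolbox in detail, but the equivalent of that advice in MATLAB would presumably be something like this sketch (the function names above are LabVIEW DAQmx Read instances; 'Dev1'/'ai0' are placeholders): configure a finite acquisition up front and block on a single foreground read, rather than polling for available samples.

    s = daq.createSession('ni');
    addAnalogInputChannel(s, 'Dev1', 'ai0', 'Voltage');
    s.Rate = 1000;
    s.NumberOfScans = 20000;   % 20 s at 1 kHz, configured before starting
    % startForeground blocks until all samples are in and also returns
    % the trigger time, so there is no separate poll-then-read step:
    [data, timestamps, triggerTime] = startForeground(s);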

Best Regards
Jonas Mäki
Systems Developer
Novator Solutions
Message 7 of 7