Using Notifiers to simultaneously begin Start VIs of DAQ and IMAQ (ms accuracy?)

Solved!

I know there are hardware solutions (RTSI-based and trigger-based) to synchronize the start of image and analog acquisition.

 

For reasons, I cannot use a hardware solution, so now I would like to find a software solution to start the two acquisitions simultaneously.

I want to rely on the fact that both the camera + frame grabber (PCIe-1433) and the DAQ card (PCI-6251) have microsecond (or better) timing accuracy.

So if I can start the two acquisitions simultaneously, I can assume that, with millisecond accuracy (limited by the Windows OS), the images (100 Hz) and the DAQ signal (50 kHz) will be synchronized until the end of the acquisition.

 

(A) Am I right?

 

(B) Part 2 of my question: what's the best method to software-trigger the two acquisitions?

 

1. Use the Merge Errors VI and wire the error output to both Start VIs (IMAQ and DAQ).

 

2. Use Notifiers: generate a notifier and wire "Wait on Notification" to both Start VIs (a rough sketch of this idea follows below).

 

Which is better?
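
Since LabVIEW code is graphical, here is a rough text-based analogy (Python) of the notifier idea in option 2: one "notification" releases two waiting workers, which then issue their respective Start calls. The names (go, start_worker) and the placeholder start functions are illustrative only; this sketches the release mechanism, not the actual driver calls.

import threading
import time

go = threading.Event()              # plays the role of the notifier

def start_worker(name, start_fn):
    go.wait()                       # "Wait on Notification"
    t = time.perf_counter()
    start_fn()                      # would be the IMAQ or DAQmx start call
    print(f"{name} start issued at t = {t:.6f} s")

# Placeholder start functions; real code would call the driver here.
workers = [
    threading.Thread(target=start_worker, args=("IMAQ", lambda: None)),
    threading.Thread(target=start_worker, args=("DAQ", lambda: None)),
]
for w in workers:
    w.start()

go.set()                            # "Send Notification": both workers proceed
for w in workers:
    w.join()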

 

(C) Is there a better way than either of these?

 

Any and all help appreciated.

 

thanks

gvstemp

 

PS: The only reason I am posting in two forum sections is to get a wider audience.

Thanks.

 

Message 1 of 9

A: No.

 

The only way to 'really' synchronize a data acquisition and an image acquisition is via hardware.

 

 

 

Christian

Message 2 of 9

To build on Christian's answer, just synchronizing the start of those calls will not give you any determinism for when the functions complete.  The VIs take different amounts of time and that time will vary based on when the OS decides to run them.  You'd be lucky to get ±10ms between the two, and you wouldn't be able to tell which was ahead.  Additionally, since they won't have shared clocks they'll drift relative to each other.

 

You really need some manner of shared start trigger at minimum.
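
For the DAQ side, a shared start trigger is a single configuration call. Below is a minimal sketch using the nidaqmx Python API (LabVIEW has the equivalent DAQmx Trigger property/VI); the device name Dev1 and the terminal /Dev1/PFI0 are placeholders, and routing the frame grabber or a common pulse source to that terminal is separate wiring not shown here.

import nidaqmx
from nidaqmx.constants import AcquisitionType

task = nidaqmx.Task()
task.ai_channels.add_ai_voltage_chan("Dev1/ai0")       # placeholder device/channel
task.timing.cfg_samp_clk_timing(50_000, sample_mode=AcquisitionType.CONTINUOUS)

# Arm the acquisition to begin on a digital edge that the camera/frame
# grabber (or a shared pulse source) also receives.
task.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/PFI0")

task.start()    # returns immediately; sampling begins when the edge arrives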

Seth B.
Principal Test Engineer | National Instruments
Certified LabVIEW Architect
Certified TestStand Architect
Message 3 of 9

Interesting!

 

I thought that inherently the DAQ and IMAQ cards have MHz accuracy. The "Read" VIs (for both IMAQ and DAQ) just read the buffers that are filled by the acquisition. So as long as I collect ALL the buffered data (without skipping frames and without overwriting the DAQ data buffers), I can safely assume that the deltaT between analog signal data points is exactly 1/acquisition rate and the deltaT between image frames is exactly 1/frame rate.

 

If that is true, then the deltaT of both the images and the DAQ data does not depend on how quickly or slowly I "read" the buffers (image and DAQ buffers).

 

Now the problem should lie only in the synchronisation of the Start VIs.

And I was fairly sure (from all my reading on Notifiers) that with notifiers I can start two processes within the operating system's timing accuracy (+/- 1 ms).

 

What am I missing?

 

Any and all help is appreciated.

 

And thanks for taking the time to reply.

Message 4 of 9

The difference is that the code that starts the processes is different between the two drivers and will take an unequal (and indeterminate) amount of time between the beginning of the function call to start the device and the completion of that call (when the device actually starts).  This difference is on the order of several milliseconds and will fluctuate with each run based on what the system is doing at the time.

 

Essentially, even with notifiers, you will have a several-millisecond offset between the IMAQ frames and the DAQ samples, and there won't be a reliable way to determine which is ahead or behind.  You can reduce the amount of time it takes to start the NI-DAQmx task by using DAQmx Control Task to commit the task before starting, though I don't know if there is a similar function for IMAQ.
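
To illustrate the commit step, here is a minimal sketch with the nidaqmx Python API (the same sequence exists as DAQmx VIs in LabVIEW); the device name, channel, and rate are placeholders.

import time
import nidaqmx
from nidaqmx.constants import AcquisitionType, TaskMode

task = nidaqmx.Task()
task.ai_channels.add_ai_voltage_chan("Dev1/ai0")       # placeholder device/channel
task.timing.cfg_samp_clk_timing(10_000, sample_mode=AcquisitionType.CONTINUOUS)

# Committing reserves and programs the hardware up front, so the later
# start call has much less work left to do.
task.control(TaskMode.TASK_COMMIT)

# ... later, when the software "notification" fires ...
t0 = time.perf_counter()
task.start()
print(f"start returned after {(time.perf_counter() - t0) * 1e3:.2f} ms")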

Seth B.
Principal Test Engineer | National Instruments
Certified LabVIEW Architect
Certified TestStand Architect
Message 5 of 9

gvstemp,

 

Accuracy and frequency are two very different things.  Your term "MHz accuracy" is meaningless.  The specified timebase stability of the PCI-6251 is 50 ppm.  The PCIe-1433 does not appear to have a specification for timebase accuracy.

 

Let's look at a simple example.  Suppose you have two separate PCI-6251s running acquisitions at 1 MS/s.  Let one of them be 50 ppm fast and the other 50 ppm slow.  After one second the fast one will have taken 1,000,050 samples while the slow one only got 999,950 samples.  So even if you started both at exactly the same time (via a common trigger for example), the sampling is NOT synchronized after the first samples.
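
The same arithmetic in a few lines of Python, extended to a few acquisition lengths:

# Restating the 50 ppm example for several elapsed times.
rate = 1_000_000                    # nominal sample rate, S/s
ppm = 50e-6                         # timebase error, 50 ppm

for t in (1, 10, 50):               # elapsed seconds
    fast = rate * (1 + ppm) * t     # board running 50 ppm fast
    slow = rate * (1 - ppm) * t     # board running 50 ppm slow
    print(f"after {t:3d} s: fast {fast:,.0f} samples, "
          f"slow {slow:,.0f} samples, gap {fast - slow:,.0f}")
# After 1 s the two boards already disagree by 100 samples (100 us at 1 MS/s).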

 

If you want truly synchronized sampling, you must use a common clock.

 

Lynn

Message 6 of 9

Again, very informative. Thanks to both Seth and Johnsold for the answers.

I am not trying to be a pest but trying to understand this better, so please bear with me in this "what-if" scenario.

 

I agree with Seth that the two Start Task VIs cannot be synchronized (there can be an indeterminate error of tens of ms), and I also understand that 50 ppm means that even if I were running only one PCI-6251, I would accumulate an error of +/- 50 samples every second (at 1 MS/s).

 

Now what if:

I set up the DAQ task and use "DAQmx Control Task.vi" to "commit" the hardware resources on the DAQ. Now I start the IMAQ first and let it keep taking images. After some time (say a few seconds) I do two things:

 

1. Read the current "framecount" using an IMAQ property node.

2. Send a notifier to the DAQ Start VI.

 

I think (and most probably I am wrong here) that in this scenario the first point of the DAQ acquisition is within a millisecond offset of the Nth frame (as given by framecount).

From then on, I record images at 100 Hz (frames per second) and DAQ data at 10 kS/s (not MS/s).

My total acquisition time is 50 seconds.

 

Do you think I still cannot assume that the images and data points are synchronized?

Actually, after thinking a little more, I think there will STILL be an indeterminate offset due to the DAQ hardware start (even after committing resources), but that will be a fixed offset and NOT a continuously varying jitter in the alignment of timestamps.

 

In my application I can live with an offset (even if it is indeterminate) but not with a time-varying jitter.

 

Am I totally off here?

 

thanks again for your time.
Message 7 of 9
Solution
Accepted by topic author gvstemp

The offset will remain throughout as you have thought.

 

At 100 frames per second and 10,000 DAQ samples per second you will probably not be able to notice jitter.  However, you may see some frames which have 101 or 99 DAQ samples rather than 100.  This is not jitter but simply due to the fact that the two clocks are independent.  One approach which would allow synchronization at analysis time is to record a frame sync pulse on an additional DAQ channel.  If the camera (or the frame grabber) puts out some kind of once-per-frame signal (like the vertical sync pulse of analog video), acquire that on a DAQ channel. Then, when analyzing the data, you can start the analysis of each frame at the edge of the sync pulse.  That will always be synchronized with the other DAQ data to within 0.1 ms for your 10 kS/s acquisition.
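
A minimal post-processing sketch of that idea, assuming the frame sync pulse was recorded on a second DAQ channel alongside the signal of interest; the file name, channel layout, and threshold below are assumptions for illustration.

import numpy as np

data = np.load("acquisition.npy")     # assumed shape: (2, n_samples)
signal, frame_sync = data[0], data[1]

# Rising edges of the per-frame sync pulse mark the start of each frame,
# good to within one DAQ sample (0.1 ms at 10 kS/s).
threshold = 2.5                       # volts, assuming a TTL-like pulse
high = frame_sync > threshold
edges = np.flatnonzero(~high[:-1] & high[1:]) + 1

# Slice the analog record frame by frame; segments may hold 99-101 samples
# because the camera and DAQ clocks are independent.
frames = [signal[a:b] for a, b in zip(edges[:-1], edges[1:])]
print(f"{len(frames)} frames, first frame has {len(frames[0])} samples")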

 

Lynn

Message 8 of 9

Thanks a ton for the input Lynn.

gvstemp

Message 9 of 9