
Synchronization of encoder and camera

Solved!

Hi,

 

I am Abbishek from Auburn University. I have been using LabVIEW to synchronize encoder acquisition with a pulse train sent to a camera, using a USB-6218 DAQ device: a counter input channel acquires the encoder signal, and a counter output channel generates the pulse train. The encoder data should be acquired at the same frequency as the pulse train sent to the camera. However, in my program there seems to be a delay between the encoder acquisition and the pulse train generation.

 

Can somebody please help me out with this?

 

Thanks

 

Abbishek

Message 1 of 10

It's pretty tough to troubleshoot code you didn't post.  Please try again.  🤔  (I for one would appreciate "Save for Previous Version..." back to 2016 or earlier).

 

Very generally, this sounds pretty solvable with the right combo of hardware signal sharing and appropriate dataflow in your programming.  Both are needed to get your tasks properly sync'ed.

 

You should probably make sure the encoder task starts before you generate the pulse train that will be shared by both your encoder task and the external camera.  You also need the encoder task configured to use that pulse train as its sample clock.
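LabVIEW block diagrams are graphical and can't be pasted here as text, so here is a rough sketch of that start order using the nidaqmx Python API (the text counterpart of the DAQmx VIs).  The device name `Dev1`, counters `ctr0`/`ctr1`, the 100 Hz rate, and the `/Dev1/Ctr1InternalOutput` terminal are all assumptions; adapt them to your USB-6218 setup:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, AngleUnits

# Assumed names: device "Dev1", encoder on ctr0, pulse train on ctr1.
# ctr1's internal output terminal is used as the encoder sample clock
# and is the same signal routed out to the camera.
PULSE_CLOCK = "/Dev1/Ctr1InternalOutput"
FRAME_RATE = 100.0  # Hz, assumed camera frame rate

with nidaqmx.Task() as encoder, nidaqmx.Task() as pulses:
    # Encoder task: hardware-clocked by the shared pulse train,
    # NOT software-timed, so samples line up with camera frames.
    encoder.ci_channels.add_ci_ang_encoder_chan(
        "Dev1/ctr0", units=AngleUnits.DEGREES)
    encoder.timing.cfg_samp_clk_timing(
        rate=FRAME_RATE, source=PULSE_CLOCK,
        sample_mode=AcquisitionType.CONTINUOUS)

    # Pulse-train task at the camera frame rate.
    pulses.co_channels.add_co_pulse_chan_freq(
        "Dev1/ctr1", freq=FRAME_RATE, duty_cycle=0.5)
    pulses.timing.cfg_implicit_timing(
        sample_mode=AcquisitionType.CONTINUOUS)

    encoder.start()  # start FIRST: armed and waiting on its clock
    pulses.start()   # every pulse now clocks camera and encoder together

    angles = encoder.read(number_of_samples_per_channel=10)
```

In LabVIEW the same ordering is enforced with dataflow: wire the error cluster so that DAQmx Start on the encoder task executes before DAQmx Start on the counter-output task.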

 

Post the code (LV 2016 or earlier, please) & I'm pretty sure I can help.

 

 

-Kevin P

Message 2 of 10

Hi Kevin,

 

Sorry that I did not upload the code before. Please find the code attached; it has been saved in LabVIEW 2012. I think I made a mistake with the case structure and the timing which I am unable to figure out.

 

Thank you for your help. Please let me know if you need any additional information.

 

Abbishek

Message 3 of 10
Solution
Accepted by Abbishek94

I made a couple minimal changes, added brief comments about them, and back-saved to LV 2012.  See attached.   Ask questions if you need further explanation.

 

 

-Kevin P

Message 4 of 10

Hi Kevin,

 

Thank you very much for the help. With these modifications, the data points from the encoder and the images from the camera are matching. However, there is one remaining problem. I conducted an experiment in which a rod, initially at rest, impulsively started rotating, and I recorded both the camera images and the encoder data. The encoder seemed to start registering the angle about 2-3 data points later than what was seen from the camera. Can this issue be fixed in the code?

 

Thank You

 

Abbishek

Message 5 of 10

The code I posted sequences the tasks so that the encoder task is started *before* generating the pulse train that both the camera and encoder task use for capture / sampling.  So I don't suspect that the code is the problem here.

 

Sometimes such a time delay is due to mechanical backlash or compliance. A sloppy gear mesh or a flexible coupling can produce those symptoms.

 

 

-Kevin P

Message 6 of 10

Hi,

 

Is it possible to start the encoder acquisition at the exact time when the rising edge of the pulse is sent to the camera each time?

 

Abbishek

Message 7 of 10

The code I posted should already do exactly that.  The intent is that the same pulse train signal is sent to the camera and is also used as a sample clock by the encoder task.  Because the encoder task is started before the pulse train is generated, the first sample should be taken on the first pulse.  I presume the camera is similarly ready to take its first image capture on the first pulse.

 

If that doesn't seem to be happening, describe your observations in detail and post the code you're using.  (Note that it's possible the encoder data won't start at 0 if there is motion present before you run the VI.  The encoder task will be actively running and tracking position for a few brief milliseconds before the first sample is taken.)

 

 

-Kevin P

Message 8 of 10

Hi,

 

It seems the camera takes the image at the falling edge of the TTL pulse. I am using the same program you uploaded last week; the camera expects an inverted TTL signal and therefore captures the image on the falling edge of the pulse. Is it possible to send an inverted TTL pulse and acquire the encoder data at that instant?

 

My other question was whether the encoder acquires data each time there is a rising edge of the pulse.

 

Thank You

 

Abbishek

Message 9 of 10

In the encoder task config, the call to DAQmx Timing has an input where you can specify the 'active edge'.  Right-click, create a constant, and set it to "Falling".  Then you'll take encoder samples on the falling edge of the pulse, simultaneous with the camera frame capture.
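For reference, the same setting in text form via the nidaqmx Python API is the `active_edge` input of `cfg_samp_clk_timing`.  This is only a sketch: the device name, counter, pulse rate, and clock terminal below are assumptions, not values from the attached VI:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, AngleUnits, Edge

with nidaqmx.Task() as encoder:
    encoder.ci_channels.add_ci_ang_encoder_chan(
        "Dev1/ctr0", units=AngleUnits.DEGREES)
    # active_edge is the Python counterpart of the DAQmx Timing VI's
    # 'active edge' input; Edge.FALLING samples the encoder on falling
    # edges, matching a camera that captures on the inverted-TTL edge.
    encoder.timing.cfg_samp_clk_timing(
        rate=100.0,                          # assumed pulse rate
        source="/Dev1/Ctr1InternalOutput",   # assumed clock terminal
        active_edge=Edge.FALLING,
        sample_mode=AcquisitionType.CONTINUOUS)
```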

 

 

-Kevin P

Message 10 of 10