Machine Vision


IMAQ triggering to measure pulse light

Hello everyone,

First of all, thanks for all the help I have already gotten from this forum by reading here 'anonymously'. Unfortunately, this time I have a problem I could not solve by myself. The program uses a state machine, but I rearranged it for the attached snippet to make it easier to read and understand.

 

I have a GL2018R camera from Sensors Unlimited, an NI PCIe-1433 frame grabber, and an NI PCIe-6353 DAQ card. The camera has a 1D single-line sensor and can acquire up to 147,000 lines per second (147 klps). The light source generates 1 ns pulses at a rate of 10 kHz (adjustable up to 20 kHz). I basically just want to trigger the camera to record one image/spectrum per trigger. At the moment I am able to trigger, record, and read out the spectra, but some strange things are happening:

For example, the spectral intensity depends on the trigger settings (timing, exposure time, width, etc.), which it should not, because the light is only present for 1 ns. It is very unlikely/impossible to integrate more or less of this pulse and thereby change its intensity. The spectrum should either be there (when the triggering works) or not (when the camera is triggered after the light is gone).
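For reference, the numbers quoted above can be sanity-checked quickly. A minimal Python sketch, using only the figures from this post:

```python
# Timing sanity check for the setup described above:
# camera max line rate 147 klps, laser pulsing at 10 kHz.
camera_max_line_rate = 147_000   # lines per second (GL2018R spec, from the post)
laser_rate = 10_000              # pulses per second

min_line_period_us = 1e6 / camera_max_line_rate   # shortest time the camera needs per line
pulse_period_us = 1e6 / laser_rate                # time between laser pulses

# The camera can keep up with one line per pulse as long as the pulse
# period is longer than the camera's minimum line period.
print(f"min line period: {min_line_period_us:.1f} us")   # ~6.8 us
print(f"pulse period:    {pulse_period_us:.1f} us")      # 100.0 us
print("camera keeps up:", pulse_period_us > min_line_period_us)
```

So the camera is an order of magnitude faster than the pulse rate, and throughput itself should not be the limiting factor here.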

 

Recording the signal:

First I recorded and read out the spectra continuously, but the readout sometimes could not keep up (losing frames), so I am now recording a finite number of spectra, where each spectrum is written to a new buffer number (within a ring buffer). Afterwards the buffer is read out, and a new acquisition is started when desired, as you can see in the attached snippet.
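The buffer-number bookkeeping described above can be sketched in plain Python (the function name is mine, not part of the IMAQ API): the acquisition hands out an ever-increasing cumulative buffer number, and the slot a spectrum actually lands in is that number modulo the ring size.

```python
def ring_slot(cumulative_buffer_number: int, ring_size: int) -> int:
    """Map an ever-increasing buffer number to a slot in the ring."""
    return cumulative_buffer_number % ring_size

# With a ring of 100 buffers, spectrum 0 and spectrum 100 share slot 0.
# The readout therefore must never fall more than ring_size buffers behind
# the acquisition, or frames get overwritten (the lost-frames symptom above).
print(ring_slot(0, 100), ring_slot(100, 100), ring_slot(101, 100))  # 0 0 1
```

This is why a finite acquisition into the ring, followed by a separate readout pass, avoids the frame loss seen in the continuous case.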

 

Triggering the acquisition:

A continuous output from the DAQ board triggers the light source at 10 kHz, using ctr0. On a second channel (ctr1) a second trigger signal is generated with the same frequency, set to finite output. The two signals are themselves triggered by ctr3 of the board to make sure they are synchronous. The width and delay can be chosen independently. I checked the outputs on an oscilloscope, and they work just fine and as expected.

 

Problem:

I do not know what the problem is at the moment. Although the triggering works, meaning I get one spectrum for every incoming pulse (and if I do not provide the trigger, no acquisition takes place, leading to a timeout error), the intensities depend on the trigger timing. The integration time has a big influence too, which does not match my expectations. Also, the first acquired spectrum looks normal, the second is just flat with varying intensity, and after that the intensity alternates: every other spectrum has a slightly lower intensity.

In the end it looks like it is not triggering correctly somehow.

I read somewhere that one has to be careful not to trigger both the camera AND the frame grabber. I do not think I am doing that. I made the settings in MAX and also by sending serial commands while disabling access to the MAX settings - with basically the same result. If the light pulse arrives within the integration time of the camera, there should not be a dependence on the exact trigger timing, the integration time, and so on...?!

 

If I choose a long integration time and adjust the light pulse so it sits in the center of the integration window, the spectra all look the same, which is good (except for spectra 1 & 2). Still, I cannot explain the strange behavior mentioned above and thus do not trust the results.

 

Please let me know if you need any additional information.

I would appreciate any help and hints you can give me.

 

Thanks!

Message 1 of 8
(3,214 Views)

Hi Flumen,

 

In general, you do want to be careful to trigger only your camera or the frame grabber, but not both. Your code shows that you are triggering the frame grabber, so I assume your camera is set to free-run mode? Does your camera have an external trigger input? You may want to try triggering the camera directly and let the frame grabber run in free-run mode (by removing the triggering part of your code). In some cases this can improve performance when tight synchronization is needed.

 

A few questions:

 

I know you said you verified that the two counter outputs are in sync, but why are you using two separate counters at all, rather than just splitting the one trigger signal? Is it so that you can offset the triggers by some amount of time?

 

When you say you are setting the integration time, are you referring to the exposure time of the camera?

 

Can you attach some example images so that we can see the behavior you are describing?

 

-Jordan Calvert

 

 

Message 2 of 8

If you are triggering the frame grabber, that is most likely the problem. If the camera is in free-run, you are just getting the last picture the camera took when you trigger the frame grabber. You need to trigger the camera directly so that the exposure and timing are all correct.

 

Bruce

Bruce Ammons
Ammons Engineering
Message 3 of 8

Hi Jordan and Bruce,

Thanks for your explanations and suggestions.

 

Whether to trigger the camera or the frame grabber is exactly the thing that is not totally clear to me. You might be right that I am triggering both at the moment. My goal was to trigger ONLY the camera.

The camera has an external trigger input, which is part of the Camera Link cable (CC0). I know how to set the camera trigger mode, which can be done in MAX or by sending serial commands directly from the VI. However, two things are not clear to me:

 

1) I am not sure what happens when I use "IMAQ Configure Trigger3.vi". According to your answer, that sets up the trigger for the frame grabber. But if I leave it out, how does the frame grabber know that I want to trigger each buffer of the ring acquisition to achieve a FAST readout?

When the trigger VI is not used, is the trigger connected to the frame grabber's external trigger I/O automatically rerouted to the camera's CC0 line? How does the frame grabber then know when to grab the spectrum and which buffer to use?

 

2) In MAX there is another trigger setup on the right side (see attached picture). I am not sure whether it belongs to the frame grabber or the camera. I checked and noticed that the value is written to the camera if I change and save it. The camera file says:

   ControlLinesSource {
      UseDefaultSource (No)
      CCSourceLine0 (External, 0)
      CCSourceLine1 (None, 1)
      [...and so on, as in the MAX screenshot]
 

Answers to your questions:

I know you said that you verified that the two counter outputs are in sync, but why are you using 2 separate counters at all rather than just split the one trigger? Is it so that you can offset the triggers by some amount of time?

    Yes, that is correct. I am actually using three counters: the first one as a reference (ctr0), while the trigger for the laser and the trigger for the acquisition each have their own counter. I start them on the ctr0 digital edge so they are synchronized. Using three counters, I can delay either of the two triggers (this was the only way I found, because a negative delay was not possible). I am sure there is a better/simpler way to do that.
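The three-counter workaround can be illustrated with a small calculation (a sketch with hypothetical names, not DAQmx code): to offset trigger B by dt relative to trigger A, where dt may be negative, give both counters a positive initial delay from the common ctr0 start edge and put the difference into whichever trigger should fire later.

```python
# Sketch of the "no negative delay" workaround described above.
# Both counters start on the same ctr0 edge; only the initial delays differ.
def initial_delays(dt_us: float, base_delay_us: float = 10.0):
    """Return (delay_A, delay_B) in microseconds, both non-negative,
    such that B fires dt_us after A (dt_us may be negative)."""
    delay_a = base_delay_us + max(0.0, -dt_us)  # A waits longer if B should lead
    delay_b = base_delay_us + max(0.0,  dt_us)  # B waits longer if A should lead
    return delay_a, delay_b

print(initial_delays(+3.0))  # (10.0, 13.0): B fires 3 us after A
print(initial_delays(-3.0))  # (13.0, 10.0): B fires 3 us before A
```

With this scheme, a single shared reference edge plus per-counter initial delays gives an effectively signed offset, which matches what the three-counter setup achieves.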

When you say you are setting the integration time, are you referring to the exposure time of the camera?

  Yes, that is what I meant.

Can you attach some example images so that we can see the behavior you are describing?

Not right now, unfortunately, but I can do that next time I can test it.

 

Thanks!

Message 4 of 8

I agree that it seems you are triggering both the camera and the frame grabber. To trigger the camera directly, all you have to do is set the trigger mode in MAX and route the external signal from line 0 to CC1 (like you have it). You shouldn't need to configure any triggering in LabVIEW as well. For example, if you did a normal Grab in LabVIEW, the frame grabber would just wait for lines to come in, and your camera would send a single line for every trigger received.

 

To read an entire image at once in LabVIEW, you should just need to set the height of the image (the number of lines to acquire) in MAX.

 

-Jordan Calvert

Message 5 of 8

OK, thanks. That is the setting I have right now. It works when I set a long exposure time and arrange the timing so that the pulse arrives somewhere in the middle of it.

I have not yet tried again what happens if I move the pulse towards the edges or shorten the exposure time - I will probably do that today.

 

Since the camera has a single line, I will basically write one spectrum per incoming laser pulse into the ring buffer. I set up the buffer in LabVIEW, but if I do not call the trigger VI, how does the frame grabber know to trigger each line? Or is this the default setting anyway?

 

 

Message 6 of 8

If you are using a long exposure time, could it be that it is much longer than the pulse separation time? If so, you might be measuring more than one pulse with each line capture, which could account for the variations in intensity you are seeing with exposure time. The exposure time needs to be less than the time between laser pulses to guarantee that this is not happening.
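That condition is easy to check numerically. A minimal sketch, assuming the 10 kHz pulse rate from the original post:

```python
# Worst-case number of laser pulses that can fall inside one exposure window,
# assuming a pulse can land right at the start of the window.
def max_pulses_in_exposure(exposure_us: float, pulse_period_us: float) -> int:
    return int(exposure_us // pulse_period_us) + 1

pulse_period_us = 100.0  # 10 kHz laser -> 100 us between pulses
print(max_pulses_in_exposure(50.0, pulse_period_us))   # 1 -> safe, one pulse per line
print(max_pulses_in_exposure(250.0, pulse_period_us))  # 3 -> multiple pulses per line
```

So any exposure at or above 100 us risks integrating more than one pulse per line, which would produce exactly the exposure-dependent intensity described.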

Message 7 of 8

Flumen, 

 

If the frame grabber isn't configured using the Configure Trigger VI, then it is just set to acquire any lines that come in. The camera is configured to send one line per trigger, and once the number of lines equals the image height set in MAX, IMAQ returns a full image.
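In other words, with one line per trigger, the image rate is simply the line-trigger rate divided by the image height. A quick illustration (the height of 512 is an assumed example value, not taken from the thread):

```python
# One line per 10 kHz trigger; a full image completes every `height` lines.
line_rate = 10_000          # triggers (lines) per second, from the post
height = 512                # image height set in MAX (assumed example value)

image_rate = line_rate / height            # full images per second
image_period_ms = 1000 * height / line_rate  # time to fill one image

print(f"{image_rate:.2f} images/s, one image every {image_period_ms:.1f} ms")
```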

 

-Jordan Calvert

Message 8 of 8