Problem with external trigger on GigE camera

I will look into this for you.
Jeff | LabVIEW Software Engineer
Message 11 of 21
Thanks Jeff.  I am still looking into this, and one thing I am not sure about is the "ExposureMode" setting.  In MAX it only has the "Timed" option, but the Prosilica documentation (http://www.prosilica.com/support/gige/ge_controls.pdf, page 5) says:

 

ExposureMode

Manual - The camera exposure time is fixed by the ExposureValue parameter.

Auto - The exposure time will vary continuously according to the scene illumination.  The Auto exposure function operates according to the Auto and DSP controls.

AutoOnce - The exposure will be set once according to the scene illumination and then remain at that setting even when the scene illumination changes.  The AutoOnce exposure function operates according to the Auto and DSP controls.

External - When ExposureMode is set to External, the exposure time will be controlled by an external signal appearing on SyncIn1 or SyncIn2.  In order for this feature to work, the parameter FrameStartTriggerMode must be set to SyncIn1 or SyncIn2.  This feature is supported in version 1.36 firmware and above.  It is not available on any of the CMOS-based cameras.

 

I see this setting along with these specific options in the GigE Viewer program from Prosilica, but not in MAX or LabVIEW.  In MAX the only option listed is "Timed" and nothing else, even though it sounds like it should also offer the "External" setting.  I don't know if this is a clue, but I wanted to share it with you.  Thanks again.
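
For what it's worth, you can check what the IMAQdx driver actually exposes for an enumeration attribute from the C API.  Below is a minimal sketch; the attribute path is a guess at the AVT-style naming (check the attribute tree in MAX for the real one), and the exact shapes of IMAQdxEnumerateAttributeValues and IMAQdxEnumItem are assumptions based on NI's IMAQdx function reference, so verify them against your installed niimaqdx.h.

#include <stdio.h>
#include "niimaqdx.h"

// Sketch: list the values the driver's XML parsing exposes for ExposureMode.
// If "External" is missing here but present in the GigE Viewer, the feature
// is not in the camera's XML file.
int main(void)
{
    IMAQdxSession session;
    IMAQdxEnumItem values[32];
    uInt32 count = 32;  // in: array size, out: number of values (assumed)

    IMAQdxOpenCamera("cam0", IMAQdxCameraControlModeController, &session);
    IMAQdxEnumerateAttributeValues(session,
        "CameraAttributes::Controls::ExposureMode",  // hypothetical path
        values, &count);
    for (uInt32 i = 0; i < count; i++)
        printf("ExposureMode option: %s\n", values[i].Name);
    IMAQdxCloseCamera(session);
    return 0;
}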

 

Kevin

 
Message 12 of 21

Hello again Jeff.  It appears that my last post alluded to my problem.  I have also been trying to contact the Allied Vision Technologies engineers in Germany, and below is what they said.  The message is long, but I wanted to include it in its entirety to get your opinion on whether I can solve this problem.  I am also not sure whether this is a problem with the camera itself or with IMAQdx; it seems to be a problem with the XML file.  I suspect that this has been the main problem all along, and I have been chasing the white rabbit everywhere else to no avail.  Thanks again for your help, and for any suggestions on how I might be able to get this to work.

 

Kevin

"The way you want to control the integration time is not working.  I checked
the VIs; unfortunately I am not able to run them, but the configuration with
the Acquisition controls will not work.

Usually it would be fairly easy to control the integration time
externally with a high-level signal:
-set CameraAttributes::AcquisitionControl::TriggerSource to Line2
-set CameraAttributes::FeatureControl::ExposureMode to External
You can test this using the Prosilica GigE Viewer application.

Unfortunately, ExposureMode = External does not seem to be part of the XML
file that the NI driver reads out, which means it is not accessible.  So
at the moment the external level-controlled trigger is not working.
I have looked for a way to overcome this, but it does not seem that we
can get it to work.  If I, for example, set the feature in the GigE Viewer
and save it as a ConfigFile, the NI driver quits with an error message.
We will check whether it can be solved via a firmware update."

Message 13 of 21

Hey Kevin,

 

If the only option that you see in MAX is 'Timed,' it is possible that there is something fishy with the XML file, or at least with how it is being read by the IMAQdx driver, especially if, without changing anything else, you can see all of these options in third-party software.  Allied Vision support suggested trying a firmware update, and I would suggest the same; it is always a good way to make sure that you have the latest features and bug fixes.  So, if a firmware update is available for the camera, I would try it.

 

As a possible workaround, if the "Timed" option is the only one available, you could programmatically set the exposure time.  Instead of relying on the high time of your signal, you would set the appropriate exposure time in your code through IMAQdx property nodes.
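
In C-API terms the workaround would look roughly like the sketch below.  The attribute paths are hypothetical AVT-style names (look up the real ones in MAX's attribute tree), and the IMAQdxSetAttribute signature is an assumption based on NI's IMAQdx function reference.

#include "niimaqdx.h"

// Sketch: keep the hardware frame-start trigger, but fix the exposure
// duration in software instead of deriving it from the pulse width.
void configure_fixed_exposure(IMAQdxSession session, uInt32 exposure_us)
{
    // The frame still starts on the external line...
    IMAQdxSetAttribute(session,
        "CameraAttributes::Controls::FrameStartTriggerMode",  // hypothetical
        IMAQdxValueTypeString, "SyncIn1");
    // ...but the exposure time is a fixed value written ahead of time.
    IMAQdxSetAttribute(session,
        "CameraAttributes::Controls::ExposureValue",          // hypothetical
        IMAQdxValueTypeU32, exposure_us);
}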

Hope this helps.
-Ben

WaterlooLabs
Message 14 of 21

Thanks Ben.  Unfortunately, I don't think the workaround will help in my application.  From previous experience I don't believe that a fixed exposure time with the external trigger will work: the main problem I found before is that after I send the command to change the exposure time, I am unsure when the change actually takes effect in the camera.  Therefore, I really would like to get the external trigger with the "high-level" signal working.  I guess I will just wait to see whether Allied Vision can upgrade the firmware so it will work with NI products, or whether, as you say, it is a problem with how the IMAQdx driver reads the XML file; I am still unsure where the problem is coming from.  Thanks again for everyone's help.

 

Kevin 

Message 15 of 21

Kevin,

 

The issue is likely that AVT/Prosilica's traditional GigE Vision software does not use the XML file, but rather has its own definition of the camera that does not follow the GenICam standard.  Thus, it is possible for a camera feature to exist in their software but be inaccessible to generic GigE Vision-compliant software if the camera does not expose it in its standard XML file.

 

Your options would be:

-Ask if they can fix this in a firmware update

-Ask if they can tell you how to "patch" the camera's XML file to add the feature 

-Ask for register specifications for the feature in question and use raw register read/writes to the camera to configure it

 

Eric 

Message 16 of 21

Thanks Eric.  They gave me the register specifications, so I am just using the Write Register VI until they add XML access in the next firmware update.  The camera now appears to actually trigger for the specified duration of the level high.
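
For anyone following along, the raw register write in the C API looks roughly like the sketch below.  The register offset and value are placeholders (the real ones come from AVT's register specification), and the IMAQdxWriteRegister signature is an assumption based on NI's IMAQdx function reference.

#include "niimaqdx.h"

// Sketch: write a camera register directly when the feature is missing
// from the XML file the driver reads.
#define EXPOSURE_MODE_REG  0xF0000000u  /* placeholder offset */
#define EXPOSURE_MODE_EXT  4u           /* placeholder value */

IMAQdxError set_external_exposure(IMAQdxSession session)
{
    return IMAQdxWriteRegister(session, EXPOSURE_MODE_REG, EXPOSURE_MODE_EXT);
}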

 

However, I am still having some difficulty understanding what happens during my test.  Currently I am writing a simple test in LabVIEW to make sure I completely understand the behavior before I develop my final product.  The test generates a sequence of trigger pulses (currently 13) that are high for a predetermined amount of time.  The pulses are generated in a For Loop in LabVIEW, so it is not a hardware-timed pulse train, but the duration of a single pulse is set in hardware, so it should be fairly accurate and repeatable.  After each level-high trigger pulse I use LabVIEW's event structure to make sure the frame is done, and then use the Get Image VI to read out the image.  Below are some questions that I have.  Thanks again for everybody's assistance.

 

 


Should I call the Write Register VI right after Start Acquisition.vi, or right before it?  I have always been a little confused about exactly when I should change the camera parameters relative to the Start Acquisition VI, the Configure Acquisition VI, and registering for the Frame Done event.  Can someone tell me the right order in which to call these VIs?

 

I don't understand why I am getting significantly different times when reading out the images after the level-high trigger is completed.  I am using a LabVIEW ms timer before and after reading out the image to determine how long it takes to read out each image.  My understanding is that it should take about 8 ms every time to read each image at full resolution (because the camera's acquisition frame rate is 120 fps), so why is it sometimes ~100 ms and other times ~1 ms?  The first image readout time is always about 8 ms when my trigger pulse duration is much longer than 8 ms (e.g. 100 ms), but the remaining readout times are only a few ms (e.g. 1, 2, 3 ms), though they can be as large as 100 ms too.  When I set the trigger pulse to less than 8 ms, the readout time averages about 8 ms, but sometimes it too is only a few ms instead of about 8 ms.  My main concern is that when I measure very short times (e.g. ~1 ms) I am not actually reading the last image, but rather one acquired earlier, since the measured time is much shorter than the camera's frame period.  My other concern is that when my exposure time is longer than 8 ms (e.g. 20 ms), the camera may not be exposing for the entire time the trigger pulse is high, since the readout time is nowhere near the camera's frame period (e.g. 1, 2, 3 ms instead of 8 ms).

 

Sometimes when I use shorter trigger pulses (e.g. 1, 2, 3 ms) I even get a timeout.  The only way I can explain a timeout is that for some reason the camera isn't actually seeing the trigger pulse, but it doesn't happen consistently.  I am using a LabVIEW event structure immediately after the level-high trigger pulse is completed, and it times out if the Frame Done event takes longer than 1 second.  Otherwise, if the Frame Done event happens before the timeout, I use the Get Image VI to read out the image that was just acquired.  Am I using it incorrectly, or is there some other reason why this might happen?

 

Do you think it would save me any time to use the stream hold parameter until after I am done acquiring my images?  Or, at the very least, would using this camera parameter ensure that the time between each image is about 8 ms, because it won't depend on what my Windows OS is doing at the exact moment I request the image?  Then, after I am done with my sequence of images, I could read them out individually with the Get Image VI.

 

Does anyone have any idea how long it takes to update the camera gain?  Previously I always took a few images or waited some time before taking the next image to ensure that the gain settings had changed on the camera.

Message 17 of 21

kbaker wrote: 


Should I call the Write Register VI right after Start Acquisition.vi, or right before it?  I have always been a little confused about exactly when I should change the camera parameters relative to the Start Acquisition VI, the Configure Acquisition VI, and registering for the Frame Done event.  Can someone tell me the right order in which to call these VIs?


"Configure Acquisition" sets up our driver and the camera to be ready to acquire.  It is at this point that internal buffers are allocated and we determine how we will interpret the image coming from the camera (pixel format, etc.).  From this point on, any features of the driver or camera that would invalidate assumptions about the acquisition should not be written to.  The driver will ensure that you cannot modify these settings through the attribute mechanisms, but if you write directly to the registers, that is up to you.  This is essentially equivalent to "arming" the acquisition.

 

"Start Acquisition" really just hits the "start bit" on the camera and is designed to be as fast as possible. It is at this point the camera is free to start sending images at any time. You can start/stop the acquisition multiple times without re-configuring if you do not change any configuration parameters as described above.

 

Where you do your register writes depends on what type of feature it is.  If it is some sort of trigger configuration, you likely want it before Start, because you don't want the camera to be using a different trigger setting when we start it.  Since it is unlikely that this trigger setting would be affected or locked down by Configure Acquisition, it can probably be done at any point after opening the camera.  However, be careful about how you order it with respect to attribute operations that touch similar registers, because the feature and register invalidation that the XML file uses to describe how modifying one feature affects another will not be taken into account.
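
As a rough sketch of that ordering in the C API (the VI names map onto the corresponding IMAQdx C calls; the exact signatures are assumptions based on NI's IMAQdx function reference, and the register offset/value are placeholders):

#include "niimaqdx.h"

// Sketch: recommended ordering of open, register writes, configure (arm),
// and start.
int acquire_sequence(void)
{
    IMAQdxSession session;

    IMAQdxOpenCamera("cam0", IMAQdxCameraControlModeController, &session);

    // 1. Trigger-related register writes: after open, before configure/start.
    IMAQdxWriteRegister(session, 0xF0000000u /* placeholder */, 4u);

    // 2. Configure ("arm"): buffers are allocated and the image
    //    interpretation (pixel format, size) is locked down.
    IMAQdxConfigureAcquisition(session, 1 /* continuous */, 10 /* buffers */);

    // 3. Start: just sets the camera's "start bit"; the camera is now free
    //    to send images at any time.  Start/stop can repeat without
    //    re-configuring, as long as the configuration does not change.
    IMAQdxStartAcquisition(session);

    // ... send triggers and fetch images here ...

    IMAQdxStopAcquisition(session);
    IMAQdxUnconfigureAcquisition(session);
    IMAQdxCloseCamera(session);
    return 0;
}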

 

Also, I would be hesitant to recommend using the Frame Done event if you are immediately going to do a Get Image on that buffer number.  LabVIEW's event mechanism has various limitations that usually don't mix well with most imaging applications.  If you are waiting for an image, it is usually much better to simply block in the Get Image VI.

 


kbaker wrote: 



I don't understand why I am getting significantly different times when reading out the images after the level-high trigger is completed.  I am using a LabVIEW ms timer before and after reading out the image to determine how long it takes to read out each image.  My understanding is that it should take about 8 ms every time to read each image at full resolution (because the camera's acquisition frame rate is 120 fps), so why is it sometimes ~100 ms and other times ~1 ms?  The first image readout time is always about 8 ms when my trigger pulse duration is much longer than 8 ms (e.g. 100 ms), but the remaining readout times are only a few ms (e.g. 1, 2, 3 ms), though they can be as large as 100 ms too.  When I set the trigger pulse to less than 8 ms, the readout time averages about 8 ms, but sometimes it too is only a few ms instead of about 8 ms.  My main concern is that when I measure very short times (e.g. ~1 ms) I am not actually reading the last image, but rather one acquired earlier, since the measured time is much shorter than the camera's frame period.  My other concern is that when my exposure time is longer than 8 ms (e.g. 20 ms), the camera may not be exposing for the entire time the trigger pulse is high, since the readout time is nowhere near the camera's frame period (e.g. 1, 2, 3 ms instead of 8 ms).



How are you doing your Get Image call?  Are you asking for a specific buffer number, or are you asking for the "next" image?  "Next" means taking the most recently acquired buffer number (acquired, not actually retrieved by the user) and adding 1 to it.  It is intended for the case where you always want the latest image without repeating images.  It is usually not good to combine this with triggered acquisitions where you want to retrieve every image: you can get odd timing effects, because depending on when you start waiting, the time you wait differs depending on whether an image has recently arrived or not.
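
The difference looks roughly like this in the C API (a sketch; the IMAQdxGetImage signature and the buffer-mode constant names are assumptions based on NI's IMAQdx function reference):

#include "nivision.h"
#include "niimaqdx.h"

// Sketch: since you send the triggers yourself, fetch each image by its
// explicit buffer number instead of asking for "Next".
void fetch_every_image(IMAQdxSession session, uInt32 numTriggers)
{
    Image* image = imaqCreateImage(IMAQ_IMAGE_U8, 0);
    uInt32 actual;

    for (uInt32 i = 0; i < numTriggers; i++) {
        // ... send trigger pulse i here ...
        // Block until buffer i specifically has been acquired.
        IMAQdxGetImage(session, image,
                       IMAQdxBufferNumberModeBufferNumber, i, &actual);
        // 'actual' reports which buffer was really returned.
    }
    imaqDispose(image);
}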

 


kbaker wrote: 



Do you think it would save me any time to use the stream hold parameter until after I am done acquiring my images?  Or, at the very least, would using this camera parameter ensure that the time between each image is about 8 ms, because it won't depend on what my Windows OS is doing at the exact moment I request the image?  Then, after I am done with my sequence of images, I could read them out individually with the Get Image VI.



The stream hold feature on the AVT/Prosilica camera is different from buffering within the driver.  You would only want to use it if you didn't have the network bandwidth for the camera (or, more usually, several cameras) to send their images immediately.  You probably want to buffer within the driver instead.  You can either do a continuous acquisition and request buffers by buffer number, or do a non-continuous (sequence) acquisition of a fixed number of images.  The transfer is likely not the reason you are seeing inconsistent frame times; it is more likely due to not requesting specific buffer numbers.
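
The two driver-side buffering options, sketched with the C-API configure call (the continuous flag and buffer-count parameters are assumptions based on NI's IMAQdx function reference):

#include "niimaqdx.h"

// Sketch: buffer in the IMAQdx driver rather than using the camera's
// stream hold feature.
void configure_ring(IMAQdxSession s)       // continuous grab: ring of buffers
{
    IMAQdxConfigureAcquisition(s, 1 /* continuous */, 10 /* ring buffers */);
}

void configure_sequence13(IMAQdxSession s) // one-shot sequence: 13 images
{
    IMAQdxConfigureAcquisition(s, 0 /* one-shot */, 13 /* images */);
}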

It may be good if you could include a simple screenshot of your acquisition structure so we could get a better idea of what your code is currently trying to do.

 


kbaker wrote: 



Does anyone have any idea how long it takes to update the camera gain?  Previously I always took a few images or waited some time before taking the next image to ensure that the gain settings had changed on the camera.



This is camera-specific.  When you write a camera attribute, we block that call until the camera says it has completed.  Some cameras guarantee that any images *acquired* after that point in time will have the new setting.  The important thing to recognize is that there is a fair amount of buffering between the sensor, the camera internals, and the driver, so unless you are controlling the triggering, or stopping/starting the camera in between, we have limited visibility into when the setting takes effect.
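
One way to remove the ambiguity, sketched under the same signature assumptions as above (the gain attribute path is hypothetical and camera-specific): change the gain while the acquisition is stopped, so every image acquired after the restart must carry the new value.

#include "niimaqdx.h"

// Sketch: bracket the gain change with stop/start so all subsequent images
// are guaranteed to use the new gain.  Start/stop does not require
// re-configuring as long as the configuration itself is unchanged.
void set_gain_safely(IMAQdxSession session, uInt32 gain)
{
    IMAQdxStopAcquisition(session);
    IMAQdxSetAttribute(session,
        "CameraAttributes::Controls::GainValue",  // hypothetical path
        IMAQdxValueTypeU32, gain);
    IMAQdxStartAcquisition(session);
}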

 

Hope these answers help,
Eric 

Message 18 of 21

Thanks again Eric.  Your response is greatly appreciated.  I believe I finally have all the bugs worked out with the camera itself, and now I just need to configure and read out the images correctly in LabVIEW.  My two main concerns at this time are: 1) Why am I sometimes getting a timeout error?  2) Why is the measured time to read out each image different, and sometimes much too short compared to what I expect from the camera's known acquisition frame rate?  I should NEVER receive an image sooner than 8 ms after the end of exposure!  Here are a couple of items that I suspect might be causing these problems; does anyone have any insight?  I am attaching a few screenshots of what I am currently doing.  The first is the main program, the second shows how I am initializing the trigger and camera, and the third shows how I generate the level-high pulse and afterwards read out the image.

I am also always hesitant about using the Frame Done event.  But when I take it out, I still sometimes get timeout errors with short trigger pulses (e.g. 1, 2, 3 ms), and the readout time still seems much too short (e.g. 0, 1, 2 ms) when using longer trigger pulses (e.g. 20 ms).  As you said, using "Next" for the Get Image VI is no good and even gives a timeout error, and I am wary of using the "Last" setting because I want to be sure that the image I am reading out is from the last time the trigger pulse was high and not an earlier one.  This is why I am using the buffer number along with the Frame Done event.  What are some of the problems you are aware of with the Frame Done event?

It looks like the timeout problem is reduced or eliminated if I add a Wait (ms) VI inside the For Loop after each image is read out.  Even if the trigger pulse is high for only 1 ms, adding a 5 ms wait after reading out the image removes the timeout errors, and the readout time is measured to be about the expected 8 ms.  If I shorten the wait to 1 ms, the timeout errors start again.  I would rather not add extra time after each image, since I would like to complete my test as quickly as possible.  Any idea what is happening, and any suggestions on how I can remove the Wait VI and still have everything work without timeout errors?  I suspect that once I figure this out, I will have solved both the timeout errors and the image readout times that are not the expected 8 ms.

Thanks again for your help.

Kevin 

Message 19 of 21

Hi Kevin,

 

If I had to guess, I would imagine you might not be getting a 1:1 ratio between your generated pulses and the camera triggers.  If your pulse and trigger settings were incorrect such that you were getting more than one image per trigger, or if there were noise or improper termination on the signal, you would get extra images, and that would change your timings completely: the camera stays armed between loop iterations, so any extra images acquired show up on the next loop iteration.  I'd try writing some simple verification code: configure your camera, then have one loop that lets you send a trigger on demand and another loop that polls the transferred image count, and verify that you get the expected 1:1 ratio between the two counters.  A minimal sketch follows.
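
The sketch below polls the driver's last-buffer counter after each pulse; the attribute path ("StatusInformation::LastBufferNumber") and the IMAQdxGetAttribute signature are assumptions based on NI's IMAQdx documentation, so verify both before relying on it.

#include <stdio.h>
#include "niimaqdx.h"

// Sketch: verify a 1:1 pulse-to-image ratio by checking that the
// last-acquired buffer number advances by exactly one per pulse.
void verify_trigger_ratio(IMAQdxSession session, uInt32 numPulses)
{
    for (uInt32 i = 0; i < numPulses; i++) {
        // ... send one trigger pulse here, then allow time for transfer ...
        uInt32 lastBuffer = 0;
        IMAQdxGetAttribute(session,
            "StatusInformation::LastBufferNumber",  // assumed attribute path
            IMAQdxValueTypeU32, &lastBuffer);
        printf("pulse %u -> last buffer %u%s\n", (unsigned)i,
               (unsigned)lastBuffer, (lastBuffer == i) ? "" : "  (MISMATCH)");
    }
}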

 

Next, I'm still not sure what you are trying to do with the event loop.  You can achieve the same effect by simply calling Get Image with your expected buffer number (which you know, since you are sending the triggers yourself) and whatever timeout you want.  The main disadvantages of using LabVIEW events are that a) they add jitter, because IMAQdx threads must signal LabVIEW to handle the event in the UI thread, and b) LabVIEW buffers events, so if you don't handle them fast enough they simply queue up indefinitely.  If you write your own loop around Get Image with a timeout, you can control your loop's behavior when it gets behind (IMAQdx helps you do this by letting you define the behavior when you request a buffer that is too old and no longer in memory).
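
Such a loop might look like the sketch below.  The timeout and overwrite-policy attribute paths and values ("AcquisitionAttributes::Timeout", "AcquisitionAttributes::OverwriteMode") are assumptions based on NI's IMAQdx documentation, as are the C signatures; check them against your installed driver.

#include "nivision.h"
#include "niimaqdx.h"

// Sketch: event-free fetch loop with an explicit timeout and a defined
// policy for buffers that have already been overwritten.
void fetch_with_policy(IMAQdxSession session, uInt32 numImages)
{
    Image* image = imaqCreateImage(IMAQ_IMAGE_U8, 0);
    uInt32 actual;

    IMAQdxSetAttribute(session, "AcquisitionAttributes::Timeout",
                       IMAQdxValueTypeU32, 1000u);       // 1 s (assumed path)
    IMAQdxSetAttribute(session, "AcquisitionAttributes::OverwriteMode",
                       IMAQdxValueTypeString, "Fail");    // assumed value

    for (uInt32 i = 0; i < numImages; i++) {
        // ... send trigger i ...
        IMAQdxError err = IMAQdxGetImage(session, image,
                              IMAQdxBufferNumberModeBufferNumber, i, &actual);
        if (err != IMAQdxErrorSuccess) {
            // Timed out, or buffer i was lost: decide how to catch up here.
            break;
        }
    }
    imaqDispose(image);
}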

 

Eric

Message 20 of 21