Machine Vision


GigE Vision images sometimes corrupted

Hi, I get very consistent image acquisition using the manufacturer's driver for my Point Grey Flea3 GigE camera. However, when I use MAX to stream video from the camera, things generally work well, but there are times when images are corrupted, and this normally corresponds with periods of several seconds when the Dropped Frames counter increments rapidly. I've attached an example of one of these images. What concerns me is that I am getting partial images; losing complete frames would be better. It is normally the bottom of the image that is affected.

 

It is also strange that it will work fine for several minutes, then drift into a phase where images are corrupted, and then drift out again after a short period.

 

Is there anything I could do to prevent this? Maybe it is an artifact of grabbing through MAX - could I mitigate it by using low-level grabs in LabVIEW?

 

 

Message 1 of 12

This looks like generic packet loss. There might be something in your system configuration (maybe the CPU is going to sleep occasionally) that is causing it to occasionally drop data. One suggestion I have would be to try lowering the Peak Bandwidth Desired attribute and see if that clears things up. This might not even affect the frame rate if your average rate is already below 1Gbit. Since this is at a low level, switching acquisition modes will likely not change anything. You could of course switch to one of the NI GigE Vision cards supported by our High Performance Driver and this would likely help as well.
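For what it's worth, the same change can be made programmatically instead of through MAX. Below is a minimal C sketch using the NI-IMAQdx C API; the attribute path string, the "cam0" session name, and the 800 Mbit/s value are assumptions for illustration, so check the exact attribute name in your camera's attribute tree in MAX before relying on it.

#include <stdio.h>
#include <niimaqdx.h>   /* NI-IMAQdx C API */

int main(void)
{
    IMAQdxSession session;
    IMAQdxError rc;

    /* Open the camera in controller mode ("cam0" is whatever name MAX shows). */
    rc = IMAQdxOpenCamera("cam0", IMAQdxCameraControlModeController, &session);
    if (rc != IMAQdxErrorSuccess) {
        printf("Open failed: 0x%x\n", (unsigned)rc);
        return 1;
    }

    /* Lower the desired peak bandwidth. The attribute path and the value
       (in Mbit/s) are assumed here - verify both against MAX. */
    rc = IMAQdxSetAttribute(session,
            "AcquisitionAttributes::AdvancedEthernet::BandwidthControl::DesiredPeakBandwidth",
            IMAQdxValueTypeF64, 800.0);
    if (rc != IMAQdxErrorSuccess)
        printf("SetAttribute failed: 0x%x\n", (unsigned)rc);

    IMAQdxCloseCamera(session);
    return 0;
}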

 

Also, you can tell the driver to give an error when the frames contain lost packets. This is configured by the Lost Packet Mode attribute. By default it lets you get the frame as normal (although you can query how many missing packets the individual frame has), but you can configure it to return an error on the Get Image call. You could then have your processing loop skip any images that have this particular error.
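To sketch what that loop could look like outside of LabVIEW, here is a rough C example against the NI-IMAQdx C API. The LostPacketMode attribute path and its "Fail" value string are assumptions (confirm the exact strings in MAX); the point is just that, once the driver is told to fail on incomplete frames, the grab call returns an error for those frames and the loop skips them.

#include <stdio.h>
#include <nivision.h>   /* Image, imaqCreateImage, imaqDispose */
#include <niimaqdx.h>   /* NI-IMAQdx C API */

int main(void)
{
    IMAQdxSession session;
    Image *frame = imaqCreateImage(IMAQ_IMAGE_U8, 0);
    uInt32 bufNum = 0;
    int i;

    IMAQdxOpenCamera("cam0", IMAQdxCameraControlModeController, &session);

    /* Ask the driver to report lost packets as an error on the grab call.
       The attribute path and the "Fail" value string are assumed - check MAX. */
    IMAQdxSetAttribute(session,
        "AcquisitionAttributes::AdvancedEthernet::LostPacketMode",
        IMAQdxValueTypeString, "Fail");

    IMAQdxConfigureGrab(session);   /* configure and start a continuous grab */

    for (i = 0; i < 100; i++) {
        IMAQdxError rc = IMAQdxGrab(session, frame, 1, &bufNum);
        if (rc != IMAQdxErrorSuccess) {
            /* This frame came back with an error (e.g. missing packets) - skip it. */
            printf("Skipping buffer %u, error 0x%x\n", bufNum, (unsigned)rc);
            continue;
        }
        /* ...process the complete frame here... */
    }

    IMAQdxStopAcquisition(session);
    IMAQdxCloseCamera(session);
    imaqDispose(frame);
    return 0;
}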

 

Eric

 

 

Message 2 of 12

Great, thanks Eric - I'll give this a try tomorrow.

 

In the past, I've always used FireWire or Camera Link and these sorts of problems just never happened. However, Ethernet makes the total system so much simpler (long cable lengths, no frame grabbers, no backplane considerations if the cameras are plugged straight into the computer's Ethernet ports, etc.) that I'm going to stick with it for a while.

Message 3 of 12

Hello,

 

I am having a similar issue with my Flea3.

It is connected to the Broadcom GigE adapter on my laptop.

Have you found a solution to this?

Message 4 of 12

@AsianSensation wrote:

Hello,

 

I am having a similar issue with my Flea3.

It is connected to the Broadcom GigE adapter on my laptop.

Have you found a solution to this?


Did you try lowering the Peak Bandwidth Desired attribute to something below 1000 Mbit?


Eric

Message 5 of 12

I obviously need to get myself a better nickname on this forum, Bluecheese and AsianSensation.

 

Eric, I can report back that yes, it did help when I reduced the Peak Bandwidth Desired attribute. However, this is no guarantee that problems won't appear now and then. For instance, if the PC's processor is reasonably busy, this still tends to happen. It is very unpredictable, which is part of the problem.

 

My next test will be to use a PC with the correct Intel network chipset running LabVIEW RT, thus ensuring that the best driver is used - am I right to assume that this will eliminate the problem?

 

As an aside: I am still amazed how reliable Camera Link and FireWire are compared to GigE. I fired up one of my old systems - a 2005 PC with six Point Grey FireWire cameras all running at 30 frames per second (using two FireWire cards) - with absolutely no issues, and the CPU hardly runs over 10%, including (simple) image segmentation. With GigE the CPU is very busy - I still need to benchmark this under RT, though.

Message 6 of 12

Hi,

 

GigE Vision can be extremely reliable, just as the other machine vision standards are. Using a network card that can use NI's High Performance Driver can help a lot: the combination of a good network chipset and drivers optimized for GigE Vision reduces both the performance overhead and the chances that data will be lost. In general, if you are missing data under high CPU load, that likely points to your current network card/drivers not handling GigE Vision traffic effectively. I also recommend using a single network port per camera if possible.

 

Eric

Message 7 of 12

Hi,

 

I have a similar problem (occasional lost packets) with our cameras, too.

 

However, changing Peak Bandwidth Desired doesn't solve anything for me... I wonder what that setting actually does?

 

Using an Intel PRO/1000 NIC with the NI driver installed does not seem to make it any better.

 

I also wonder which socket receive buffer size the NI GigE Vision driver uses when it does not have access to the high performance driver. Does anyone know?
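I can't say what buffer size NI's driver requests internally, but for a point of comparison on your own machine, a plain sockets program will show the operating system's default UDP receive buffer and whether a larger request is honored. This is a generic POSIX sketch (on Windows you would use Winsock with WSAStartup/closesocket) and has nothing to do with IMAQdx itself:

#include <stdio.h>
#include <unistd.h>
#include <sys/socket.h>

int main(void)
{
    int s = socket(AF_INET, SOCK_DGRAM, 0);
    int size = 0;
    socklen_t len = sizeof(size);

    /* Default receive buffer the OS gives an ordinary UDP socket. */
    getsockopt(s, SOL_SOCKET, SO_RCVBUF, &size, &len);
    printf("Default SO_RCVBUF: %d bytes\n", size);

    /* Ask for a bigger buffer; the OS may clamp it to a system-wide limit. */
    size = 2 * 1024 * 1024;
    setsockopt(s, SOL_SOCKET, SO_RCVBUF, &size, sizeof(size));

    len = sizeof(size);
    getsockopt(s, SOL_SOCKET, SO_RCVBUF, &size, &len);
    printf("After request:     %d bytes\n", size);

    close(s);
    return 0;
}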

 

Tim

 

Message 8 of 12

Hi! I have the same problem... Any solution? 

Message 9 of 12

I am now running without any packet loss by:

- carefully choosing an Intel NIC that is supported by the High Performance Driver and that can also handle jumbo packets

- setting the Peak Bandwidth Desired attribute to a value appropriate for the data rate my camera actually produces (see the sketch after this list)

- dedicating one CPU core solely to reading Ethernet packets (even if it only runs at 5% or 10%) - it helps if you use a quad-core i7

- using LabVIEW RT
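To illustrate the bandwidth point above: the peak bandwidth only needs a little headroom over what the camera actually sends, and that is easy to estimate. The numbers below are assumed example values for a Mono8 camera, not measurements from my setup:

#include <stdio.h>

int main(void)
{
    /* Assumed example values - substitute your own camera settings. */
    double width    = 1280.0;  /* pixels */
    double height   = 1024.0;  /* pixels */
    double bpp      = 8.0;     /* bits per pixel (Mono8) */
    double fps      = 30.0;    /* frames per second */
    double overhead = 1.10;    /* roughly 10% for GVSP/UDP/IP headers */

    double mbit_per_s = width * height * bpp * fps * overhead / 1e6;
    printf("Average data rate: ~%.0f Mbit/s\n", mbit_per_s);

    /* Setting Peak Bandwidth Desired a little above this figure (instead of
       the full 1000 Mbit/s) spreads each frame's packets over the frame
       period rather than sending them in one burst. */
    return 0;
}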

 

I am considering moving to USB3, in the hope that USB3 will take the CPU out of the loop the way FireWire and Camera Link used to - that makes things a lot more robust in my experience.

 

Message 10 of 12