LabVIEW


Vision application interruptions

We deployed a LabVIEW Vision Toolkit based system a few years ago. A Windows PC runs the compiled executable version of the program. It captures images from a GigE camera, does some basic processing, displays the video on a standard monitor, and records compressed lossless video files. It had been processing the video without many problems until recently.

The application is designed to be multi-threaded, with buffers between the various loops. We set a limit on the buffer size of 20 frames. Lately, the buffers have occasionally been overflowing, and we have not caught the cause. The symptoms are: all the loops process frame-by-frame in about 1 ms or less for hours, with the buffers never accumulating more than 2 or 3 frames; then suddenly a buffer will overflow.

We had problems in the past with DataSocket links halting the program, so we added some logic to allow the system to ride through DataSocket disconnections. We have also noticed that maximizing/minimizing other windows can halt the program temporarily, but that is not what is happening with our problem. We have tried disabling as many Windows processes and anti-virus checks as IT will allow, but that did not help.

I'm looking for suggestions on where to look for the cause of the interruptions. I can't tell you exactly what version it is running right now, but I think it was a 2011 version that we originally deployed. Thanks
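For reference, the overflow symptom can be sketched as a bounded queue whose producer refuses to block. This is Python used purely for illustration (the actual program is LabVIEW, and `BUFFER_LIMIT` and `push_frames` are just stand-ins mirroring the 20-frame cap described above):

```python
import queue

BUFFER_LIMIT = 20  # mirrors the 20-frame cap described above

def push_frames(buf, frames):
    """Producer side: enqueue each frame without blocking.
    When the downstream loop stalls, the queue fills and frames
    are counted as overflows instead of being delivered."""
    dropped = 0
    for f in frames:
        try:
            buf.put_nowait(f)  # stand-in for handing off a grabbed image
        except queue.Full:
            dropped += 1       # this is the "buffer overflow" symptom
    return dropped

buf = queue.Queue(maxsize=BUFFER_LIMIT)
# If the display loop stops draining the queue, pushing 25 frames
# overflows by 5:
print(push_frames(buf, range(25)))  # -> 5
```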
Message 1 of 6

Some questions:

  • What is your frame rate, and what is your image size?
  • How "directly" is the camera connected?  Through a network?  Plugged in directly?
  • If network, same subnet?
  • Why such a small buffer size?

We have a behavioral testing application with 24 GigE cameras running at 30 frames/sec, capturing occasional 5-10 second AVI videos from a camera (behavior-triggered).  We use (I think) 30-frame buffers for each camera.  We record for 2 hours at a time, and do have stations "die", but it appears to be the fault of other equipment, not the camera.

 

Bob Schor

Message 2 of 6

Our frame rate is 30 fps and image is approx. 1000x1000 pixels grayscale.  The system is intended to be a live review station of an X-ray inspection.  We need to ensure that the operator does not miss anything.  I am thinking about increasing the buffer size and limiting the display loop to about 30 fps.  I have been afraid that increasing the buffer size will allow frames to build up, and then when resources are available, the system will play the frames faster than you can see them.  Limiting the display frame rate would eliminate that possibility.  It would introduce a lag, however.
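The rate-limited display idea can be sketched as a consumer that never shows frames faster than real time, so a backlog plays out at viewable speed instead of in a burst. Python used only for illustration; `display_loop` is a hypothetical stand-in for the display loop, and the injectable `clock`/`sleep` parameters exist only to make the sketch testable:

```python
import time

DISPLAY_FPS = 30
FRAME_PERIOD = 1.0 / DISPLAY_FPS

def display_loop(frames, clock=time.monotonic, sleep=time.sleep):
    """Show frames no faster than DISPLAY_FPS, so a backlog that built
    up during a stall plays out at viewable speed instead of in a burst."""
    shown = []
    next_deadline = clock()
    for f in frames:
        now = clock()
        if now < next_deadline:
            sleep(next_deadline - now)   # pace the display
        shown.append(f)                  # stand-in for drawing the frame
        next_deadline += FRAME_PERIOD    # fixed schedule, no drift
    return shown
```

The fixed-schedule deadline (rather than sleeping a constant period after each frame) keeps the average rate at exactly 30 fps even when drawing takes variable time; the lag mentioned above is bounded by the backlog depth.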

Message 3 of 6

The number of buffers controls how much hidden memory within the driver is devoted to saving images until you call for them.  They act like a Lossy Queue -- if you have 1000 buffers, you can "look backwards in time" for 1000 frames before data get overwritten.  So if your images are about 1 MByte in size, 100 buffers take "only" 100 MB.
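The "look backwards in time until overwritten" behavior is exactly what a fixed-size ring gives you. A minimal sketch (Python only for illustration; a `deque` with `maxlen` stands in for the driver's ring of image buffers):

```python
from collections import deque

def make_lossy_buffer(n_buffers):
    """A deque with maxlen acts like the driver's ring of image buffers:
    once full, each append silently overwrites the oldest frame."""
    return deque(maxlen=n_buffers)

ring = make_lossy_buffer(3)
for frame_id in range(5):
    ring.append(frame_id)
print(list(ring))  # -> [2, 3, 4]; frames 0 and 1 were overwritten
```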

 

But this might not be your problem.  What do you mean that "the buffer will overflow"?  How are you processing the images?  I'm a little confused as to the timing, who does what, how the Queues work, etc.  And I am still interested in the two middle questions I asked about how the camera connects to the PC.  If the Camera is spitting out 30 fps and your display is capable of 30 fps (why not?  That's pretty modest), there should be no problem if you don't have a time-consuming step somewhere in the system.

 

Might be instructive to see some VIs ...

 

Bob Schor

Message 4 of 6

You're right, I forgot to answer all your questions... The camera is connected directly to a dedicated network card.  This link never seems to have any issues.  I get different error messages if the camera is disconnected or the program drops frames.  The errors that are occurring are related to buffers later on in the processing.

 

Since the problem started early this year after the system ran without this issue for about 4 years, it seems like something external to the LabVIEW program must have changed.  We have been making minor edits occasionally, but they don't seem to coincide with the start of this problem.  When I say that the buffers overflow, it seems that something in the data flow gets stopped downstream and backs up the frames.  If I can believe where the program is telling me the first overflow occurs, it is just upstream of the display VI.  I'm hoping to go to the machine today and try to come up with a workaround.  I will try to grab some screenshots.

Message 5 of 6

Could it be that something (like an AntiVirus utility, or an "always-on Backup Utility") is "stealing cycles"?  We've seen AV get pretty aggressive and start consuming 10-30% of the CPU at odd times, most typically at startup, but occasionally mid-day.  We are also running (but not on critical data-acquisition machines) a "Cloud Backup" utility that checks every 15 minutes and starts backing up "new files" to the Cloud.  If these include your data files, this could well be an issue.

 

From your description, it is beginning to sound more like a "Windows/System" problem than one due to LabVIEW.  You might try (at the cost of a few more cycles) running with Task Manager open and its Performance tab selected.  The Resource Monitor button on that tab gives you additional details.
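If instrumenting the program is easier than watching Task Manager, logging a timestamp each loop iteration and scanning for long gaps can pinpoint when the CPU was stolen, for correlation with AV or backup activity. A sketch of the gap scan (Python used only for illustration; `find_stalls` and the 0.1 s threshold are assumptions, and in LabVIEW the same logic is a Tick Count comparison in the loop):

```python
def find_stalls(timestamps, threshold_s=0.1):
    """Scan per-iteration timestamps (e.g. time.monotonic() logged once
    per loop cycle) and return (time, gap) for every gap longer than
    threshold_s -- candidate moments when something stole the CPU."""
    stalls = []
    for prev, cur in zip(timestamps, timestamps[1:]):
        gap = cur - prev
        if gap > threshold_s:
            stalls.append((prev, gap))
    return stalls

# A 30 fps loop should tick every ~33 ms; a 0.43 s gap stands out:
print(find_stalls([0.0, 0.033, 0.066, 0.5, 0.533]))
```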

 

Bob Schor

Message 6 of 6