
IMAQdx drops frames, but LostPacketCount = 0 ?

Solved!

I am using the NI example "Grab and Attributes Setup.vi" with a XIMEA USB3 camera running at 60 fps, on Windows 7 (i5-4310M, 2.7 GHz, 8 GB RAM).

 

"Description: The Grab and Attributes Setup example VI allows the user to view the current attributes and settings, update attribute settings, acquire images continuously, and display the images in an image control."

 

I made a few modifications. I have two image display windows instead of one. In the "<Session Out>: Frame Done" event, when "ActualBufferNumber" is even, the image is displayed in my left image frame; when odd, it appears in the right. My scene changes at 60 fps, synchronized to a hardware output signal from the camera ("exposure-start"), so the even and odd frame images are significantly different and it is easy to tell them apart.

 

Mostly, it works as expected. However, at random times, usually several minutes apart, the acquisition process apparently skips a frame, so the left and right images are exchanged. Sometimes it skips a frame, gets one, then immediately skips another, so only one output image blinks and then the original view returns. I am displaying the IMAQdx "LostPacketCount" and it is always 0. While the program runs, Windows Task Manager shows LabVIEW using between 14 and 17% CPU, and total system CPU never exceeds 41%, so the CPU does not seem overloaded. Also, the moments of frame drops do not correspond to peaks in CPU use. I am separately monitoring the camera's 60 Hz frame-sync output signal, and it is precisely steady (~50 ns jitter on the nominal 16.667 ms period), as you would expect from a hardware trigger; there is no stuttering and there are no gaps.
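For illustration only: the left/right swap follows directly from the parity routing described above. A small Python sketch (not the actual LabVIEW code; `route` is a made-up helper) shows how one missing buffer number exchanges the sides:

```python
# Hypothetical sketch (Python, not the LabVIEW VI): parity-based routing
# of buffer numbers to the two image displays.

def route(buffer_numbers):
    """Even buffer numbers go to the left display, odd ones to the right."""
    return [("left" if n % 2 == 0 else "right", n) for n in buffer_numbers]

# In order, each display sees a steady stream of its own scene:
print(route([0, 1, 2, 3]))   # left, right, left, right
# If buffer 2 is skipped, frame 3 is odd where an even frame was due next,
# so the two scenes appear exchanged until another skip restores parity:
print(route([0, 1, 3, 4]))   # left, right, right, left
```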

 

In the input to "Configure Acquisition" I started with 3 buffers, then 6, then 100, but the frequency of frame-drop events does not seem to be sensitive to the number of buffers. Do I have unrealistic expectations of capturing and displaying 60 fps without frame drops? How do I debug this problem?

Message 1 of 4

Update: I can almost always trigger the problem by switching between mail folders in Microsoft Outlook while the VI acquisition is running. So it is apparently related to CPU load, even though Windows Task Manager doesn't show any notable spikes.

Message 2 of 4
Solution
Accepted by topic author jbeale1

UPDATE:  I tried running another example "Grab and Detect Skipped Buffers.vi" which looks at the "Buffer Number Out" signal from "IMAQdx Get Image2.vi" and it does properly detect skipped frames. The "Skipped Buffer Count" output shows that changing between folders in Outlook during 60 fps acquisition typically drops 5 frames.
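The detection that example performs amounts to counting gaps in consecutive buffer numbers. A Python sketch (illustrative only; `count_skipped` is a made-up name, not an IMAQdx call):

```python
def count_skipped(buffer_numbers):
    """Count frames missing from a monotonically increasing sequence of
    buffer numbers, as reported by successive acquired images."""
    skipped = 0
    for prev, cur in zip(buffer_numbers, buffer_numbers[1:]):
        skipped += max(0, cur - prev - 1)  # a jump of k+1 means k frames lost
    return skipped

# e.g. an Outlook folder switch during acquisition: buffers 103-107 never arrive
print(count_skipped([100, 101, 102, 108, 109, 110]))  # -> 5
```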

Message 3 of 4
Here are some things you can do to help:

Some of the examples always ask for the "latest" (Next) image rather than the oldest image you haven't yet seen (Every). That makes sense for some styles of inspection but not others, and it doesn't take advantage of a larger buffer list (100 buffers, for instance) that you may have configured. If you ensure your code is getting sequential images, even if they are old (via explicit buffer numbers or the Every mode), you should not drop buffers unless a CPU spike outlasts your buffer list or your average processing time doesn't let you catch up.
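The difference can be shown with a toy simulation (Python, purely illustrative; `acquire` and its parameters are invented for this sketch, not an IMAQdx API). With "next", a consumer stall loses frames outright; with "every", frames wait in the ring and nothing is lost unless the backlog exceeds the buffer list:

```python
def acquire(total_frames, stall_at, stall_len, num_buffers, mode):
    """Toy model: each step the camera writes frame `step` into a ring of
    num_buffers slots; the consumer reads one frame per step, except during
    a stall (CPU spike) of stall_len steps starting at step stall_at.
    mode="next" grabs whatever is newest; mode="every" grabs the oldest
    frame not yet seen. Returns the list of frames actually obtained."""
    got = []
    next_wanted = 0
    for step in range(total_frames):
        newest = step                      # frame written this step
        if stall_at <= step < stall_at + stall_len:
            continue                       # consumer busy, reads nothing
        if mode == "next":
            got.append(newest)             # older pending frames are ignored
            next_wanted = newest + 1
        else:  # "every"
            oldest_alive = max(0, newest - num_buffers + 1)
            if next_wanted < oldest_alive:
                next_wanted = oldest_alive  # ring overwrote them: real drops
            if next_wanted <= newest:
                got.append(next_wanted)
                next_wanted += 1
    return got

# A 3-step stall with a 100-buffer ring:
print(acquire(20, 5, 3, 100, "next"))   # frames 5-7 are gone for good
print(acquire(20, 5, 3, 100, "every"))  # no gaps; the consumer is just behind
```

(The "every" consumer in this sketch never quite catches up because it reads exactly one frame per camera frame; in practice your per-frame processing is faster than the frame period on average, which is what lets the backlog drain.)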

Since you seem to lose images based on external CPU usage, you may want to try raising the priority of the LabVIEW VI running your acquisition code.
Message 4 of 4