LabVIEW


Real-Time Image Processing

Hello NI & LabVIEW enthusiasts,

 

 

I am trying to create an image processing VI that acquires real-time images from a camera (currently a Logitech C920 Pro). The final objective is to measure the speed at which an object recognized by blob detection is moving. Before I get to measuring speed, I need to make sure the processed images stay continuous while the program runs and do not "black out", which I suspect happens because retrieving an image takes longer than the delay time wired in the block diagram. So for now I am just testing whether my camera and VI can track the fluid movement at the interface without discrepancies such as the processed image not reaching the Front Panel in time.
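To check whether the delay comes from the camera itself or from the VI, one thing I can do is time the raw acquisition outside LabVIEW first. Below is a minimal Python/OpenCV sketch, purely for illustration (it is not part of the VI; the device index 0 and the 640x480 resolution are assumptions), that measures how fast the C920 actually delivers frames:

```python
import time
import cv2

# Open the C920 (device index 0 is an assumption; adjust if needed).
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)    # assumed resolution
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

n_frames, t_start = 0, time.time()
while n_frames < 100:                     # time 100 frames
    ok, frame = cap.read()
    if not ok:
        break
    n_frames += 1

elapsed = time.time() - t_start
print(f"{n_frames} frames in {elapsed:.2f} s -> {n_frames / elapsed:.1f} fps")
cap.release()
```

If the camera alone already delivers frames slowly here, the problem is acquisition; if it is fast, the bottleneck is in the VI.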

 

What I do is this: I retrieve images with Image Acquisition and convert them to a binary image using Vision Assistant. For now, the Image Out is only used to measure the pixel area, and because I am also calibrating the difference in surface area from 2D to 3D, I multiply the pixel area by a calibration factor. The reason is that I want to measure the speed of a fluid flowing over the surface of an object (say, water flowing around a sphere), and I want to measure it correctly by comparing the covered surface area (calibrated from 2D to 3D by that factor) in the image at time 1 versus the image at time 2.
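Since the VI itself is graphical, here is roughly the same logic written out as a Python/OpenCV sketch, just to make the calculation explicit; the threshold value, the calibration factor, and the pixel-to-area scale below are placeholders:

```python
import cv2

CAL_FACTOR = 1.0      # placeholder 2D->3D surface-area calibration factor
MM2_PER_PIXEL = 1.0   # placeholder physical area of one pixel (mm^2)

def covered_area(frame_bgr, threshold=128):
    """Binarize the frame and return the calibrated area covered by the blob."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    pixel_area = cv2.countNonZero(binary)
    return pixel_area * MM2_PER_PIXEL * CAL_FACTOR

def area_speed(frame_t1, frame_t2, dt_seconds):
    """Rate at which the covered surface area changes between two frames."""
    return (covered_area(frame_t2) - covered_area(frame_t1)) / dt_seconds
```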

 

But my problem is that the images I am getting are already not real-time: they arrive with a delay of about 5 seconds, and some of them are blurred or never reach the Front Panel at all.

 

This is the VI that I am using. Because I need to calibrate the surface area from 2D to 3D so the speed can be measured correctly, I am using an image mask to calibrate the different ROIs on my object, each with its own calibration factor (a multiplier that accounts for the difference between 2D and 3D).
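To make the per-ROI calibration idea concrete, here is a small sketch (again Python, purely illustrative) where each ROI of the binary image gets its own 2D-to-3D factor; the rectangles and factors are made-up values:

```python
import cv2

# Hypothetical ROIs: (x, y, w, h) rectangles, each with its own 2D->3D factor.
ROIS = [
    ((0,   0, 320, 480), 1.00),   # flatter region of the sphere
    ((320, 0, 320, 480), 1.35),   # more curved region, larger correction
]

def calibrated_area(binary_image):
    """Sum the blob area per ROI, weighting each ROI by its calibration factor."""
    total = 0.0
    for (x, y, w, h), factor in ROIS:
        roi = binary_image[y:y + h, x:x + w]
        total += cv2.countNonZero(roi) * factor
    return total
```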

 

Still, my problem is that I am supposed to measure this speed in real time (maybe 1-2 seconds of delay at most), but the images themselves already lag by about 5 seconds. On top of that, while water is running over the sphere and the program is running, some of the Image Out frames lose sync with the incoming signal and nothing is shown except a colorful image, which I assume is that signal error.

Message 1 of 3

I think what is slowing things down in your case is that you first wait for the image data to be written to a file, when you could process the acquired image straight away. Removing the Wait (ms) delays will also help.
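The usual way to do that in LabVIEW is a producer/consumer architecture: the acquisition loop processes each frame in memory and hands it off through a queue, and a separate loop writes to disk at its own pace. Just to illustrate the idea in plain text, here is a rough Python sketch (not LabVIEW; the device index and frame count are made up):

```python
import queue
import threading
import cv2

frames = queue.Queue(maxsize=10)   # small buffer between acquisition and logging

def acquire_and_process():
    """Producer: grab frames and process them immediately, never blocking on disk."""
    cap = cv2.VideoCapture(0)      # device index 0 is an assumption
    for i in range(200):
        ok, frame = cap.read()
        if not ok:
            break
        # ... blob detection / area measurement happens here, in memory ...
        try:
            frames.put_nowait((i, frame))   # hand off for optional logging
        except queue.Full:
            pass                            # drop the save, keep acquiring
    cap.release()
    frames.put(None)                        # sentinel: no more frames

def save_to_disk():
    """Consumer: write frames to file at its own pace."""
    while True:
        item = frames.get()
        if item is None:
            break
        idx, frame = item
        cv2.imwrite(f"frame_{idx:04d}.png", frame)

threading.Thread(target=save_to_disk, daemon=True).start()
acquire_and_process()
```

In LabVIEW terms, the acquisition/processing loop is the producer and the File I/O loop is the consumer connected by a queue, so acquisition never has to wait on disk writes or on an arbitrary Wait (ms).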

Message 2 of 3

Hi datatechNDT, thanks for replying.

 

I have tried removing the delay timers, but all the images on the Front Panel now show this type of error (see the attached errorsignal.png).

 

Message 3 of 3