Machine Vision

Measuring Surface Treatment Bleeding

I am currently trying to develop a script within Vision Assistant that can calculate the percent of bled area on the surface of the roadway. In chip seal construction, bleeding occurs when there is too much emulsion (tar) and not enough aggregate (rocks). The emulsion eventually sheens at the surface of the roadway causing skid resistance problems among other things.

 

Because the emulsion is completely black and the aggregate is not (immediately after construction), it is possible to visually examine the roadway and tell if bleeding has occurred. But I would like to be able to automate this process using a camera and Vision Assistant, and find out exactly what percentage of the captured roadway surface is bled.

 

The images I am trying to process look like this:

[image attachment: chip seal roadway surface — link reposted in Message 7]

The emulsion that I'm trying to locate can be seen along the right side of the aggregate as the small black "gaps" between the rocks (ignore the black felt border on both sides).

 

Right now, my script is set to equalize the image and then select a threshold value that isolates these pixels, but is there a more accurate way to do this analysis? With our camera, the lighting conditions will always be the same, but there is no "reference image" that we can work from to determine the correct percentage.

 

I just need to be able to analyze the same type of image and determine the percentage of bled (completely black) area. Thanks in advance for your suggestions.
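For the threshold-and-count part, the calculation is simple enough to sketch outside Vision Assistant. Below is a minimal NumPy illustration (not the poster's actual script): it picks the threshold automatically with Otsu's method instead of a hand-tuned value, then reports the fraction of pixels at or below it. The `percent_bled` function, the synthetic test image, and the choice of Otsu's method are all my own assumptions for the sake of the example.

```python
import numpy as np

def percent_bled(gray, bins=256):
    """Percent of near-black pixels, with the cutoff chosen by
    Otsu's method (maximize between-class variance)."""
    hist, _ = np.histogram(gray, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    omega = np.cumsum(p)                 # probability of the dark class
    mu = np.cumsum(p * np.arange(bins))  # cumulative mean intensity
    denom = omega * (1.0 - omega)
    denom[denom == 0] = np.nan           # avoid divide-by-zero at the ends
    sigma_b = (mu[-1] * omega - mu) ** 2 / denom
    t = int(np.nanargmax(sigma_b))       # threshold maximizing the variance
    return 100.0 * (gray <= t).mean(), t

# Synthetic stand-in for a captured frame: light aggregate with a
# 20% strip of near-black "bleeding" (illustrative data only).
rng = np.random.default_rng(1)
img = rng.integers(150, 220, size=(120, 120))
img[:24, :] = rng.integers(0, 30, size=(24, 120))
pct, t = percent_bled(img)   # pct comes out near 20 for this image
```

Vision Assistant's own histogram and threshold tools can do the same job; the point of the sketch is that once the threshold is derived from the image histogram rather than hard-coded, the reported percentage stays meaningful even if overall brightness drifts slightly between frames.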

 

 

Message 1 of 7
This looks like a 3D vision problem; check out www.cyth.com/3d. That setup uses a National Instruments Enhanced Vision System and Vision Builder for Automated Inspection to bring the ease of LabVIEW to a 3D inspection environment. Just a thought.
Message 2 of 7

I can see two problems with your image:

  1. Poor contrast between the color (grayscale) of the aggregate and the emulsion. I'm not sure if this can be improved. Provided that you are using a well-defined (laboratory) illumination setup, you might try different colors of lighting, e.g. by placing color filters in front of the lights, or you might even try an IR light source. The contrast ratio can be higher with a different light color, but everything depends on the material properties.
  2. Light reflections on the emulsion. These make a lot of pixels much brighter than you'd like them to be. You might try the classic anti-reflection setup, i.e. putting light-polarizing filters in front of the lights and a second (movable) polarizing filter in front of the camera lens.

Message 3 of 7

I would have to agree with Vladimir. 

 

Because you are working with material that is essentially black against what I'm guessing is a very dark shade of grey, lighting becomes extremely important to ensure a robust application.

 

 

Tejinder Gill
National Instruments
Applications Engineer
Visit ni.com/gettingstarted for step-by-step help in setting up your system.
Message 4 of 7
I am currently using a Balluff SharpShooter, controlled by its machine-vision software, to capture the images. This camera uses red LED lights to illuminate the surface. I will try changing the light color, but it may be better to use an outside light source. Unfortunately, the machine that I'm developing is for use in the field, so controlling light conditions could be a bit difficult, but I'll take that into consideration. Thank you for the help 🙂
Message 5 of 7
Is it just me, or is the image from the first message not showing? One thing you can do to control ambient lighting is to use a very high-intensity light and operate the camera with a very low aperture. This is what I do when a system is going to run outdoors. It will definitely give you consistency.
Message 6 of 7

I'm sorry if the image isn't showing up, I'm kind of new to this 🙂

 

http://img175.imageshack.us/img175/2920/after78msinglesample3.png

 

Unfortunately, with the SharpShooter I don't think it's possible to control the aperture. It works much like a desktop flatbed scanner: the focal length is fixed and the camera only shoots what's directly in front of it (straight down, in my case). This camera is aimed more at assembly-line quality control, checking manufactured parts for defects, but it was within our price range and I'm hoping we can get NI Vision Assistant to analyze the data the way we want.

Message 7 of 7