12-22-2005 01:18 AM
Lorne,
Sorry it has taken a while for me to get back to you. I was ill on Wednesday and unfortunately missed a day on this project. At least today I have some images to show you.
To answer your last questions:
1. Cameras don't have auto anything.
2. Lighting may not be brilliant, but has been OK for the previous system which uses similar resolution analogue cameras.
3. Cameras are firmly mounted from a floor to ceiling bar with camera mounts attached. Doesn't mean there is absolutely no vibration, but it has never been evident in the previous system output.
Before you look at the images, I should say that the graphs (from our interfaced quality system) are several days old, and some improvement has been made since then by implementing a 1st order Butterworth filter plus a peak hold mechanism on the data being sent to the quality system. Even so, the attached images illustrate the underlying variation I've been trying to describe.
The 'proof of the pudding', so to speak, is the quality of the printed paper graph which is sent to our customers. Later today I will scan two examples and post them. One will be a 'good' graph from our old system and the other an example of the best achieved so far with the new arrangement. The printed graphs generally look better than the screen presentations, as they contain fewer points and only show peak values. That is, no negative transitions appear in the printed graphs.
Although I am technically happy enough with the graphs produced .... our quality people are not! I have continually assured them that the quality characteristics obtained from the 2 arrangements will be virtually identical .... but they remain unimpressed!
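For what it's worth, the filtering and peak hold described above can be sketched like this in Python. This is only an illustration: the real processing runs on the NI side, the filter topology (first-order Butterworth via the bilinear transform) is my assumption about a sensible implementation, and the cutoff/sample-rate values are made up.

```python
import math

def butterworth_lowpass_1st(samples, fc, fs):
    """First-order Butterworth low-pass via the bilinear transform.
    fc: cutoff frequency (Hz), fs: sample rate (Hz). Illustrative only."""
    k = math.tan(math.pi * fc / fs)
    b0 = b1 = k / (k + 1.0)
    a1 = (k - 1.0) / (k + 1.0)
    out = []
    x_prev = y_prev = samples[0]   # initialise to avoid a start-up step
    for x in samples:
        y = b0 * x + b1 * x_prev - a1 * y_prev
        out.append(y)
        x_prev, y_prev = x, y
    return out

def peak_hold(samples):
    """Pass on only the running maximum, so no downward (negative)
    transitions reach the quality system."""
    out, peak = [], float("-inf")
    for x in samples:
        peak = max(peak, x)
        out.append(peak)
    return out
```

During a tensile test the true extension only ever increases, which is why holding the peak is a reasonable way of suppressing the negative transitions: `peak_hold([0.0, 0.1, 0.05, 0.2])` gives `[0.0, 0.1, 0.1, 0.2]`.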
Unfortunately, due to possible issues when production commences post-Christmas, I decided today to remove the new system for the time being and revert to our aging and rather temperamental old system. I'm going to try and set up a reliable testing arrangement in my office to continue investigating the system issues.
Anyway, to the images ....
'ncxtop.png' and 'ncxbottom.png' are snapshots taken this morning of the top and bottom camera views. They give an idea of the lighting levels. The cable ties are the gauge markers to be detected. I achieve this with simple edge detection through the centre of the sample.
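In outline, the edge detection amounts to scanning an intensity profile through the centre of the sample and reporting where the step occurs. A minimal sketch of that idea (not the actual Vision Builder AI edge tool, which I assume does rather more; the threshold value is arbitrary):

```python
def find_edge(profile, threshold):
    """Return the index of the first pixel where the intensity step
    between neighbouring pixels exceeds threshold. A dark cable tie on
    a lighter background produces one large step along the profile."""
    for i in range(1, len(profile)):
        if abs(profile[i] - profile[i - 1]) >= threshold:
            return i
    return None   # no edge found along this profile
```

For example, on a profile like `[200, 198, 201, 90, 85, 88]` with a threshold of 50, the edge is reported at index 3.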
'raw.doc' shows a relatively early screen image based on raw (i.e. unfiltered) extension measurements from the new measurement system. Our terminology for the system is 'NCX', meaning 'non-contact extensometer'. I had another image showing a graph based on averaged data, but it now seems to refuse to copy from the floppy!!!
In 'raw.doc' you'll see that the x axis is from 0 - 1.5%. This is percentage extension of 600mm. So 0.1% is 0.6mm. Field of view of the camera is around 150mm (probably a bit more). This gives 1 pixel = 0.23mm = 0.038%. This is close to the size of the stepwise variations in extension seen in the graph, and is what I need to reduce. As I've mentioned earlier, I expected to see less variation than this from the system. As you will see when I post the printed graphs, the printed output looks much cleaner, but still shows significant variation.
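The arithmetic above, written out. Note the 640-pixel sensor width is my assumption (a common analogue-camera resolution; the actual width isn't stated, which is presumably why the figures come out at ~0.234 mm / ~0.039% rather than exactly 0.23 mm / 0.038%):

```python
def pixel_resolution(fov_mm, sensor_px, gauge_mm):
    """Size of one pixel in mm, and as a percentage extension
    of the gauge length."""
    mm_per_px = fov_mm / sensor_px
    return mm_per_px, 100.0 * mm_per_px / gauge_mm

# "around 150mm" field of view, 600mm gauge length, assumed 640-pixel sensor:
mm, pct = pixel_resolution(150.0, 640, 600.0)   # ~0.234 mm, ~0.039 %
```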
I don't know if the attached images will give you any new clues, but at least they may make what I am trying to achieve a little clearer. Our old system seems to (by means unknown) produce a more precise extension measurement, but is extremely temperamental to both operate and calibrate! The NI system is a breeze to operate and calibrate, but I need to improve the precision obtained so far to satisfy our users and customers.
As always, your assistance is greatly appreciated!!
I now have the printed graphs as pdf files and will attach to the next post.
Thanks again,
Greg
12-22-2005 01:25 AM
Lorne,
Attached are pdf files of graphs, which are usually signed by our testers and sent to customers along with additional test certification. 012730.pdf is a 'good' graph produced by our old system on 15th December. 000005.pdf was produced today using the NI system. Pay no attention to the 'REJECT' imprint. This sample has actually been stressed several times and no longer has the expected material properties. The 'quality' of the curve is the point in question.
Greg
12-28-2005 08:40 AM
Greg,
Nice Test System. Hopefully this will eventually save the quality people a lot of time.
Neither ncxtop.png nor ncxbottom.png looks sharp enough. When I look at ncxtop.png I can see between 3 and 4 pixels representing the edge of the zip tie. In the proper environment, your zip tie's edge should be completely defined within 2 pixels. I get the 3 to 4 pixels by counting the pixels in the blur that defines the edge of the zip tie in the image.
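One way to make that pixel-counting objective is to measure the transition width of a line profile taken across the edge. A rough sketch, using the common 10%–90% convention (my choice of convention, nothing NI-specific):

```python
def edge_width_px(profile):
    """Number of pixels the edge transition spans, measured between
    10% and 90% of the total intensity step along a line profile."""
    lo, hi = min(profile), max(profile)
    t10 = lo + 0.1 * (hi - lo)
    t90 = lo + 0.9 * (hi - lo)
    # indices strictly inside the transition band
    inside = [i for i, v in enumerate(profile) if t10 < v < t90]
    return (inside[-1] - inside[0] + 1) if inside else 0
```

A crisp edge like `[200, 200, 40, 40]` measures 0 pixels of blur, while a soft one like `[200, 180, 120, 60, 40]` measures 3; a well-focused, well-lit edge should come out at 2 or fewer.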
Two things to try:
1. Increase the light applied to the zip tie (use a flashlight if you have to). See if the noise is reduced, and check whether the edge is better defined in the picture.
2. Move the camera further away from or closer to the zip tie, and make sure that the camera is focused on the zip tie and not on the cable (or somewhere else).
I think that once the quality of the image improves to the point that the edge is defined within 2 pixels, you should start seeing the precision you want.
A couple of other questions...
1. I noticed that the top and the bottom of the zip tie are in different locations than the middle of the zip tie. What do you do to account for this variation?
2. Since the zip tie has a noticeable thickness, if the camera is not pointed directly above the zip tie you will see both the side and the top of the zip tie. If the camera was initially directly above the zip tie so that only its top was visible, and then during testing of the cable the side came into view, you would get a faulty reading. How do you account for this?
01-02-2006 08:18 PM
Lorne,
Hi and Happy New Year!!
I'm back at work today and keen to see if I can make some improvement to the system. Have to say that indications so far aren't encouraging. As I mentioned last week, first task I set myself was to set up a test rig to give a clearer indication of normal system operation. I'll send a photo of the rig tomorrow, as soon as I can get hold of a camera. It is pretty basic! Just the camera on a standard camera tripod, facing a fine Vernier gauge we use for calibrating our mechanical extensometers. Anyway, the rig seems to work fine.
The data in the attached spreadsheet was obtained from a simple single camera application in Vision Builder AI. It detects the edge of a zip tie applied to the Vernier gauge. With the Vernier in its initial position the application is started. Then the Vernier is continuously and evenly moved, by hand, through a 5mm range. The AI application reports the raw X value of the edge pixel to COM1. These values were captured using HyperTerminal and then imported into Excel for presentation.
Unfortunately, the graphs for both cameras show quite distinct transitions of approximately 0.8 pixels. I have to say that I don't understand the cause of these large edge transitions. If you can give me any idea as to the likely cause, or how I might be able to minimise them, I would be extremely grateful. I will continue investigating the effects of lighting and different gauge markers. These results are for a black zip tie on an aluminium body (see attached image). At the moment I'm only using ambient lighting. Contrast could be better, but the symptom indicated in the graphs is what I expect has also been evident in the plant installation.
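Steps of roughly a pixel suggest the reported edge X is being quantised to whole pixels. One common remedy is to interpolate the threshold crossing between the two pixels that straddle it, giving a fractional edge position. A sketch of the generic technique (not necessarily what Vision Builder AI does internally, and the threshold is arbitrary):

```python
def subpixel_edge(profile, threshold):
    """Fractional-pixel position where the intensity profile first
    crosses threshold, by linear interpolation between the two
    pixels straddling the crossing."""
    for i in range(1, len(profile)):
        a, b = profile[i - 1], profile[i]
        if (a < threshold) != (b < threshold):
            return (i - 1) + (threshold - a) / (b - a)
    return None   # no crossing found
```

On a blurred edge like `[10, 10, 60, 110]` with a threshold of 100, this reports the edge at 2.8 pixels rather than snapping to 2 or 3.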
Thanks,
Greg Shearer
01-02-2006 09:30 PM
Lorne,
I may be on to something! Just for fun (more or less) thought I'd try pattern matching of the zip tie instead of edge detection. Unless I'm missing something .... the result is fantastic! This is the sort of outcome I'd hoped to achieve with edge detection. Any idea why one method should have so much better apparent resolution than the other? Are there likely to be any catches you're aware of associated with this technique? Test arrangement was unchanged from that for edge detection.
I'm not sure if this approach will work as well in the production system, but certainly looks encouraging!
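One plausible explanation for pattern matching doing so much better: a correlation-based match averages over the whole template, and the correlation peak can be interpolated to a fraction of a pixel, whereas a single edge crossing is quantised to the pixel grid. A 1-D sketch of normalised cross-correlation with parabolic peak interpolation (an illustration of the idea, not NI's actual matching algorithm):

```python
import math

def ncc_scores(signal, template):
    """Normalised cross-correlation score of template at each offset."""
    n = len(template)
    tmean = sum(template) / n
    t = [v - tmean for v in template]
    tnorm = math.sqrt(sum(v * v for v in t))
    scores = []
    for off in range(len(signal) - n + 1):
        w = signal[off:off + n]
        wmean = sum(w) / n
        wc = [v - wmean for v in w]
        wnorm = math.sqrt(sum(v * v for v in wc))
        denom = tnorm * wnorm
        scores.append(sum(a * b for a, b in zip(t, wc)) / denom if denom else 0.0)
    return scores

def subpixel_peak(scores):
    """Fit a parabola through the peak score and its neighbours,
    returning a fractional-pixel match position."""
    i = max(range(len(scores)), key=scores.__getitem__)
    if 0 < i < len(scores) - 1:
        y0, y1, y2 = scores[i - 1], scores[i], scores[i + 1]
        denom = y0 - 2 * y1 + y2
        if denom != 0:
            return i + 0.5 * (y0 - y2) / denom
    return float(i)
```

Because every template pixel contributes to each score, pixel-level noise tends to average out. One catch to watch for in production: if the marker's appearance changes during the test (lighting, rotation, the side of the tie coming into view, as Lorne raised earlier), the correlation peak can drift or drop, so the match score is worth monitoring.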
Greg
01-04-2006 08:57 AM