I've got a CV-A10GE camera from JAI. It seems to work fine using Stemmer Imaging's software (GenICam), but I can't get it working in LabVIEW. It appears as cam0 in MAX, but when I try to highlight it there I get an error dialog:
The session for your device could not be configured.
In LabVIEW I can get the camera to show up using IMAQdx Enumerate Cameras.vi, but I can't get any images out of it. When I try to open the camera I get:
Error -1074360319 occurred at IMAQdx Open Camera.vi
NI-IMAQdx: (Hex 0xBFF69001) Internal error
I should note that I've removed all the Stemmer Imaging CVB-related software and the JAI SDK, so it's only LabVIEW trying to access this camera. I have another USB CMOS camera working in LabVIEW on the same system, so it's definitely a GigE problem. Any suggestions anyone?
Try to delete the xml, iid and icd located in Documents and Settings/All Users/Shared Documents/National Instruments/NI-IMAQdx/Data, then unplug your camera and plug it again.
If this doesn't resolve your problem, execute the Camera Validator as described here http://digital.ni.com/public.nsf/allkb/EBE3BBBEF4F
It can help identify the problem.
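If you want to script that cleanup, here is a minimal sketch (the path is the default NI-IMAQdx data directory mentioned above; adjust it for your system, and note this helper is my own, not part of any NI tool):

```python
import glob
import os

# Default NI-IMAQdx camera-file cache on Windows XP (adjust for your system)
DATA_DIR = (r"C:\Documents and Settings\All Users\Shared Documents"
            r"\National Instruments\NI-IMAQdx\Data")

def clear_imaqdx_cache(data_dir):
    """Delete the cached .xml, .iid and .icd camera files so IMAQdx
    re-reads them from the camera the next time it is plugged in."""
    removed = []
    for ext in ("*.xml", "*.iid", "*.icd"):
        for path in glob.glob(os.path.join(data_dir, ext)):
            os.remove(path)
            removed.append(path)
    return removed
```

Then unplug and re-plug the camera as described, so the files get regenerated from scratch.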
Thanks for the response toto. I deleted the XML file earlier and that did help. I had an empty one, presumably from an earlier attempt to run the camera on a broken cable. I'm now getting:
FAIL: Error Code = 0xBFF69031: The system did not receive a test packet from the camera. The packet size may be too large for the network configuration or a firewall may be enabled.
from all tests. This is with or without the firewall activated.
Running CameraValidator tells me the same thing for every frame attempt. Using /ATTRIBUTES shows all tests passing except the PeakBandwidth / SCPD attributes, which fail with a value-out-of-range error.
Did you enable Jumbo Frames on your NIC? If not, set the frame size to the maximum value. Then check that your firewall is disabled.
This kind of error is often caused by one of those two things.
I don't know what causes the value-out-of-range error on those attributes. Do you have a reset attribute you can execute?
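One quick way to check whether jumbo frames actually make it through end-to-end is a ping with a large payload and the don't-fragment flag set, so the echo only succeeds if the whole path passes jumbo frames. A sketch (the camera IP and payload size here are placeholders, not values from this thread):

```python
import subprocess
import sys

def jumbo_ping_cmd(camera_ip, payload_bytes=8000):
    """Build a don't-fragment ping command with a jumbo-sized payload.
    Windows ping uses -f (don't fragment) and -l (size);
    Linux ping uses -M do and -s."""
    if sys.platform.startswith("win"):
        return ["ping", "-f", "-l", str(payload_bytes), camera_ip]
    return ["ping", "-M", "do", "-s", str(payload_bytes), "-c", "4", camera_ip]

if __name__ == "__main__":
    # 169.254.1.10 is a placeholder; substitute your camera's address
    print(subprocess.list2cmdline(jumbo_ping_cmd("169.254.1.10")))
```

If this ping fails while a normal-sized one works, something between the NIC and the camera is not passing jumbo frames.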
Hi - thanks again for the response. Jumbo packets are set to 9014 bytes, together with an "extreme" interrupt moderation rate and a large 2048-byte receive buffer setting. The firewall is off. I'm still getting this complaint in MAX about not receiving a test packet from the camera.
Am I doing something REALLY stupid here do you think?
Just reading some documentation on the way home: CVB requires me to set the NIC address to 169.254.x.x on subnet 255.255.0.0. Does NI-IMAQdx really require me to use DHCP and wait for the DHCP timeout (on XP)? If this is an absolute requirement then that could be the solution (although it would mean IMAQdx is not strictly GigE compliant).
Any thoughts on whether this could be the answer anyone?
Setting a 169.254.x.x address with a subnet mask of 255.255.0.0 will be sufficient. All IMAQdx requires is that the camera's address be correctly addressable by the host PC. Generally we recommend customers configure everything to "automatic" so that both the camera and the host follow the link-local address mechanism described by RFC 3927. If you manually configure a 169.254/16 address then you violate this RFC and need to take care not to create conditions that could cause the mechanism to fail. Generally, though, as long as you can ping the device, your IP connectivity is set up properly and IMAQdx should work correctly. Also, in Windows Vista and above, the link-local process no longer takes 60 seconds, so there is no advantage to hard-coding a statically assigned link-local-style address.
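As a quick sanity check of the addressing above, Python's ipaddress module can confirm that both the host NIC and the camera sit in the 169.254/16 link-local range (the addresses below are examples, not values from your setup):

```python
import ipaddress

def on_link_local_subnet(host_ip, camera_ip):
    """True if both addresses fall in 169.254.0.0/16, i.e. the
    RFC 3927 link-local range, matching a 255.255.0.0 mask."""
    net = ipaddress.ip_network("169.254.0.0/16")
    return (ipaddress.ip_address(host_ip) in net and
            ipaddress.ip_address(camera_ip) in net)
```

If this returns True and the camera answers a ping, the IP side is in order and any remaining failure is elsewhere.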
As for your original problem(s), I have some thoughts:
It seems that the problem with the PeakBandwidth attribute and the SCPD attribute is ultimately caused by the camera's XML file specifying a maximum available bandwidth that is not 1000 Mbit. The test code in the attribute validator tries to set the attribute to its maximum and then checks whether it is valid. When we set it, the value has to be translated from Mbit/sec to camera clock ticks in the SCPD register. Since the maximum of the bandwidth control is derived from the range of SCPD, the two numbers should be equivalent; however, because this is floating-point math, you don't end up with an exact integer, and some rounding has to happen. It appears that IMAQdx only rounds the SCPD register _down_, even though in this unique case it needs to round _up_ (which is generally not needed, since most cameras only have limits on how much you can throttle the bandwidth, not how little). In general this issue should only affect the validation tool (I think). As long as you don't modify the PeakBandwidth attribute (or you set it one notch below the ~996 value it maxes out at), it should not be a problem.
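To make the rounding argument concrete, here is a toy model of the conversion (the constants and functions are invented for illustration; they are not JAI's or NI's real values). Bandwidth is the inverse of the SCPD delay, so the bandwidth maximum corresponds to the register minimum, and a conversion that always rounds the register down can land one tick below that minimum when the floating-point division doesn't come back exact:

```python
import math

# Toy constants, invented for illustration only
K = 123456.0              # scale between Mbit/s and tick-denominated delay
SCPD_MIN, SCPD_MAX = 7, 5000   # legal range of the delay register

def bw_from_ticks(ticks):
    """Bandwidth (Mbit/s) implied by an SCPD delay of `ticks`:
    more delay between packets means less bandwidth."""
    return K / ticks

def ticks_from_bw_floor(bw):
    """Driver-style conversion that always rounds the register down --
    at the bandwidth maximum this can fall below SCPD_MIN."""
    return math.floor(K / bw)

def ticks_from_bw_safe(bw):
    """Round up instead, then clamp into the register's legal range,
    so the requested bandwidth is never exceeded."""
    return min(max(math.ceil(K / bw), SCPD_MIN), SCPD_MAX)
```

Sweeping every legal SCPD value, the round-up-and-clamp version always round-trips back into range, whereas the floor version can drop a tick below the minimum, which is the out-of-range failure the validator reports.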
As for the test packet failure, have you tried changing the packet size to 1500 bytes? If that works, it could indicate an issue with receiving jumbo frames (possibly caused by other network hardware, like a switch, in the path). If it still doesn't work, you could try disabling the Test Packet support under the Advanced Ethernet settings of the camera attributes tab in MAX (you might need to enable the "Show all attributes" filter on the tree first). In general, when you get this failure it is not a fault of the test itself but rather that the packets aren't getting through, which means that if you disable the test packet you'll simply get a more general timeout error instead. That said, there have been one or two rare exceptions where specific cameras had "broken" test packet implementations that did not work properly; all of those that I know about were fixed in updated camera firmware long ago.
Hope this helps,
Fixed the peak bandwidth test failure as you described. When I disable test packet support it just gives a timeout after 5 seconds, exactly as you said. Setting packet size to anywhere from 1440 to 4000 gives the same errors when I try to acquire an image.
So as you said, it looks as though packets are not getting through. This is strange, as the equipment works with the CVB / GenICam test software. There's no network equipment between the camera and PC. It's just an Intel gigabit NIC in the PC and a cable to the camera.
I'm running out of hair to tear out. Does anyone have any further suggestions?
SOLUTION: Replace the Intel driver for the network interface card with the NI GigE Vision driver which is presented as an option when you choose "update driver" in the XP -> Control Panel -> System -> hardware list.
This would almost certainly point to a software firewall being present on your machine that was blocking the image data packets. Newer versions of IMAQdx support a feature in later versions of the GigE Vision spec that allows firewall traversal, but it requires specific camera support to work properly (and I'm not sure whether JAI supports it). However, you're better off using the NI driver anyway.