
Machine Vision


IMAQdx timestamp off-by-one

Solved!

I have a network camera and it provides its own timestamp via the IMAQdxTimestampHigh and IMAQdxTimestampLow properties. I'm observing something I don't expect -- the timestamp appears to be off by one image.
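The two properties are the 32-bit halves of one 64-bit counter. A minimal sketch of recombining them (the helper function is hypothetical; only the attribute names come from the driver):

```python
def combine_timestamp(high, low):
    """Combine the 32-bit IMAQdxTimestampHigh/Low halves into one
    64-bit tick count (masking guards against sign-extended inputs)."""
    return ((high & 0xFFFFFFFF) << 32) | (low & 0xFFFFFFFF)

# High word 1, low word 2 -> 0x1_0000_0002 ticks
print(combine_timestamp(1, 2))  # 4294967298
```

The tick unit itself is camera-defined, so the absolute values only make sense relative to the reset.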

 

I have a simple VI that shows this behavior. In it, I open an IMAQdx session with my camera, configure a grab, reset the timestamp counter, and read the images sequentially. When I examine the timestamp associated with each image, the first image carries what appears to be the (N+1)th image's timestamp from the previous grab, as if stale data were left in memory.

 

To see this behavior, run the "Exercise Timestamp Reset.vi" twice. Before you run it the second time, take note of the value for "Last Timestamp". When you run it the second time, the value for "First Timestamp" is slightly higher than "Last Timestamp", which suggests stale data. I would expect "First Timestamp" to be close to zero.
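The symptom can be reproduced in miniature: if a leftover frame survives between runs, the first read of the second run returns the previous acquisition's data. This is a toy simulation of that hypothesis, not IMAQdx code:

```python
from collections import deque

class FakeRing:
    """Toy buffer queue: a frame left over from run 1 survives into run 2."""
    def __init__(self):
        self.buffers = deque()

    def acquire(self, timestamps):
        # New frames queue up behind whatever is already waiting.
        self.buffers.extend(timestamps)

    def read(self):
        return self.buffers.popleft()

ring = FakeRing()
ring.acquire([0, 10, 20, 30])   # run 1: timestamps after a counter reset
for _ in range(3):              # ...but only 3 of the 4 frames are read
    ring.read()
ring.acquire([0, 10, 20])       # run 2: counter reset again
print(ring.read())              # 30 -- the stale frame, not ~0
```

The stale value (30) is slightly higher than run 1's last *read* timestamp (20), matching the "First Timestamp slightly higher than Last Timestamp" observation.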

 

What could be causing this behavior, and how can I fix my code? So far, I've tried:

  1. Moving the "Reset Timestamp Counter.vi" so that it is called immediately after creating the IMAQdx session,
  2. Creating an IMAQdx session, resetting the timestamp counter, and clearing the session before creating a second IMAQdx session for the grab.

My particular configuration:

  • Allied Manta MG 504B ASG
  • LabVIEW 2009 SP1
  • IMAQdx 3.9.1
Message 1 of 12
Solution
Accepted by topic author MacNorth

Hi,

 

That is a really nice, clean example you provided!

 

I was not able to reproduce this with either a Basler Ace or an AVT GX1050 camera that I had handy (the first timestamp was always correct). Given that IMAQdx decodes these timestamps directly from each image as it arrives, it seems unlikely that the driver could cause this condition unless the camera was actually sending the stale value.

 

My suspicion is that this is a firmware issue on your Manta camera. Can you see if there is a firmware update you can install?

 

Additionally, if you could provide a network trace from something like Wireshark showing the traffic during the second acquisition, it should be very easy to narrow down whether the problem is in the camera or in IMAQdx.

 

Eric

Message 2 of 12

BlueCheese wrote:
 

My suspicion is that this is a firmware issue on your Manta camera. Can you see if there is a firmware update you can install?


I've started that inquiry. It seems that Allied doesn't provide downloads from their site. I hope they are responsive.

 


BlueCheese wrote:

 

Additionally, if you could provide a network trace from something like Wireshark showing the traffic during the second acquisition, it should be very easy to narrow down whether the problem is in the camera or in IMAQdx.


I've attached what you requested, I think. I configured Wireshark with a capture filter using my camera's IP via 'host nnn.nnn.nnn.nnn'. I saw register I/O, but didn't see the image payload.

Message 3 of 12

Hi,

Sorry -- when using the High Performance Driver (which I'm guessing you are), it processes all the GigE Vision traffic before it has a chance to reach the Wireshark driver. You could either switch back to the Intel driver, or I believe you can set a hidden Boolean attribute called "UseHighPerformanceDriver" to false, which forces the driver to use the standard path even though the high-performance one is present. (Starting from the default configuration, you may also need to either enable Jumbo Frames for "all traffic" or reduce the packet size to 1500.) This will allow the Wireshark capture to include the image data as well.
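As a rough illustration of why the packet size matters here: a frame's UDP packet count scales inversely with the payload per packet. A sketch with assumed numbers (the ~5-megapixel Mono8 resolution and the 36-byte per-packet overhead are illustrative, not exact GVSP figures):

```python
import math

def packets_per_frame(image_bytes, packet_size, header_bytes=36):
    """Rough GVSP packet count per frame; header_bytes is an assumed
    per-packet overhead (IP/UDP/GVSP headers), not an exact figure."""
    payload = packet_size - header_bytes
    return math.ceil(image_bytes / payload)

frame = 2452 * 2056  # assumed ~5-Mpixel Mono8 frame, ~5 MB
print(packets_per_frame(frame, 1500))  # standard Ethernet frames
print(packets_per_frame(frame, 9000))  # jumbo frames: roughly 6x fewer
```

Fewer, larger packets is why the high-performance path pushes toward jumbo frames; dropping back to 1500 multiplies the per-packet processing the standard stack (and Wireshark) must keep up with.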


Eric

Message 4 of 12
Whoa. Thanks for your insta-reply 🙂

BlueCheese wrote:

Hi,

Sorry -- when using the High Performance Driver (which I'm guessing you are), it processes all the GigE Vision traffic before it has a chance to reach the Wireshark driver. You could either switch back to the Intel driver, or I believe you can set a hidden Boolean attribute called "UseHighPerformanceDriver" to false, which forces the driver to use the standard path even though the high-performance one is present. (Starting from the default configuration, you may also need to either enable Jumbo Frames for "all traffic" or reduce the packet size to 1500.) This will allow the Wireshark capture to include the image data as well.


Hrm, this isn't working. Using the basic ideas outlined in a troubleshooting article, I was able to switch my camera NIC driver to the "Intel(R) PRO/1000 GT Desktop Adapter" driver. The card's PID is 0x107C, and I downloaded the driver directly from Intel.

 

The driver properties dialog box does not have an Advanced tab, so I cannot enable jumbo frames. However, I cannot take a snap in MAX when lowering the packet size as suggested by the article. After setting the Packet Size to 1024, clicking Save, and then clicking Snap, I receive error 0xBFF69031: "The system did not receive a test packet from the camera. The packet size may be too large for the network configuration or a firewall may be enabled." Other sizes failed as well: 128, 500, 1500.

 

When I switch back to the NI driver "National Instruments GigE Vision Adapter", I can perform a snap in MAX with the smaller packet size.

Message 5 of 12

I'm guessing the Windows firewall is on (the High Performance Driver also bypasses this). You could either disable it, or, if you are using a recent version of IMAQdx (and your camera firmware supports it), there is an attribute called Firewall Traversal that you can enable.


Eric

Message 6 of 12

BlueCheese wrote:

 

I'm guessing the Windows firewall is on (the High Performance Driver also bypasses this).


After perusing the readme, I would like to request that the "subvert the firewall" default behavior at least be called out -- talk about a surprise security breach. At least on my system, I see that the driver is now signed 🙂

 

The firewall is indeed enabled, but both MAX and LabVIEW have pass-through exceptions. Other than a blanket-disable, what other entities need access granted?

Message 7 of 12

Hi MacNorth,

 

If you need to keep your firewall on, you will also need to allow UDP ports 3956 and 49152-65535 (GigE Vision standard) through your firewall, I believe. To do this, please follow the directions below (written for Windows 7).

 

1) Open Windows Firewall with Advanced Security. On the left, you should see Inbound Rules.
2) Right-click Inbound Rules and choose "New Rule."
3) Under Rule Type, choose "Port."
4) Under "Protocol and Ports," for "Does this rule apply to TCP or UDP?", choose "UDP." For "Does this rule apply to all local ports or specific local ports?", choose "All local ports," or choose "Specific local ports" and enter the ports listed above.
5) Under "Action," choose "Allow the connection."
6) Under "Profile," keep all boxes checked.
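The ports in the steps above cover the GigE Vision control channel and the ephemeral stream range. A quick hypothetical check of what such a rule must admit (port roles as described in this thread):

```python
# 3956 is the GVCP control port; 49152-65535 is the ephemeral range
# that carries the GVSP image stream, per the steps above.
GIGE_UDP_PORTS = [(3956, 3956), (49152, 65535)]

def rule_admits(port):
    """True if a UDP port falls inside the ranges the firewall rule opens."""
    return any(lo <= port <= hi for lo, hi in GIGE_UDP_PORTS)

print(rule_admits(3956))   # True  (control traffic)
print(rule_admits(50000))  # True  (stream traffic)
print(rule_admits(80))     # False (unrelated traffic stays blocked)
```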

Tim O

Applications Engineer
National Instruments
Message 8 of 12

@MacNorth wrote:

@BlueCheese wrote:

 

I'm guessing the Windows firewall is on (the High Performance Driver also bypasses this).


After perusing the readme, I would like to request that the "subvert the firewall" default behavior at least be called out -- talk about a surprise security breach. At least on my system, I see that the driver is now signed 🙂

 

The firewall is indeed enabled, but both MAX and LabVIEW have pass-through exceptions. Other than a blanket-disable, what other entities need access granted?


Maybe I used the wrong terminology here... It's not so much "subverting the firewall" as processing the GigE Vision-related traffic at the lowest layer of the network driver, well before it gets handled by the network stack and any firewalls on the system. This is somewhat implied by using the "High Performance GigE Vision Driver." The firewall is still in place for any other traffic to the system.

 

When you are _not_ using the high-performance driver, the GigE Vision data does go through the firewall. However, since reception is done in the OS kernel for performance reasons, the firewall doesn't work quite the same way as it does for applications. The exceptions you can modify in the list relate to individual programs, not the kernel; the kernel effectively has unrestricted access to the network stack. Because of the way the firewall works, though, it does end up blocking UDP traffic that is not "connection-based": inbound UDP traffic is not allowed unless outbound traffic along the same path was recently detected. In the original GigE Vision 1.0 standard the stream was always a unidirectional path, with no way to make it appear bidirectional. Later versions added a feature that makes this possible, and today most cameras support it. This is the "Firewall Traversal" feature I mentioned earlier, which can be enabled inside IMAQdx. It is likely the easiest option and doesn't require any changes to your firewall configuration.
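The "connection-based" UDP behavior described above can be sketched as a state table: inbound datagrams are admitted only if matching outbound traffic was seen recently. This is a toy model of the idea, not the actual Windows Firewall implementation:

```python
import time

class StatefulUdpFirewall:
    """Toy model of stateful UDP filtering: inbound traffic on a path is
    allowed only if outbound traffic on the same path was seen within
    `timeout` seconds."""
    def __init__(self, timeout=30.0):
        self.timeout = timeout
        self.outbound_seen = {}  # (local, remote) -> last outbound time

    def record_outbound(self, local, remote):
        self.outbound_seen[(local, remote)] = time.monotonic()

    def allow_inbound(self, local, remote):
        last = self.outbound_seen.get((local, remote))
        return last is not None and time.monotonic() - last < self.timeout

fw = StatefulUdpFirewall()
stream = ("host:50000", "camera:50000")
print(fw.allow_inbound(*stream))   # False: unsolicited inbound is dropped

# "Firewall Traversal" in spirit: the receiver first sends something
# outbound on the stream path, so the camera's inbound stream now
# matches existing state and passes.
fw.record_outbound(*stream)
print(fw.allow_inbound(*stream))   # True
```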

 

Eric

 

 

Message 9 of 12

BlueCheese wrote:
Maybe I used the wrong terminology here... It's not so much "subverting the firewall" as processing the GigE Vision-related traffic at the lowest layer of the network driver, well before it gets handled by the network stack and any firewalls on the system. This is somewhat implied by using the "High Performance GigE Vision Driver." The firewall is still in place for any other traffic to the system.
When you are _not_ using the high-performance driver, the GigE Vision data does go through the firewall. However, since reception is done in the OS kernel for performance reasons, the firewall doesn't work quite the same way as it does for applications. The exceptions you can modify in the list relate to individual programs, not the kernel; the kernel effectively has unrestricted access to the network stack. Because of the way the firewall works, though, it does end up blocking UDP traffic that is not "connection-based": inbound UDP traffic is not allowed unless outbound traffic along the same path was recently detected. In the original GigE Vision 1.0 standard the stream was always a unidirectional path, with no way to make it appear bidirectional. Later versions added a feature that makes this possible, and today most cameras support it. This is the "Firewall Traversal" feature I mentioned earlier, which can be enabled inside IMAQdx. It is likely the easiest option and doesn't require any changes to your firewall configuration.


Thanks for your clarification -- I'm not so alarmed 🙂

 

I was able to save another Wireshark capture after enabling Firewall Traversal, and this time I saw a lot more traffic. I set the camera's PacketSize to 1200 and saw many UDP transfers just above that size.

 

One thing that was a bit strange: I could only enable Firewall Traversal from MAX via View Options » All Attributes, then checking the box for AcquisitionAttributes » AdvancedEthernet » FirewallTraversal » Enabled. I didn't see an equivalent in the LabVIEW Class Browser: Firewall Traversal was not present in the Advanced Ethernet submenu, and I didn't see any filtering options for the property node either (I've seen those before in DAQmx). It was only discoverable as AcquisitionAttributes::AdvancedEthernet::FirewallTraversal::Enabled by using IMAQdx Enumerate Attributes.vi. I suppose the driver doesn't keep a cache of some attributes and relies on run-time detection instead.
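The enumerate-then-filter workaround can be sketched like this (the attribute list is a stand-in; only the FirewallTraversal path comes from this thread, the other names are illustrative placeholders):

```python
# Stand-in for the list returned by IMAQdx Enumerate Attributes.vi;
# only the FirewallTraversal entry is taken from this thread.
attributes = [
    "AcquisitionAttributes::PacketSize",
    "AcquisitionAttributes::AdvancedEthernet::FirewallTraversal::Enabled",
    "CameraAttributes::ExposureTime",
]

def find_attributes(attrs, substring):
    """Case-insensitive substring search over enumerated attribute names,
    for locating attributes that don't appear in the Class Browser."""
    return [a for a in attrs if substring.lower() in a.lower()]

print(find_attributes(attributes, "firewall"))
# ['AcquisitionAttributes::AdvancedEthernet::FirewallTraversal::Enabled']
```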

 

At any rate, I would be grateful if you could check whether or not the camera is sending bad timestamps. I am still waiting for a response from Allied about a firmware update.

Message 10 of 12