Machine Vision


USB3 timestamp IMAQ

I have a Basler Ace USB3 camera attached to a PCIe-8242 Vision Frame Grabber board. The camera is well integrated with my system, and I am successfully recording AVI files. I would like to record the timestamp data from each image taken during the AVI. I have changed 'Receive Timestamp Mode' to 'System Time' and inserted 'Get Custom Keys' and 'Read Custom Data' into the code. I want to have an indicator on the UI and record the timestamp data to a text file, but the recorded data comes out as gibberish. I have tried converting types (e.g. hex string to number) without success.

 

What type of data comes from the Basler Ace? And is the translation of the timestamp data from the camera a problem with the driver or with the LabVIEW software?

Message 1 of 8

Hi,

 

The "system time" key(s) come from the driver, not the camera. They are timestamped as close as possible to when the frame is received, so you should still get more accurate timestamps than you would by doing it inside your own code. There are other timestamps inserted by the camera, but for USB3 Vision cameras these come from a free-running clock in the camera and are not synchronized to the outside world, so they are probably only useful as relative timestamps.

 

All the timestamp keys stored as custom data are encoded as big-endian 32-bit values. You can use LabVIEW's Unflatten From String VI to read them as numerics, then combine the two 32-bit halves.
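In text form, that Unflatten From String step is just a big-endian byte decode. A minimal Python sketch of the same parsing (the raw bytes here are made-up illustrative values, not from a real camera):

```python
import struct

# One 32-bit custom-data value as raw bytes, big-endian (network byte order).
# Example bytes are hypothetical: 0x0000012C.
raw = b"\x00\x00\x01\x2c"

# ">I" = big-endian unsigned 32-bit integer, matching the key encoding above.
(value,) = struct.unpack(">I", raw)
print(value)  # 300
```

The same `">I"` decode applies to each 32-bit key; the `>` prefix is what corresponds to LabVIEW's big-endian (network byte order) setting.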

 

Eric

Message 2 of 8

I figured out how to get this done using a U32 array. It gives me an integer that counts up from 0 until the U32 maxes out. The count correlates with elapsed time, which is fine for relative time, but I'd like to sync it with other data coming in from the DAQ.

 

If the driver is supplying the timestamp, then it should not be relative time but drawn from the CPU clock, so we can compare the driver timestamp with the DAQ timestamp. Correct?

 

So how would I go about converting the U32 integer into timestamp data?

 

Thanks,

Winston Elliott

Message 3 of 8

Hi Winston,

 

Check out these examples:

http://digital.ni.com/public.nsf/allkb/2EFA91B04F42EF60862578F600806076

 

Eric

Message 4 of 8

Thanks BlueCheese,

 

I built the initial VI using those examples. When I unflatten the timestamp key data, the data type is an array of U32 integers. In the example posted below, only the first cell of the array contains data, as an integer that maxes out at 4,294,967,295 and then rolls over. Basler reps state that each tick counts for 8 ns, but comparing ticks to timestamp data I see about 95 ns per tick. The rising integer value suggests a relative timestamp coming from the hardware, but the tick/time relation is off by roughly a factor of ten from Basler's spec. The resulting tick count and timestamp data are attached. I think this will work for me, as the jitter is beneath my required Nyquist frequency.

 

Winston Elliott

Message 5 of 8

Hi Winston,

 

I suspect you are looking at the custom keys IMAQdx adds corresponding to System Time when the image was received, rather than the timestamp sent by the camera itself (both are available in IMAQdx). I think the system time was in 100 ns ticks, but I'd have to look it up to confirm. The Basler cameras definitely use 8 ns ticks in their timestamp; they have an attribute you can read that reports the timestamp clock frequency, which should read 125 MHz.
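The arithmetic behind those two tick sizes can be sketched quickly; the ~95 ns per tick observed earlier is close to the 100 ns system-time resolution, not the camera's 8 ns (the constants below come from the discussion above; the helper function name is my own):

```python
# Tick periods implied by the two clocks discussed above.
camera_freq_hz = 125_000_000            # Basler timestamp clock, per spec
camera_tick_ns = 1e9 / camera_freq_hz   # 125 MHz -> 8.0 ns per camera tick

system_tick_ns = 100.0                  # IMAQdx System Time resolution (assumed)

def ticks_to_seconds(ticks, tick_period_ns):
    """Convert a raw tick count to seconds for correlation with DAQ data."""
    return ticks * tick_period_ns * 1e-9

print(camera_tick_ns)                    # 8.0
print(ticks_to_seconds(12_500_000, camera_tick_ns))  # ~0.1 s of camera ticks
```

Whichever key you read, dividing by the matching clock frequency (or multiplying by the tick period) is what makes the two timebases comparable.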

 

As to which timestamp to use, it depends on your use case. If you only care about relative time between frames over a short interval, the camera's timestamp will be far more accurate because it is recorded in hardware by the camera. If you want to correlate timestamps to real-world time and can deal with some added jitter, the system time may be a better choice.

 

Also, looking at the image of your VI, I think your timestamp retrieval is still incorrect. You should simply use the Unflatten From String VI with a U64 datatype passed into it (configured for network byte order). Passing an array as you are doing is unlikely to give the correct interpretation without a lot of re-arranging of the bytes. This is presumably why you see the value truncated to 32 bits instead of 64.

 

Eric

Message 6 of 8

Whelp, the 100 ns tick explains the counts I was getting, and the 95 ns probably results from jitter. Additionally, a different output key gives 1 ns/tick (index 5 of the output keys).

 

More importantly, I am not seeing a U64 datatype option for the timestamp output. No matter how I configure the unflatten function, I am only able to use it with a U32 array. Otherwise, I receive an error that prevents the VI from running.

 

Thanks,

Winston Elliott

Message 7 of 8

Er, sorry. There are separate custom data keys for the upper and lower halves of the 64-bit timestamp. You'd unflatten each one as a big-endian U32, then combine them into a U64.
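That combination step can be sketched outside LabVIEW as a shift-and-OR on the two decoded halves (the raw bytes below are made-up illustrative values, and the variable names are my own):

```python
import struct

# Two custom-data keys: upper and lower 32 bits of the 64-bit timestamp.
# Hypothetical example bytes: high word 1, low word 2.
raw_high = b"\x00\x00\x00\x01"
raw_low  = b"\x00\x00\x00\x02"

(high,) = struct.unpack(">I", raw_high)  # big-endian U32, upper half
(low,)  = struct.unpack(">I", raw_low)   # big-endian U32, lower half

timestamp = (high << 32) | low           # reassemble the full U64 tick count
print(timestamp)  # 4294967298
```

This is also why a lone U32 rolls over at 4,294,967,295: without the upper half, the count wraps every 2^32 ticks.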

 

Eric

Message 8 of 8