04-14-2021 10:30 AM
Hi,
I'm running a parallel data acquisition and display process in which I read an 80x64 array of data from a Heimann (Boston Electronics) infrared sensor at roughly 9 FPS. The data arrives via a UDP read command, and I have to aggregate 11 readings to receive one complete 80x64 frame. After processing these 11 messages into a 2D array indicator, I can see the 80x64 array refreshing at high speed (probably 9 FPS). When I attach this array to an Intensity Graph indicator, however, I get a 'sliding' effect roughly every second. The images display in real time and appear as a video, but all the data on screen shifts to the left and is replaced by refreshed data roughly every second. Is there a way to turn that 'sliding refresh' behavior off so that new data simply replaces the old? In its current state the output looks like a video, but it slides offscreen and is replaced every second. My VI is attached for reference, along with two screenshots.
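Since the VI itself is an attachment, here is a minimal text sketch of the aggregation step described above, written in Python rather than LabVIEW. The packet size and per-frame read count mirror the numbers given in this thread; the function and parameter names are hypothetical.

```python
from typing import Callable

PACKET_SIZE = 1283      # packet size stated later in this thread
PACKETS_PER_FRAME = 11  # reads aggregated here to build one 80x64 frame

def read_frame(recv: Callable[[int], bytes]) -> bytes:
    """Aggregate the UDP reads that make up one frame.

    `recv` is any callable that returns one datagram payload per call,
    e.g. `lambda n: sock.recvfrom(n)[0]` for a bound UDP socket.
    """
    return b"".join(recv(PACKET_SIZE) for _ in range(PACKETS_PER_FRAME))
```

Passing the receive step in as a callable keeps the aggregation logic testable without a live sensor on the network.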
Thanks!
Solved! Go to Solution.
04-14-2021 11:05 AM - edited 04-14-2021 11:05 AM
What you describe sounds like an intensity chart (which is different from a graph). Please place a new intensity graph, wire it to your data, and check again.
04-14-2021 11:15 AM
Hi Gregory,
No, this is definitely a graph; I realize the chart has an internal buffer and updates its indices accordingly. On my UI the x and y limits stay at 80x64, so I know it's updating correctly, but I'm trying to figure out whether that sliding effect is caused by the camera hardware or by my LabVIEW code.
04-14-2021 11:27 AM
After looking more closely at the data, this is definitely a problem with the camera not the LabVIEW code. If you have any advice that'd be great, but if not I'll remove this question.
Thanks!
04-14-2021 11:30 AM
04-14-2021 11:47 AM
04-14-2021 01:21 PM
Haha, all good. Changing the timeout was a great suggestion, but it didn't fix the issue. I've attached the transfer protocol from the Heimann sensor, and yes, all the remaining data is useless to me. They send out 11 packets of 1283 bytes, with some of the beginning and ending bytes used for other information.
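As a rough illustration of splitting one such packet, here is a hedged Python sketch. The 1-byte-index-plus-641-words layout (1 + 2 × 641 = 1283) and the little-endian byte order are assumptions inferred from this thread; the attached Heimann protocol document is the authority on the real layout.

```python
import struct

def parse_packet(packet: bytes) -> tuple[int, tuple[int, ...]]:
    """Split one 1283-byte packet into a leading index byte and 16-bit data.

    Assumed layout: 1 packet-index byte followed by 641 little-endian
    uint16 values. Verify against the sensor's protocol document.
    """
    index = packet[0]
    values = struct.unpack("<641H", packet[1:])
    return index, values
```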
04-14-2021 01:22 PM
I'm thinking I must be off by some small number of bytes for each frame, resulting in the entire 80x64 array shifting over time.
04-14-2021 01:38 PM
Wait, I fixed it! For reading a single frame I had to loop 11 times (10 reads plus one STOP message). Continuous reading has no STOP message, so the extra loop iteration was unnecessary and was shifting the data.
In any case, thanks for all your help with this troubleshooting!
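The fix above amounts to reading a different number of packets per frame in the two modes. A small Python sketch of that logic, with hypothetical names and the packet size taken from this thread:

```python
def read_one_frame(recv, continuous: bool) -> bytes:
    """Collect one frame, per the fix described above.

    Single-shot capture: 10 data packets followed by one STOP message,
    so the STOP must be read and discarded (11 reads total).
    Continuous streaming: no STOP message, so an 11th read would steal
    the first packet of the next frame and shift every later image.
    """
    frame = b"".join(recv(1283) for _ in range(10))
    if not continuous:
        recv(1283)  # consume the trailing STOP message
    return frame
```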
04-14-2021 01:42 PM - edited 04-14-2021 02:10 PM
According to that document, you should get 10 packets per frame, not 11. Also, it looks like you only care about the temperature data from packet #8, not the "el. offset".
I would continue deleting the packet index, and then stitch together all 10 packets of data. Then take an array subset (of your 16-bit data) from index 0, length 5120 elements, and reshape that into your 2D intensity array.
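The delete-stitch-subset-reshape recipe above can be sketched in Python with NumPy (the LabVIEW VI is not reproducible in text). The little-endian byte order and the row-major 64x80 orientation are assumptions; check them against the protocol document and the sensor's actual output.

```python
import numpy as np

def frame_from_packets(packets) -> np.ndarray:
    """Build the 2D intensity array from 10 raw 1283-byte packets.

    Per the advice above: drop each packet's leading index byte,
    stitch the 16-bit data together, keep the first 5120 values
    (80 x 64 pixels), and reshape to 64 rows x 80 columns.
    """
    payload = b"".join(p[1:] for p in packets)   # delete the packet index
    data = np.frombuffer(payload, dtype="<u2")   # 16-bit values
    return data[:5120].reshape(64, 80)           # 2D intensity array
```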