I'm running a parallel data acquisition and display process in which I read an 80x64 array of data from a Heimann (Boston Electronics) infrared sensor at roughly 9 FPS. The data arrives via UDP reads, and I need to aggregate 11 messages to receive one complete 80x64 frame. After processing these 11 messages into a 2D array indicator, I can see the 80x64 array refreshing at high speed (probably 9 FPS). When I attach this array to an Intensity Graph indicator, however, I get a 'sliding' effect roughly every second: the images display in real time and look like a video, but all the data on screen shifts to the left and is replaced by refreshed data about once per second. Is there a way to turn that 'sliding refresh' behavior off so that new data simply replaces the old and the output looks like a video? My VI is attached for reference, along with two screenshots.
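Not a LabVIEW snippet, but for illustration, here is a minimal Python sketch of the acquisition loop described above. The packet size and the 11-reads-per-frame count come from this thread; the buffer size and function name are my own placeholders, and assembling the payloads into the 80x64 array is assumed to happen afterwards.

```python
import socket

PACKET_SIZE = 1283          # bytes per UDP message in the Heimann transfer protocol
PACKETS_PER_FRAME = 11      # what I read per single frame: 10 data reads + 1 STOP message

def collect_frame_packets(sock: socket.socket,
                          n_packets: int = PACKETS_PER_FRAME) -> list[bytes]:
    """Read n_packets UDP messages and return their raw payloads.

    Stitching the payloads into the 80x64 image is a separate step.
    """
    packets = []
    for _ in range(n_packets):
        data, _addr = sock.recvfrom(2048)   # buffer comfortably above 1283 bytes
        packets.append(data)
    return packets
```

The equivalent in the VI is a For Loop around a UDP Read, with the message count wired to N.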
No, this is definitely a graph, not a chart; I realize a chart has an internal buffer and updates its indices accordingly. On my UI the x and y limits stay at 80 and 64, so I know it's updating correctly, but I'm trying to figure out whether the sliding effect is caused by the camera hardware or by my LabVIEW code.
After looking more closely at the data, this is definitely a problem with the camera, not the LabVIEW code. If you have any advice that'd be great, but if not I'll remove this question.
Haha, all good. Changing the timeout was a great suggestion, but it didn't fix the issue. I've attached the transfer protocol for the Heimann sensor, and yes, all the remaining data is useless to me. The sensor sends out 11 packets of 1283 bytes each, with some of the beginning and ending bytes used for other information.
I'm thinking I must be off by some small number of bytes for each frame, resulting in the entire 80x64 array shifting over time.
Wait, I fixed it! To read a single frame I had to loop 11 times (10 data reads plus one STOP message). Continuous reading has no STOP message, so the extra loop iteration was unnecessary and was shifting the data.
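For anyone hitting the same thing, the off-by-one can be shown with a toy Python model (my own illustration, not from the VI): if the continuous stream is just data packets 0..9 repeating and you consume 11 per loop, each frame steals the first packet of the next frame, so every frame starts one packet later and the image slides.

```python
PACKETS_PER_FRAME = 10  # continuous mode: data packets only, no STOP message

def frame_packet_indices(reads_per_frame: int, n_frames: int = 3) -> list[list[int]]:
    """Model which packet indices each frame-assembly loop consumes
    from a continuous stream of packets numbered 0..9, 0..9, ..."""
    stream = [i % PACKETS_PER_FRAME for i in range(reads_per_frame * n_frames)]
    return [stream[f * reads_per_frame:(f + 1) * reads_per_frame]
            for f in range(n_frames)]

# With 11 reads per frame, frame 0 consumes [0..9, 0]: it has eaten
# packet 0 of the next frame, so frame 1 starts at packet 1, and so on.
# With 10 reads per frame, every frame starts cleanly at packet 0.
```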
In any case, thanks for all your help with this troubleshooting!
According to that document, you should get 10 packets per frame, not 11. Also, it looks like you only care about the temperature data from packet #8, not the "el. offset".
I would continue deleting the packet index, and then stitch together all 10 packets of data. Then take an array subset (of your 16-bit data) starting at index 0, length 5120 elements, and reshape it into your 2D intensity array.
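Those steps can be sketched in Python (the LabVIEW equivalent is Delete From Array, Concatenate, Array Subset, and Reshape Array). I'm assuming here that the first byte of each 1283-byte packet is the packet index and that the pixel words are little-endian 16-bit; check the byte order against the Heimann protocol document.

```python
import struct

WIDTH, HEIGHT = 80, 64
PIXELS = WIDTH * HEIGHT          # 5120 temperature values

def frame_from_packets(packets: list[bytes]) -> list[list[int]]:
    """Strip the packet index byte, stitch the 10 payloads together,
    and reshape the first 5120 16-bit words into a 64x80 image."""
    payload = b"".join(p[1:] for p in packets)            # drop index byte
    words = struct.unpack("<%dH" % (len(payload) // 2), payload)
    temps = words[:PIXELS]                                # drop el. offsets etc.
    return [list(temps[r * WIDTH:(r + 1) * WIDTH]) for r in range(HEIGHT)]
```

With 10 packets of 1283 bytes, stripping the index leaves 12820 bytes (6410 words), of which the first 5120 are the temperature pixels; that also matches the temperature data ending partway through packet #8.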