Remove 'Scrolling Effect' for Intensity Graph




I'm running a parallel data acquisition and display process in which I read an 80x64 array of data from a Heimann (Boston Electronics) infrared sensor at roughly 9 FPS. The data arrives via a UDP read command, and I need to aggregate 11 messages to receive one complete 80x64 frame. After processing those 11 messages into a 2D array indicator, I can see the 80x64 array refreshing at high speed (probably 9 FPS). When I attach that array to an Intensity Graph indicator, however, I get a 'sliding' effect roughly every second: the images display in real time and look like video, but all the data on screen shifts to the left and is replaced by a refreshed copy about once a second. Is there a way to turn that 'sliding refresh' behavior off so that new data simply replaces the old? In its current state the output looks like a video, but it slides offscreen and is replaced every second. My VI is attached for reference, along with two screenshots.
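For readers unfamiliar with the acquisition pattern being described, here is a minimal Python sketch of it: read UDP datagrams in a loop and aggregate 11 messages per frame. The ports, the payload layout, and the use of a local sender as a stand-in for the sensor are assumptions for the demo, not the actual Heimann protocol.

```python
import socket

# Sizes taken from later in this thread; everything else is assumed.
PACKETS_PER_FRAME = 11
PACKET_SIZE = 1283

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))            # any free port
rx.settimeout(1.0)                   # mirrors the UDP Read timeout in the VI
addr = rx.getsockname()

# Stand-in for the sensor: send 11 packets of 1283 bytes each,
# with a 1-byte packet index at the front (an assumed layout).
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for i in range(PACKETS_PER_FRAME):
    tx.sendto(bytes([i]) + bytes(PACKET_SIZE - 1), addr)

# Aggregate 11 messages into one frame's worth of raw packets.
packets = [rx.recv(PACKET_SIZE) for _ in range(PACKETS_PER_FRAME)]
print(len(packets), len(packets[0]))  # 11 1283
rx.close()
tx.close()
```

On loopback the datagrams arrive reliably and in order; with a real sensor you would want to check the packet index byte rather than rely on arrival order.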



Message 1 of 11

What you describe sounds like an intensity chart (which is different from a graph). Please place a new intensity graph, wire it to your data, and check again.

Message 2 of 11

Hi Gregory,


No, this is definitely a graph; I realize the chart has an internal buffer and updates its indices accordingly. On my UI the x and y limits stay at 80x64, so I know it's updating correctly, but I'm trying to figure out whether the sliding effect is caused by the camera hardware or by my LabVIEW code.

Message 3 of 11

After looking more closely at the data, this is definitely a problem with the camera, not the LabVIEW code. If you have any advice, that'd be great; if not, I'll remove this question.



Message 4 of 11
Sorry about that, I had to boot up a different machine to see your VI. Hmm, I can't replicate what you're seeing just by generating random data, so I would guess it has to do with your data transmission and is not an effect of the graph.
It looks like you have some data outside your expected range (the white rectangles at the top and bottom around x = 40).
Just a shot in the dark, but since you said it happens once every second: what happens if you change the timeout in your top loop to something like 5000 instead of 1000?
Message 5 of 11
Can you explain your packet size as well? It's 1283 bytes; I'd guess 1 byte is a header, leaving you 1282 bytes, from which you make 641 16-bit values. You collect 11 packets to build a 7,051-element array, then reshape it into a 5,120-element array.
Are you throwing away 3 of the 11 packets? That gets you close, but 8 packets still give 5,128 elements, leaving 8 elements unaccounted for.
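The arithmetic above can be checked in a few lines (the packet and frame sizes come from this thread; treating the first byte as a header is the assumption being questioned):

```python
# Sanity check of the packet arithmetic discussed above.
packet_bytes = 1283
header_bytes = 1                                      # assumed header
words_per_packet = (packet_bytes - header_bytes) // 2  # 641 16-bit values
frame_words = 80 * 64                                  # 5120 samples per frame

total_11 = 11 * words_per_packet   # samples collected from 11 packets
total_8 = 8 * words_per_packet     # samples left if 3 packets are discarded
print(total_11, total_8, total_8 - frame_words)  # 7051 5128 8
```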
Message 6 of 11

Haha, all good. Changing the timeout was a great suggestion, but it didn't fix the issue. I've attached the transfer protocol from the Heimann sensor, but yes, all remaining data is useless to me. They send out 11 packets of 1283 bytes, with some of the beginning and ending bytes used for other information.

Message 7 of 11

I'm thinking I must be off by some small number of bytes on each frame, causing the entire 80x64 array to shift over time.

Message 8 of 11

Wait, I fixed it! To read a single frame I had to loop 11 times (10 reads plus one STOP message). Continuous reading has no STOP message, so the extra iteration was unnecessary and was shifting the data. 
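The off-by-one described above can be summarized in a tiny sketch (loop counts are from this thread; the function is a placeholder, not a real API). In single-frame mode the sensor sends 10 data packets plus a STOP message, so 11 reads are needed; in continuous mode there is no STOP, so an 11th read pulls in the first packet of the next frame and shifts every later frame.

```python
# Placeholder illustrating the loop-count fix, not real sensor code.
def reads_per_frame(continuous: bool) -> int:
    DATA_PACKETS = 10          # data packets per frame
    if continuous:
        return DATA_PACKETS    # no STOP message in continuous mode
    return DATA_PACKETS + 1    # single-frame mode: 10 reads + 1 STOP

print(reads_per_frame(continuous=True))   # 10
print(reads_per_frame(continuous=False))  # 11
```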


In any case, thanks for all your help with this troubleshooting! 

Message 9 of 11
Accepted by topic author trooper64

According to that document, you should get 10 packets per frame, not 11. Also, it looks like you only care about the temperature data from packet #8, not the "el. offset".

I would continue deleting the packet index, then stitch together all 10 packets of data. Then take an array subset (of your 16-bit data) from index 0, length 5120 elements, and reshape that into your 2D intensity array.
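A minimal Python sketch of the reshaping steps above, under the assumption (from this thread) that each of the 10 data packets carries a 1-byte packet index followed by 16-bit samples: delete the index, stitch the packets together, take an array subset of length 5120 from index 0, and reshape into the 2D intensity array. Byte order and the exact payload layout are assumptions, not verified against the Heimann document.

```python
import struct

def frame_from_packets(packets, rows=64, cols=80):
    """Stitch 16-bit samples from all packets and reshape to rows x cols."""
    words = []
    for p in packets:
        body = p[1:]                                    # delete the packet index
        words.extend(struct.unpack(f">{len(body) // 2}H", body))
    subset = words[:rows * cols]                        # subset: index 0, length 5120
    return [subset[r * cols:(r + 1) * cols] for r in range(rows)]

# Demo: 10 synthetic packets, each 1 index byte + 641 words equal to
# the packet number, so the reshaped frame shows packet boundaries.
packets = [bytes([i]) + struct.pack(">641H", *([i] * 641)) for i in range(10)]
frame = frame_from_packets(packets)
print(len(frame), len(frame[0]), frame[0][0])  # 64 80 0
```

With 10 packets of 641 words each, 6410 samples are collected and the first 5120 are kept, which matches the "10 packets per frame" count from the document.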

Message 10 of 11