Any way to save memory in a 3D graph?

I am using an ActiveX 3D graph (surface) to display a waterfall online. It needs a 2D array for each axis, so I refresh three 2D arrays every time a new spectrum arrives. In my case, when the bandwidth is high, each spectrum is a very long array, and the 2D arrays become huge as the number of spectra grows. The only thing I can do is limit the number of spectra in the waterfall. For example, I set 50 as the maximum number of spectra to display, so each 2D array is N x 50 (N is the length of one spectrum).
Is there any way to make this more efficient? I am wondering whether I can plot just one new curve onto the previous image each time instead of replotting the whole batch again and again.
Any suggestion is welcome. Thanks!
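To put numbers on it, here is the rough memory arithmetic (Python text only as a stand-in, since a LabVIEW diagram cannot be pasted here; the 10,000-point case is just an example):

```python
# Rough size of the three 2D axis arrays (X, Y, Z) fed to the ActiveX
# 3D surface graph, assuming double-precision (8-byte) values.
def waterfall_bytes(spectrum_len, num_spectra, arrays=3, bytes_per_value=8):
    return arrays * spectrum_len * num_spectra * bytes_per_value

print(waterfall_bytes(100, 50) / 1e6, "MB")      # 0.12 MB -- harmless
print(waterfall_bytes(10_000, 50) / 1e6, "MB")   # 12.0 MB rebuilt on every update
```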
 
Message 1 of 11
We may be able to help if you provide a small example of your data and how you are attempting to display it.
 
Could you post some data saved in a VI?
 
Ben
Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper
Message 2 of 11
Hi, Ben,
Thanks for your reply.
Here I attach an example data set; you can open it in Excel. It is a 2D array, and each column is a spectrum (here it is just from a sine-wave signal).
I am going to read a time signal from the front end every few seconds, compute its spectrum, and then display the spectra in a 3D waterfall. My current method is to build three 2D arrays for the X, Y, and Z axes of the 3D surface. For each new spectrum, I increase the size of these 2D arrays.
Because of the huge size of the arrays, I have to limit the number of spectra displayed (say, 50 spectra in the waterfall). Once the spectrum count hits this limit, I delete the first column and append the new spectrum as the last column.
This is time-consuming, and I am beginning to wonder whether there is a better way to do it.
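In NumPy-style pseudocode (a text stand-in for the LabVIEW diagram, not the actual code), the update looks roughly like this; the whole N x 50 block gets copied on every new spectrum, which is why it is slow for long spectra:

```python
import numpy as np

def update_waterfall(z, new_spectrum, max_spectra=50):
    """Drop the oldest column and append the newest one.

    z            -- 2D array, shape (N, k) with k <= max_spectra
    new_spectrum -- 1D array of length N
    Every call copies the whole array, so the cost grows with N * max_spectra.
    """
    if z.shape[1] >= max_spectra:
        z = z[:, 1:]                        # discard oldest column (copies the rest)
    return np.column_stack((z, new_spectrum))

# Example: grow up to 50 columns of 100-point spectra.
z = np.empty((100, 0))
for _ in range(60):
    z = update_waterfall(z, np.random.rand(100))
print(z.shape)   # (100, 50)
```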
 
Jiankang
Message 3 of 11

Could you please post your LabVIEW code showing us what you are doing now?

I am allowed to answer questions. Writing code requires a purchase order to be in place first. ;-)

I will watch for you to post your example.

Ben

Message 4 of 11
Here I attach my code with some random data. Run randwaterfall.vi to show a waterfall of 50 random spectra.
I generate a random array (100 x 1) each time. The subVI (waterfall1.vi) is used to get the spectrum and update the axis arrays.
If you increase the array size from 100 to some large value (for example, 10,000), you will see the speed become very slow because of the array size.

Message Edited by jiankang on 02-27-2006 11:10 AM

Message 5 of 11

Hi Jiankang,

There are still limits to what computers can do. I hope your real-life application does not work the CPU this hard. :-O

10,000 data points just cannot be represented with the number of pixels you have to work with. For pseudo-real-time updates, keep the sample count down.
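One way to keep the sample count down is to decimate each spectrum to roughly the number of pixels the plot can show before wiring it to the graph. A minimal Python sketch (the max-per-bin reduction is just one possible choice, not something taken from the attached VIs):

```python
import numpy as np

def decimate_for_display(spectrum, max_points=500):
    """Reduce a long spectrum to at most max_points values by taking the
    maximum of each bin, so narrow peaks survive the decimation."""
    n = len(spectrum)
    if n <= max_points:
        return np.asarray(spectrum)
    # Trim so the length divides evenly, then take the per-bin maximum.
    bins = n // max_points
    trimmed = np.asarray(spectrum[: bins * max_points])
    return trimmed.reshape(max_points, bins).max(axis=1)

print(decimate_for_display(np.random.rand(10_000)).shape)   # (500,)
```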

I did a quick experiment, and it looks like I can do 50 plots with 100 points each and recycle them pretty quickly.

Your sub-VI now looks like this:

And your top-level VI looks like this:

The revised example is attached.

Using this method I could keep a steady stream of updates going and still had time left to play with changing the view. I do not know if this helps.
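The attached VIs cannot be pasted as text, but the idea of allocating the 50 x 100 buffer once and recycling it, replacing one row per update instead of rebuilding all three arrays, can be sketched in Python like this (roughly what Replace Array Subset in a shift register does in LabVIEW; the names below are illustrative):

```python
import numpy as np

N_POINTS, N_SPECTRA = 100, 50

# Allocate the Z buffer once; only one row is replaced per update,
# so no memory is reallocated while the acquisition loop runs.
z = np.zeros((N_SPECTRA, N_POINTS))
write_index = 0

def add_spectrum(new_spectrum):
    """Overwrite the oldest slot in place ("sweep"-style update)."""
    global write_index
    z[write_index, :] = new_spectrum
    write_index = (write_index + 1) % N_SPECTRA

for _ in range(200):                     # simulate 200 incoming spectra
    add_spectrum(np.random.rand(N_POINTS))
print(z.shape, write_index)              # (50, 100) 0
```

The per-update work stays constant no matter how many spectra have been acquired.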

Ben

Message Edited by Ben on 02-27-2006 07:38 PM

Message Edited by Ben on 02-27-2006 07:39 PM

Message 6 of 11

Thanks so much, Ben.

Your code doesn't require updating the 2D arrays all the time and runs much better than mine.

Is the size of the 3D graph only limited by the computer?

Message 7 of 11
"

Is the size of 3d graph only limited by the computer?

"
 
No, but it is a major factor.
 
I have seen limitations where a 3D mesh did not like more than 32,000 data points.

Physical memory is required to store the data. Going above the available physical memory forces you to use virtual memory, which is roughly 1,000 times slower than physical memory.

Free memory is another factor. LV wants all buffers to be contiguous, so fragmented memory is an issue.

The graphics adapter is another factor.

The Windows OS only allows 2 GB to be used by an application; that is another limit.
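For a rough sense of scale on those limits (illustrative arithmetic only, assuming 8-byte double-precision values):

```python
BYTES_PER_DOUBLE = 8

# A mesh capped around 32,000 points is only ~256 KB of raw data...
print(32_000 * BYTES_PER_DOUBLE / 1e3, "KB")              # 256.0 KB

# ...while the 2 GB per-process limit could hold ~268 million doubles,
# if they did not have to live in one contiguous, unfragmented block.
print(2 * 1024**3 // BYTES_PER_DOUBLE, "doubles in 2 GB")
```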
 
I count myself as blessed to live in a time when this kind of graphics is possible. When I tried similar things back in the days of LV 5.1 (?), the 400 MHz PC had trouble just rendering the image, let alone updating on the fly and changing views.
 
If you think of it, could you post an example that shows your real data and what you did to get the results? I am still trying to gather examples to help others with future 3D-graph questions.
 
Thank you!
 
Ben
Message 8 of 11

I am modifying my code according to your example. My task is to read the data from the front end, process it, and display the waterfall.

I am changing it a little so that the waterfall moves forward instead of "sweeping": after the maximum number of spectra is reached, the earliest spectrum is removed and the new spectrum becomes the last plot. I have almost finished and will post it later.
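One way to get that forward-moving behavior without shifting data on every update is to keep the same fixed, recycled buffer and only reorder a view of it when handing it to the graph. A rough Python sketch of that idea (not the VI attached later):

```python
import numpy as np

N_POINTS, N_SPECTRA = 100, 50
z = np.zeros((N_SPECTRA, N_POINTS))
write_index = 0

def add_spectrum(new_spectrum):
    """Store the new spectrum in the oldest slot of the ring buffer."""
    global write_index
    z[write_index, :] = new_spectrum
    write_index = (write_index + 1) % N_SPECTRA

def scrolled_view():
    """Return the buffer with rows ordered oldest -> newest, so the plot
    appears to scroll forward as new spectra arrive."""
    return np.roll(z, -write_index, axis=0)

for _ in range(75):
    add_spectrum(np.random.rand(N_POINTS))
display = scrolled_view()                # row 0 = oldest, row 49 = newest
print(display.shape)                     # (50, 100)
```

The roll still copies the buffer once per redraw, but the acquisition-side update stays in place.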

Message 9 of 11

Thank you in advance Jiankang!

The best answers to questions are those that are posted by the original poster. The rest are just guesses.

Ben

Message 10 of 11