09-23-2022 11:37 PM
Rather than take a previous thread off topic, I'm starting a new thread.
For some reason I am experiencing noticeable slowing of LabVIEW's development environment after sending a pulse waveform through an FFT system I'm working on. I suspect it's due to the many data points making up the x-y plots. I have two plots being generated, one for magnitude and the other for phase. Working with simpler periodic sinusoids I didn't have an issue, but as soon as I sent in a pulse (a very short pulse is known to produce a very wide-bandwidth sinc response), LabVIEW development operations slowed very noticeably. My plots were generated from two seconds of data sampled at 1.25 MHz: 2.5 million waveform points.
I have already reduced the compiler setting (Options > Environment > Compiler) to 0; I did that back when I was dealing only with periodic sinusoids, and I checked that it still has the same setting, which it does. Are there other ideas for how to resolve this slowing problem? As an example of the severity: just checking that setting took maybe five seconds from clicking "Options" to the first window appearing. I wasn't sure if I'd actually clicked it, but eventually it responded. Is there a solution to this noticeable slowing of the development environment, or am I just stuck with it because my graphs retain all that information? I'm picturing it as giant arrays of retained data, and I seem to recall reading once about forcing LabVIEW to start fresh somehow. I don't know.
Thank you.
09-24-2022 12:07 AM
One other detail I should add is the graphs of the system are on multiple pages of a tab control: 1) configure, 2) f(t), 3) Fourier, 4) Save. f(t) displays an x-y plot of the 2.5 million point waveform sent into the system. The x-y Fourier plots each contain 1.25 million points running from -625000 to 625000. Hence, I may have just blown up LabVIEW, and there's nothing I can do about it, really.
I'm wondering if there may be some mechanism I don't know about that would let me do my analysis but then choose whether the graphs left behind retain their data during development, assuming that's the real cause of the slowing. Is that too crazy an idea? For instance, I'd drop a boolean control on the front panel: if set, the data would be retained (slowing the development environment, by choice); if cleared, the data would be discarded and I'd get my development speed back. It sounds like a property node option, maybe. Then again, maybe it's a method call? I don't know yet. I just know this knowledge base is good for getting answers in parallel with my own hunting. (The digging is all about finding the correct keywords; that can be tough.)
09-24-2022 12:40 AM
It looks like I have an answer.
Indeed there is an invoke node method that sends the plots back to their defaults. I did as I said and created a boolean on the FP that now gives me the choice of whether or not to retain the data, and therefore whether or not to have a slowed development environment. It makes a world of difference and confirms that the issue was indeed the retained data.
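For anyone searching later, here's a rough text sketch of the block-diagram wiring I mean. The invoke node method is LabVIEW's standard Reinitialize To Default, called on a control reference; the control labels are just my own names:

```text
// Front panel: boolean control "Retain Graph Data?"
// Block diagram, after the analysis finishes:

if NOT "Retain Graph Data?" then              // case structure, False case
    for each graph in {Magnitude XY Graph, Phase XY Graph}:
        wire the graph's control reference into an invoke node
        call method: Reinitialize To Default  // empties the plotted data
else
    // True case: do nothing; graphs keep their multi-million-point arrays
```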
I also found that keeping the waveform itself wasn't the tax. Despite my instruction to clear all the graphs, only the graphs being displayed were cleared; my conclusion is that graphs on unselected tab pages still retain their data. (Yes, I'm sure there is a way to get around this, likely using property nodes and control references.)
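If it helps anyone, one likely way around the hidden-tab limitation is to walk the tab control's pages explicitly through VI Server instead of relying on what's displayed. The Pages[] and Controls[] properties named below are standard VI Server properties; I haven't wired this up myself yet, so treat it as a sketch:

```text
// Property/invoke node sketch (VI Server):

TabControlRef -> property node -> Pages[]       // array of page references
for each PageRef in Pages[]:
    PageRef -> property node -> Controls[]      // controls on that page
    for each CtlRef in Controls[]:
        if CtlRef.ClassName indicates an XY graph:   // check exact class-name string
            CtlRef -> invoke node -> Reinitialize To Default
```

This touches every page regardless of which tab is selected, so the graphs on unselected pages get cleared too.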