LabVIEW


references of big data.

I am making a LabVIEW 2018 program that needs to run every minute for a long time (more than a year) without interruption.
I am worried about running out of memory while repeating the routine in a producer/consumer design pattern.
The final version will be built into an executable.

 

To my knowledge, user-generated references (e.g., from Call By Reference, CR) should be closed in the main VI right after the call.

 

So, what about a reference to a (large) variable, or the variable itself?

Maybe I can place them outside the repeated routine (initialized before the while loop starts), as in the attachment.

 

Do I need to close the FP reference, or a reference to a variable (not generated by CR), with Close Reference after each unit of execution?

 

In my previous experience with another .exe version, there seemed to be a memory leak, but I am not sure.
How can I check the memory usage programmatically (not with the Performance and Memory profiling tool)?
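(A minimal sketch of one possible approach, assuming a Windows target: the process working set could be read with the psapi call below, built into a small DLL and called from a Call Library Function Node; the function name here is made up.)

    /* Sketch: read the working-set size of the current process via psapi.
       Build as a DLL (link psapi.lib) and call from LabVIEW with a
       Call Library Function Node, or log the value periodically from the loop. */
    #include <windows.h>
    #include <psapi.h>
    #include <stdio.h>

    /* Returns the working-set size in bytes, or 0 on failure. */
    unsigned long long query_working_set_bytes(void)
    {
        PROCESS_MEMORY_COUNTERS pmc;
        if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc)))
            return (unsigned long long)pmc.WorkingSetSize;
        return 0;
    }

    int main(void)
    {
        printf("working set: %llu bytes\n", query_working_set_bytes());
        return 0;
    }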

 

labmaster.

Message 1 of 5

You don't need to close references to objects that are part of the VI itself.

 

Also, references themselves are not big.  A reference is the same size whether it points to an object that takes up a little memory or a lot.

 

Is there any reason you expect your plot names to change during the long operation of this VI?  If not, you should be using option 2.  There is no need to constantly reassign the plot names on every iteration, as option 1 does.

 

If there is a chance that the plot names could change, I would put that piece of code in a case structure inside the loop that only executes the plot name change when it detects that your "output array" has changed.  (And "output array" seems to be a poor name for an array that contains a list of plot names.)
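In text form the pattern is just "compare with the previous value and only act on a change". A rough sketch (the names are invented; in LabVIEW this would be a shift register holding the previous array, feeding a case structure around the property write):

    /* Sketch of the "only update when it changed" pattern: remember the
       previous plot-name list and rewrite the names only when it differs. */
    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    #define MAX_PLOTS 8
    #define NAME_LEN  64

    static char previous_names[MAX_PLOTS][NAME_LEN];   /* the "shift register" */

    /* Stand-in for the property write that renames the plots. */
    static void write_plot_names(const char names[][NAME_LEN], int count)
    {
        for (int i = 0; i < count; i++)
            printf("plot %d -> %s\n", i, names[i]);
    }

    void update_plots_if_changed(const char names[][NAME_LEN], int count)
    {
        bool changed = false;
        for (int i = 0; i < count; i++)
            if (strncmp(names[i], previous_names[i], NAME_LEN) != 0) {
                changed = true;
                break;
            }

        if (changed) {                      /* the "true" case of the case structure */
            write_plot_names(names, count);
            for (int i = 0; i < count; i++)
                strncpy(previous_names[i], names[i], NAME_LEN - 1);
        }
    }

    int main(void)
    {
        const char names[2][NAME_LEN] = { "Temperature", "Pressure" };
        update_plots_if_changed(names, 2);  /* writes the names */
        update_plots_if_changed(names, 2);  /* unchanged: does nothing */
        return 0;
    }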

Message 2 of 5

Something to consider when you have code running more-or-less "forever" (I once used a U32 to save the number of milliseconds since "Start of Experiment", and defined "Forever" as 2^32 - 1 msec, which is about 7 weeks -- our undergraduate subjects wouldn't sit still that long ...) is how you safely collect data so as not to lose everything if the program crashes (which they tend to do, sadly ...).  Have you any thoughts about that?
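(To spell out that arithmetic: 2^32 ms is about 4,294,967 seconds, which is roughly 49.7 days, or a little over 7 weeks. A trivial check:)

    /* Quick check of the U32 millisecond rollover horizon. */
    #include <stdio.h>

    int main(void)
    {
        double ms   = 4294967296.0;              /* 2^32 milliseconds */
        double days = ms / 1000.0 / 3600.0 / 24.0;
        printf("U32 ms counter rolls over after %.1f days (~%.1f weeks)\n",
               days, days / 7.0);                /* ~49.7 days, ~7.1 weeks */
        return 0;
    }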

 

Bob Schor

Message 3 of 5

Thank you two knights!

 

I agree with you that a reference to big data does not itself take up much memory.

Actually, in my program I did not pay much attention to reassigning references and variables inside the loop.

I was not sure whether constantly reassigning them would require more memory.

But how much would the memory usage increase?

 

Regarding "forever", I saved the data periodically and it was tested by varying of system date for a year.

 

labmaster.

Message 4 of 5

The sequence structure is overused. If an error occurs inside a sequence structure, it may still continue to execute to completion. A sequence structure tells the other parts of the code: stop what you're doing, I'll run my code, and you can resume/continue when I'm done. It defeats multithreading.

 

There are multiple loops in both cases, and I cannot tell the program flow at first glance. If you use the error cluster for program flow, your code is less likely to cause a problem.

Message 5 of 5