LabVIEW

master slave design: memory problem.

In an ordinary Master-Slave design, suppose one slave loop runs only seldom, but when it does it opens and reads several files and probably consumes a large amount of memory. Will this slave cause trouble for the other loops that are running? If so, is there any way to solve this problem?
The other slave loops will be continually acquiring data during the whole process.

 

Regards.

0 Kudos
Message 1 of 7
(2,804 Views)

Yes, No, Maybe.

 

It all depends on what the slave loop is doing and what else is going on in the program.  What is the slave loop doing with the data it reads from the several files?  Why do you think it consumes a big amount of memory?

 

It really doesn't matter whether it is a slave loop or any other portion of your code: if it is doing something that could cause you to run out of memory, then your entire program will be affected.  If it isn't doing that, then your program will be fine.

 

It's kind of hard to answer such an arbitrary and hypothetical question.  If you think a portion of your code could cause problems, then it is up to you to explore and experiment and see if it does.  If you find you do have a problem, then post your VIs and we can answer specific questions about what might be wrong and how to fix it.

0 Kudos
Message 2 of 7
(2,785 Views)

(Assuming losing data samples may be the problem...)  If you are using one while loop to run the data acq and another loop to run the file stuff, just set the priority of the data acq loop very high while setting the priority of the file loop very low.  I'm sure this overlooks many issues, but it is where I would start.

 

Hummer1

 

Always looking for a better place to set out from.

0 Kudos
Message 3 of 7
(2,781 Views)

Hummer1 wrote:

(Assuming losing data samples may be the problem...)  If you are using one while loop to run the data acq and another loop to run the file stuff, just set the priority of the data acq loop very high while setting the priority of the file loop very low.  I'm sure this overlooks many issues, but it is where I would start.

 

Hummer1

 

Always looking for a better place to set out from.


 

[Set humor mode = True]

 

Cheater!

 

[Set Humor = False]

 

If the two loops are decoupled from each other, then a hit in one should not hit the DAQ, provided the DAQ has a buffer large enough to store the backlog during any distractions in the other loops.
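Since LabVIEW is graphical, here is a rough textual stand-in for that idea (a Python sketch under assumed, simplified timing: one sample per tick, a fixed-size acquisition buffer, and a reader that stalls for a while, as if another loop were hogging the machine). It only illustrates the principle that no samples are lost as long as the buffer can hold the whole backlog from the stall.

```python
from collections import deque

def samples_lost(buffer_size, stall_ticks):
    """DAQ writes one sample per tick into a fixed-size buffer; the
    reader loop drains one sample per tick except during a stall.
    Returns how many samples were overwritten because the buffer
    filled up (hypothetical numbers, for illustration only)."""
    buf = deque(maxlen=buffer_size)
    lost = 0
    for tick in range(100):
        if len(buf) == buffer_size:
            lost += 1                  # buffer full: oldest sample overwritten
        buf.append(tick)               # the DAQ writes regardless
        stalled = 20 <= tick < 20 + stall_ticks
        if not stalled and buf:
            buf.popleft()              # reader keeps up when not stalled
    return lost

# Buffer smaller than the backlog from the stall: samples are lost.
print(samples_lost(buffer_size=8, stall_ticks=30))
# Buffer large enough to absorb the whole stall: nothing is lost.
print(samples_lost(buffer_size=64, stall_ticks=30))
```

The takeaway matches the post: sizing the acquisition buffer for the worst-case stall matters more than loop priorities.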

 

Hummer,

 

Playing with priority seldom gives a satisfactory result. I don't know if the KB article on that topic still exists, but it had a line in it that read something to the effect of "You should not have to change priorities, but if you are going to change priorities, which you shouldn't, then you would change them by going to..." It made me laugh.

 

Just trying to help,

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
0 Kudos
Message 4 of 7
(2,775 Views)

(Set Humor mode > on)

 

Every chance I get....

 

(...mode reset)

 

Thanks.

 

What does that mean for the effectiveness of the timed loop stuff?

 

Hummer1

 

(Trying to type and pat stomach at same time...without missing a keystroke.)

0 Kudos
Message 5 of 7
(2,767 Views)

Hummer1 wrote:

(Assuming losing data samples may be the problem...)  If you are using one while loop to run the data acq and another loop to run the file stuff, just set the priority of the data acq loop very high while setting the priority of the file loop very low.  I'm sure this overlooks many issues, but it is where I would start.

 

 


If the problem were CPU usage, then I would think loop priorities would have an effect.  But the original poster didn't mention that in the rather vague question, only memory usage, which I think would be only indirectly connected to loop priority.  For example, if a loop that generates large datasets and adds them to a queue has a higher priority than the loop that consumes the queue, so that the consumer loop doesn't run as often, then memory will fill up and the entire LabVIEW application will crash or error out.
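That failure mode can be sketched in a few lines of Python (a stand-in for LabVIEW queues; the step counts and sizes are made up for illustration): when the producer enqueues faster than the consumer drains, an unbounded queue's backlog, and therefore memory, grows with run length, while a bounded queue caps the backlog at the cost of discarding or delaying data.

```python
from queue import Queue

def backlog(producer_steps, consume_every, maxsize=0):
    """Producer enqueues one 'large dataset' per step; the consumer
    drains only every `consume_every` steps (it runs less often).
    Returns the backlog left in the queue, each item standing in
    for a big acquisition buffer.  maxsize=0 means unbounded."""
    q = Queue(maxsize=maxsize)
    for step in range(producer_steps):
        if maxsize and q.full():
            q.get()                    # bounded: oldest dataset dropped
        q.put(step)                    # producer adds a large dataset
        if step % consume_every == 0:
            q.get()                    # slow consumer drains a fraction
    return q.qsize()

# Unbounded queue: backlog grows with how long the program runs.
print(backlog(1000, 4))            # → 750 datasets still in memory
# Bounded queue: backlog is capped, so memory stays flat.
print(backlog(1000, 4, maxsize=10))
```

The same logic applies regardless of which loop has higher priority: if the consumer cannot keep up on average, only a bound on the queue (or on the data) prevents memory from filling up.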

 

But without any details about their application, it is all speculation.

0 Kudos
Message 6 of 7
(2,764 Views)
Got it ... Thanks.
0 Kudos
Message 7 of 7
(2,760 Views)