
Help needed: Memory leak causing system crashes...

Hello guys,

 

As suggested by Ben and Guenter, I am opening a new post in order to get help from more people here. A little background first...

 

We are doing LabVIEW DAQ using a cDAQ9714 module (with AI card 9203 and AO card 9265) at a customer site. We run the executable on an NI PC (PPC-2115), and a couple of times (3 so far) the PC simply froze (it went back to normal after a reboot). After monitoring the code running on my own PC for 2 days, I noticed there is a memory leak (memory usage increased 6% after one day of running). Now the question is: where is the leak?

 

As a newbie in LabVIEW, I tried to figure it out by myself, but not very successfully so far. So I think it's probably better to post my code here so you experts can help me with some suggestions. (Ben, I also attached the block diagram in PDF for you.) Please forgive me that my code is not written in good manner - I'm not really a trained programmer but more like a self-educated user. I put all the sequence structures in flat form as I think this might be easier to read, which makes it quite wide, really wide. 😛

 

This is the only VI for my program. Basically what I am doing is the following:

 

1. Initialization of all parameters

2. Read seven 4-20mA current inputs from the 9203 card

3. Process the raw data and calculate the "corrected" values (I used a few formula nodes)

4. Output seven 4-20mA currents via the 9265 card (then to the customer's DCS)

5. Data collection/calculation/output are done in a big while loop. I set the wait time to 5 secs to save the CPU some juice

6. There is a configuration file I read/save every cycle in case the system reboots. I also do data logging to a file (every 10 min by default).

7. Some other small things like local display and stuff.

 

Again, I know my code is probably a mess and hard to read for you guys, but I truly appreciate any comments you provide! Thanks in advance!

 

Rgds,

 

Harry

Message 1 of 23

Wow, what a monster. Let me guess: you are an experienced text-based programmer!

 

There is no way I can work with this, because I am on a laptop, but your code is at least 8x too large and you have 8x duplicate code for everything. I would suggest using arrays with 8 elements and auto-indexing loops to do the repetitive work. Entire screens of your code could be replaced by a simple "In Range and Coerce" operation.
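As a rough text-form illustration of what auto-indexing plus "In Range and Coerce" buys you (Python here only because I can't draw G in a post; the sample readings and the 4-20 mA limits are just placeholders):

# Text-form illustration only (in LabVIEW: an auto-indexing For Loop feeding
# "In Range and Coerce"). Sample readings and limits are placeholders.
readings_ma = [3.7, 12.5, 21.3, 7.2, 16.8, 4.0, 19.9]   # seven raw channels

def coerce(x, lo=4.0, hi=20.0):
    # clamp one value into the 4-20 mA range, like "In Range and Coerce"
    return min(max(x, lo), hi)

coerced = [coerce(x) for x in readings_ma]   # one loop instead of 7 copies of the code
print(coerced)   # [4.0, 12.5, 20.0, 7.2, 16.8, 4.0, 19.9]

One loop over an array of channels replaces every one of those duplicated per-channel blocks.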

 

Massive overuse of local variables certainly uses more memory than is really needed, but it cannot be the real problem.

 

How big is the 2D array in the intensity graph?

 

Do you really need to configure all your DAQmx tasks from scratch with every iteration?

 

What is the actual memory use? (You only mention a percentage increase.)

Message 2 of 23

Altenbach,

 

Just like Ben warned me: "So [you should] start a new thread and 'gird up your loins'. You will be beaten about the head and shoulders verbally." 🙂 And I don't even think I qualify to be called "a programmer" at all 🙂

 

Thanks for your suggestions, I'll look into the "In Range and Coerce" operation to see if I can make my monster slimmer. But to answer your questions:

 

1. The array in the intensity graph has 101 elements

2. In terms of memory usage, it increased from ~44MB to ~48MB after 24 hrs.

 

BTW, I don't like using local variables all the time either. How can I avoid that?

 

Harry

Message 3 of 23

Well, I'll at least give you points for neatness. However, that is about it.

 

I didn't really look through all of your logic, but I would highly recommend that you check out the examples for implementing state machines. Your application suffers greatly in that once you start, you have basically jumped off a cliff: there is no way to alter your flow, and once in the sequence structure you MUST execute every frame. If you use a state machine architecture, you can take advantage of shift registers and eliminate most of your local variables. You will also be able to stop execution when necessary, such as on a user abort or an error. Definitely look at using subVIs, and try to avoid implementing most of your program in formula nodes - you have basically written most of your processing there. While formula nodes are easier for very complex equations, most of what you have can easily be done in native LabVIEW code. Also, if you create subVIs you can iterate over the data sets; you don't need to duplicate the code for every data set.
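To give a rough idea in text form (since I can't paste a block diagram here): a LabVIEW state machine is a While Loop around a Case Structure, with the state and the data carried in shift registers. The Python below is only an analogy of that pattern - the state names and placeholder values are made up, not taken from your VI.

# Text-language analogy of a LabVIEW state machine (While Loop + Case Structure,
# with the data carried in a shift register instead of local variables).
# State names and placeholder values are invented, not taken from the posted VI.
from enum import Enum, auto

class State(Enum):
    INIT = auto()
    ACQUIRE = auto()
    PROCESS = auto()
    OUTPUT = auto()
    STOP = auto()

state = State.INIT
data = {}                                     # plays the role of the shift register

while state != State.STOP:
    if state == State.INIT:
        data["params"] = {"gain": 1.0}        # initialization
        state = State.ACQUIRE
    elif state == State.ACQUIRE:
        data["raw"] = [4.0] * 7               # placeholder for the DAQ read
        state = State.PROCESS
    elif state == State.PROCESS:
        data["out"] = [x * data["params"]["gain"] for x in data["raw"]]
        state = State.OUTPUT
    elif state == State.OUTPUT:
        print(data["out"])                    # write outputs, log, update display...
        state = State.STOP                    # or back to ACQUIRE; any state can
                                              # jump to STOP on user abort or error

The key point is that every iteration decides what the next state is, so you can bail out, retry, or skip steps instead of being forced through every sequence frame.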

 

I tell this to new folks all the time: take some time to get comfortable with dataflow programming. It is a different paradigm from sequential text-based languages, but once you learn it, it is extremely powerful. Use your data flow to control execution rather than relying on sequence frame structures. A state machine will also help quite a bit.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 4 of 23

As Altenbach said, you really shouldn't create and clear each channel after each use. Instead, I'd create the channels, start them, and put them into an array outside the loop, then access them and write in each loop iteration, and finally stop and clear them after the last loop iteration has finished.
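In text form the difference looks roughly like the sketch below. It uses NI's nidaqmx Python API only as a stand-in for the DAQmx VIs, and the channel names, ranges and loop count are assumptions, not taken from your VI:

# Sketch only: create the DAQmx tasks once, reuse them inside the loop, and
# clear them after the loop. nidaqmx is used as a stand-in for the DAQmx VIs;
# channel names, ranges and the loop count are assumptions.
import time
import nidaqmx

ai_task = nidaqmx.Task()
ao_task = nidaqmx.Task()
ai_task.ai_channels.add_ai_current_chan("cDAQ1Mod1/ai0:6",
                                        min_val=0.0, max_val=0.02)   # seven inputs
ao_task.ao_channels.add_ao_current_chan("cDAQ1Mod2/ao0:6",
                                        min_val=0.0, max_val=0.02)   # seven outputs

try:
    for _ in range(10):                       # the real program loops until stopped
        raw = ai_task.read()                  # no task creation inside the loop
        corrected = [min(max(x, 0.004), 0.020) for x in raw]
        ao_task.write(corrected)
        time.sleep(5)
finally:
    ai_task.close()                           # stop/clear the tasks exactly once
    ao_task.close()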

 

Opening and closing a resource so many times might be the cause of this, because it can seriously trash the system if it isn't good at handling it (i.e. if there's some bug in DAQmx, this usage pattern would magnify it).

 

To make your code easier to read, I'd make it more compact, i.e. remove the white space between controls and the surrounding structures (loops, case structures...), especially in the vertical direction.

 

I'd also point you to the following resources, which could help you profile the system as it's running and maybe see what exactly happens before a freeze: http://support.microsoft.com/kb/305610 and http://technet.microsoft.com/en-us/sysinternals/bb896645 The second one lets you filter so that you only display events initiated by the LabVIEW exe. However, you'll have to keep it displayed on screen, because the live log isn't saved; when the PC freezes, no log file will exist, so you'll have to look at the screen to see what occurred most recently.

 

I might also save a timestamped log entry after each major operation in LabVIEW, so that after a freeze you can see what happened last.
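Something like this minimal Python sketch of the idea (in the real program it would be LabVIEW file I/O; the file name and messages are arbitrary) - the point is to write and close the file on every call so the last line survives a freeze:

# Minimal illustration of a timestamped "what happened last" log: open, write
# and close (hence flush) on every call so the last line survives a freeze.
import datetime

def log_step(message, path="daq_trace.log"):
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    with open(path, "a") as f:
        f.write(f"{stamp}  {message}\n")

log_step("start AI read")
log_step("AI read done, start processing")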

 

Good luck,

M

Message 5 of 23

@myle00 wrote:

To make your code easier to read, I'd make it more compact, i.e. remove the white space between controls and the surrounding structures (loops, case structures...), especially in the vertical direction.




Don't remove all white space and cram everything together. That can be just as unreadable, if not worse, than excessive white space. You need to find a balance between too little and too much white space.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 6 of 23

It's structured, a solid foundation, and an excellent candidate for a state machine! Basically each frame would become a state, and it'd be a lot easier to follow.

 

Since the controls and indicators seem logically connected, I'd suggest grouping them into clusters, which you can easily send between states through a shift register.

 

However, back to the original question: I'd bet it's the constant recreation of DAQ tasks that's slowly eating memory. You should create the tasks during initialization, e.g. in an AE (action engine), and then read from and use that instead of creating new tasks inside the structure.
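In LabVIEW an action engine is a subVI with an uninitialized shift register holding the task references. As a rough text-language analogy (the class name, action names and channel string below are invented for illustration, and it assumes the nidaqmx Python API as a stand-in for DAQmx):

# Rough analogy of an action engine: one object owns the task reference and
# exposes "actions"; everything else calls it instead of creating new tasks.
import nidaqmx

class DaqActionEngine:
    def __init__(self):
        self._ai = None                       # plays the uninitialized shift register

    def do(self, action):
        if action == "init" and self._ai is None:
            self._ai = nidaqmx.Task()
            self._ai.ai_channels.add_ai_current_chan(
                "cDAQ1Mod1/ai0:6", min_val=0.0, max_val=0.02)
        elif action == "read":
            return self._ai.read()            # reuse the same task every cycle
        elif action == "close" and self._ai is not None:
            self._ai.close()
            self._ai = None

engine = DaqActionEngine()
engine.do("init")              # once, during initialization
values = engine.do("read")     # every loop iteration
engine.do("close")             # once, at shutdown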

 

Locals create data copies, but I don't think they create new ones on each loop iteration, so they shouldn't be the reason here.

 

Also, subVIs are good. 😉

 

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 7 of 23

I'm not guessing.

DAQmx leaks when you create/destroy any task. In some versions, the 4096th time you create a task you will error out.


"Should be" isn't "Is" -Jay
Message 8 of 23

Mark,

 

Thanks for taking the time to reply to my questions (so entry-level and probably looking hopeless to you). 😄

 

I have to admit that each of your answers opens a whole new topic for me. Originally I wanted to do something "quick n' dirty" - as long as it runs, I'm good. I am a researcher type of guy, not even familiar with the world of "programming". The thing is, when the software team pushed the ball to us, we did not say no - and this is the cost, I guess...

 

Anyhow, I plan to sit down and do some studying if I can survive the endless meetings... Thanks again!

 

Rgds,

 

Harry

Message 9 of 23

Myle00,

 

Thanks for your suggestions. I'll do some studying and see if I can make it better.

 

Harry

Message 10 of 23