07-31-2009 06:16 PM
Hello!
I have created a program that reads in a message from an RS-232 connection. The instrument sends the message every 30 seconds or so.
It seems that after one reading of the message my program starts to use excessive CPU, at levels higher than 50%, but this only happens when the reading case is active. In the reading case there is a file writer subVI, and it shows that the file only gets written to once per case, which is normal. That indicates there is no infinite loop there, but I still experience a high CPU load.
This seems odd, as I have another very similar program reading from RS-485 that does not have this problem.
I have tried profiling memory usage and highlighting execution, but this shows nothing.
I am using a Windows Vista machine with plenty of RAM and CPU speed. I have attached my VI and a profiler log file.
07-31-2009 10:58 PM
Unfortunately you have not attached the subVIs, so we cannot see the code.
However, TYPICALLY (and I really hate to troubleshoot the general case), high CPU usage results from forgetting to put a "Wait Until Next ms Multiple" in your loops.
LabVIEW will demand as much CPU time as possible unless it is throttled back. Yes, it is still possible to make LV code that is optimized for CPU usage, but LV does not default to optimized behaviour.
Try placing the "metronome" delay in your loops. It is found on the Timing palette. If you look at the help, you'll find that even wiring a 0 to this function releases the thread for other processes (and costs time to reacquire service from the processor).
LabVIEW is a hog, but only if you let it.
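The point about the delay applies to any polling loop, not just LabVIEW. Here is a minimal Python sketch of the idea (LabVIEW is graphical, so this is only an analogy; `poll_with_delay` and its stubbed status check are hypothetical names, not anything from the attached VI):

```python
import time

def poll_with_delay(n_iterations, delay_s):
    """Poll a (stubbed) status check, sleeping between iterations.

    The sleep releases the thread back to the OS scheduler, which is
    roughly what LabVIEW's "Wait Until Next ms Multiple" node does for
    a loop. Remove it and the loop spins at ~100% of one core.
    """
    start = time.monotonic()
    for _ in range(n_iterations):
        _status = True          # stand-in for checking the serial port
        time.sleep(delay_s)     # without this, the loop burns CPU doing nothing
    return time.monotonic() - start

# 50 iterations with a 1 ms delay take at least 50 ms of wall time,
# during which the CPU is free for other processes.
elapsed = poll_with_delay(50, 0.001)
```

The trade-off is latency, not correctness: with the delay, new data is noticed at most one delay period late, which is negligible against a message that arrives every 30 seconds.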
07-31-2009 11:25 PM
I suggest you look over the Serial Write and Read examples in the example finder. Also look at other serial port applications posted to the forums. You have a lot of things going on that are very odd.
You set the I/O buffer to 3993. Why that number? It is very unusual. You do have a time delay in there pacing your overall loop, but 1664 ms is also a very unusual and precise number. Why that wait?
You flush the buffer if either 240 seconds have passed since a certain condition was met, OR if the number of bytes in the buffer is greater than 3993. The second condition can never happen because you set the size of the buffer to 3993. To have 3994 or more bytes come in will overflow the buffer, generating VISA errors, and you will still have only 3993 in the buffer.
That certain condition I mentioned was when the bytes in the port equals 3993. That means when you have exactly filled up your buffer. If another byte sneaks in before you read the port, then you will overflow the port again generating a VISA error.
Why are you running an application that tries to so precisely fill your serial port before acting on it, and then needs to read all the data before the next byte comes in? One of the key ideas of having a serial buffer is that it is a "buffer": you provide room for the data in it to grow as it comes in and shrink as you read it out. If you don't read it out until the buffer has precisely filled up, you don't actually have a buffer.
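The unreachable flush condition can be seen with a small Python model of the logic described above (a sketch, not the actual VI; `bytes_at_port` and `should_flush` are hypothetical names):

```python
BUFFER_SIZE = 3993  # I/O buffer size set in the VI

def bytes_at_port(bytes_arrived):
    """The driver buffer can never report more than its own capacity.

    Extra bytes are simply lost (in LabVIEW, with a VISA overflow
    error), so the reported count saturates at BUFFER_SIZE.
    """
    return min(bytes_arrived, BUFFER_SIZE)

def should_flush(bytes_arrived, seconds_since_full):
    """Model of the VI's flush condition as described in the thread."""
    timed_out = seconds_since_full > 240
    over_threshold = bytes_at_port(bytes_arrived) > BUFFER_SIZE  # never true
    return timed_out or over_threshold

# Even if 10000 bytes arrive, the port still reports only 3993,
# so the "> 3993" branch can never fire on its own.
assert not should_flush(10_000, seconds_since_full=0)
assert should_flush(3_993, seconds_since_full=300)
```

Sizing the buffer to at least a couple of messages (say, 2-4x the message length) makes the threshold test meaningful again and gives slack for a late read.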
How much data is actually sent every 30 seconds? Is it always the same number of bytes? Does it end with a termination character? It looks like the data is sent on its own and you don't have to send any kind of query command to the device?
There are several different ways of handling serial messages depending on the answers to the above questions. Unfortunately, what you have done so far would never be one of those ways. Fix the problems with the serial communication, and there is a chance your CPU problems may go away.
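For a fixed-length message with no termination character, one common pattern is to read whatever is available and accumulate until a full message is in hand, so the driver buffer stays mostly empty. A minimal sketch (`read_fixed_message` and `read_fn` are hypothetical names; `read_fn` stands in for VISA Read or pyserial's `Serial.read`):

```python
def read_fixed_message(read_fn, msg_len, max_polls=1000):
    """Accumulate chunks from read_fn() until msg_len bytes are in hand.

    read_fn is any callable returning the bytes currently available.
    Reading as data arrives, instead of waiting for the port buffer to
    fill exactly, means a slightly late read never overflows it.
    """
    buf = bytearray()
    for _ in range(max_polls):
        buf.extend(read_fn())
        if len(buf) >= msg_len:
            return bytes(buf[:msg_len])
    raise TimeoutError("message did not arrive")

# Fake port delivering data in 100-byte chunks, for demonstration.
message = bytes(range(256)) * 16  # 4096 bytes, more than one message
chunks = iter(message[i:i + 100] for i in range(0, len(message), 100))
result = read_fixed_message(lambda: next(chunks, b""), 3993)
```

Any bytes beyond `msg_len` would belong to the next message and should be kept, not discarded; that bookkeeping is omitted here for brevity.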
07-31-2009 11:35 PM
Nicely summed up!
Your analysis and questions exactly mirrored the ones I have after looking more into the source.
I'll quit for a bit and let you guide the OP.
07-31-2009 11:43 PM
Jeff Bohrer wrote: Nicely summed up!
Your analysis and questions exactly mirrored the ones I have after looking more into the source.
I'll quit for a bit and let you guide the OP.
Thanks. I'm getting ready to go to bed for the night, so if more questions come up, feel free to contribute.
My thoughts are that the program architecture makes no sense for how to handle a serial port. It's very likely that the serial port is overflowing. The OP should probably put an indicator on the error wire inside the loop, or put some probes on it in different locations to see if the VISA functions are generating errors.
I'm not sure if a serial port overflow would affect the CPU usage, but I think it's a definite possibility that an overflow could be generating numerous interrupts that the CPU has to handle.
08-01-2009 02:36 PM
So the instrument does send the message by itself, every 30 seconds, and the message is always exactly 3993 bytes long. That is where that number comes from. Sometimes there is an overflow of bytes, so that's why I have the buffer flush there.
The specific wait time of 1664 ms is the time it takes to read in the 3993 bytes.
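The relationship between message size, baud rate, and wire time is simple arithmetic. The thread never states the instrument's baud rate, so the 24000 bps below is not from the original posts; it is simply the rate at which 3993 bytes take about 1664 ms, used here only to illustrate the calculation:

```python
def transfer_time_ms(n_bytes, baud, bits_per_byte=10):
    """Wire time for n_bytes at a given baud rate.

    10 bits per byte assumes the common 8N1 framing:
    8 data bits plus one start and one stop bit.
    """
    return n_bytes * bits_per_byte / baud * 1000

# 3993 bytes * 10 bits / 24000 bps = 1663.75 ms, matching the 1664 ms wait.
t = transfer_time_ms(3993, 24000)
```

A hard-coded wait tuned this tightly leaves no margin at all; polling with a short delay until the full message has accumulated is more robust than waiting exactly the transfer time.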
I have checked the error messages, and there seem to be none 99% of the time.
The subVIs are simple VIs that only parse the message from the serial port. It's a simple splitting up of the array in a very precise manner, where each byte corresponds to a different measurement.
The high CPU loading actually comes in when I read the serial port, and stays high until there is new information in the serial port.
This is one of my first attempts at serial communication with LabVIEW, so I am still trying to understand how this functions. Thank you again for all your help.
08-01-2009 03:38 PM
08-03-2009 03:16 PM
Hello!
After trying all the recommendations you gave me, it seemed the program was still using an enormous amount of CPU. I looked a little more deeply into it and finally found that it was in fact not the serial read that was taking all the CPU, but my file-writing subVI, which was stuck in an infinite loop trying to find the names of all the different channels being read.
So I must thank you all for your valuable information; now at least I have a program that looks much better and is a bit more robust.
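The actual subVI was never posted, but the bug class, a lookup loop that never terminates when a name is missing, is worth illustrating. A hedged Python sketch (`find_channel_index` and the sample names are invented for this example):

```python
def find_channel_index(channel_names, target):
    """Find a channel by name with a loop that cannot get stuck.

    A while loop that only advances on a match spins forever when the
    target name is absent; a bounded for loop over the list visits each
    entry at most once and then fails fast.
    """
    for i, name in enumerate(channel_names):
        if name == target:
            return i
    return -1  # unknown name: report failure instead of looping forever

names = ["temperature", "pressure", "humidity"]
```

In LabVIEW terms, preferring a For Loop (or a While Loop with an explicit iteration limit wired to the stop condition) over an open-ended While Loop gives the same guarantee.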