03-29-2022 05:40 AM
Hello, I have a program that does a serial read from a source that sends messages of about 75 bytes (the size is variable), with a termination char "Á", every 100 ms.
I have one loop that reads the serial port and a second loop that, via global variables, decides how to process that data. One of the processing steps checks that messages arrive every 100 ms ±20%. If 5 consecutive messages do not meet that requirement, the subVI ends because it decides communications are lost (this is a project rule, I cannot change it). I also have a 500 ms timeout that triggers if no messages are received in that interval and likewise considers communications lost.
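For reference, the tolerance rule above can be sketched in Python (this is a hypothetical illustration of the rule, not the actual LabVIEW code; the function name and structure are my own):

```python
# Project rule described above: messages must arrive every
# 100 ms +/-20%; 5 consecutive out-of-tolerance intervals
# mean communications are considered lost.

NOMINAL_MS = 100.0
TOLERANCE = 0.20           # +/-20% -> accept 80..120 ms
MAX_BAD_IN_A_ROW = 5

def comms_lost(intervals_ms):
    """Return True if 5 consecutive intervals fall outside tolerance."""
    lo = NOMINAL_MS * (1 - TOLERANCE)   # 80 ms
    hi = NOMINAL_MS * (1 + TOLERANCE)   # 120 ms
    bad_streak = 0
    for dt in intervals_ms:
        if lo <= dt <= hi:
            bad_streak = 0              # a good interval resets the count
        else:
            bad_streak += 1
            if bad_streak >= MAX_BAD_IN_A_ROW:
                return True
    return False
```

Note how the "freeze" described below breaks this: a burst of messages read back-to-back produces several intervals far below 80 ms in a row, which trips the rule even though the source itself never stopped.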
In my situation I can confirm my current source (used for testing the program) sends messages every 99-101 ms, but when read, the intervals range from 95-105 ms, which is not ideal and I would like to reduce. This is still acceptable, and it leaves margin to detect when a source goes outside the 100 ms ±20% tolerance (when the source is unknown). The program usually reacts quickly once it gets the termination character and reads the message, with a variable delay of a few ms, which is probably why the intervals range from 95-105 ms.
What is really a pain, and what I have not been able to solve, is that rarely the termination char is received but the VISA Read "freezes". This delays the read by many ms (even a few seconds) and messages accumulate at the port. Then VISA reacts again and reads everything in the port, which is a bunch of accumulated messages. Sometimes the timeout triggers before that; other times, 4 messages (400 ms) are not read during the "freeze", then all read instantly once unfrozen, so the program decides the interval requirement is not met for those 4 messages (far less than 80 ms) and considers communications lost. This is what I need to fix but have desperately failed to, which is why I am looking for help.
My theory is that LabVIEW normally issues Windows a request to read the port, and that is where the variable read times come from, because Windows answers the request when it decides it is convenient. Usually this is answered within a few ms. But sometimes, for reasons I cannot tell, Windows gets really busy (I could not identify any cause; it is not in the program) for hundreds of ms or even a few seconds and cannot answer LabVIEW's request, making a mess of all the interval records and checks. I have set the priority of LabVIEW and the VI to maximum, which helped quite a lot, but I still have the problems mentioned above.
Is there a way to set the VISA Read priority extremely high, above everything else? Or is there a way to have a real-time VISA Read? I have tried with a real-time loop but it does not fix anything.
I cannot attach the real file, but I have attached a simplified copy. Do not get bogged down in the details, as it is not the real program; it just shows superficially how it works. I also attached a log file of the received messages, with the time of each message, in which those "freeze" events happen.
Thank you a lot in advance; any advice or tip helps me a lot, as I am really desperate trying to fix this and it feels extremely frustrating at this point.
03-29-2022 07:11 AM - edited 03-29-2022 07:12 AM
Don't decode the message in the same loop that reads it. That will hold up the loop until it is finished processing the message. Look at the Producer/Consumer design pattern, where all the producer does is grab messages; it is the consumer's job to decode and process them.
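A minimal sketch of that pattern in Python (the program under discussion is LabVIEW; this only illustrates the structure, with a plain list standing in for VISA Read and `.upper()` standing in for real message decoding):

```python
import queue
import threading

def producer(source, out_q):
    """Fast loop: only grab complete messages and queue them."""
    for msg in source:        # stands in for VISA Read up to the term char
        out_q.put(msg)
    out_q.put(None)           # sentinel: tells the consumer to stop

def consumer(in_q, results):
    """Slow loop: decode/process messages without holding up the reader."""
    while True:
        msg = in_q.get()
        if msg is None:
            break
        results.append(msg.upper())   # placeholder for real decoding

q = queue.Queue()
results = []
t = threading.Thread(target=consumer, args=(q, results))
t.start()
producer(["msg1", "msg2", "msg3"], q)
t.join()
```

In LabVIEW the equivalent is two parallel while loops connected by a Queue: the read loop enqueues each message (ideally with a timestamp taken at read time), and the processing loop dequeues at its own pace.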
03-29-2022 08:42 AM
@Àlex wrote:
What is really a pain, and what I have not been able to solve, is that rarely the termination char is received but the VISA Read "freezes". This delays the read by many ms (even a few seconds) and messages accumulate at the port. Then VISA reacts again and reads everything in the port, which is a bunch of accumulated messages. Sometimes the timeout triggers before that; other times, 4 messages (400 ms) are not read during the "freeze", then all read instantly once unfrozen, so the program decides the interval requirement is not met for those 4 messages (far less than 80 ms) and considers communications lost. This is what I need to fix but have desperately failed to, which is why I am looking for help.
That is just a normal Windows issue. Windows randomly takes over and does stuff all the time. So you cannot trust timing measurements when it comes to applications running on Windows.
@Àlex wrote:
I have one loop that reads the serial port and a second loop that, via global variables, decides how to process that data.
Stop that! Use a Queue to pass the data to the processing loop. This way you can make sure you process all of the data.
03-29-2022 08:47 AM
This is exactly the kind of application that requires an RTOS, which can prioritize tasks and complete them in the required time.
When running on a typical Windows system, you cannot guarantee execution times, since the OS is multitasking and you have no control over it.
03-29-2022 10:03 AM
@santo_13 wrote:
This is exactly the kind of application that requires an RTOS, which can prioritize tasks and complete them in the required time.
When running on a typical Windows system, you cannot guarantee execution times, since the OS is multitasking and you have no control over it.
True enough, but with the proper architecture, I believe the OP can achieve their definition of "good enough".
03-30-2022 01:56 AM
I will try to apply your advice, and crossrulz's, and hope it improves things.
Thank you for your answer, really appreciated!
03-30-2022 01:58 AM
Do you think the problem would not happen with this program running on a CompactRIO? The end goal is to translate it to NI hardware, so I would not worry much about this issue if it will be solved there in the near future.