I need to read a message from a serial port and I'm having some problems. I used the advanced serial communication example and came up with the attached program. Everything works perfectly fine except that if no message is sent within the 10-second timeout, the VISA Read outputs an error. I know I can normally check that there are bytes at the port before I do a read operation, which solves this problem. However, when using the termination character to end the read, the Bytes at Port property always returns 0, so I can't use it as a condition for reading the port.
Is there a way to prevent the serial read from timing out in the attached program? Thank you.
Why are you setting the termination character and the buffer size with each iteration? These only need to be set once, and '0' is a very odd termination character. With a case structure around the VISA Read that checks whether the available bytes equal 0, your timeout error goes away.
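For anyone following along in a text language, here is a rough Python analogue of that pattern, only calling the read when bytes are waiting so an idle port never triggers a timeout. `FakePort` is a hypothetical stand-in for the VISA session, used only for illustration:

```python
# Stand-in for the VISA session: Bytes at Port + VISA Read analogues.
class FakePort:
    def __init__(self, data=b""):
        self.buffer = bytearray(data)

    @property
    def bytes_at_port(self):          # analogue of the Bytes at Port property
        return len(self.buffer)

    def read(self, count):            # analogue of VISA Read
        chunk = bytes(self.buffer[:count])
        del self.buffer[:count]
        return chunk

def poll_read(port):
    # Case-structure analogue: read only when bytes are available,
    # otherwise do nothing this iteration, so no timeout can occur.
    if port.bytes_at_port > 0:
        return port.read(port.bytes_at_port)
    return b""

port = FakePort(b"hello\r")
print(poll_read(port))   # b'hello\r'
print(poll_read(port))   # b'' (idle: read is skipped, no timeout)
```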
Thank you for the reply, I appreciate the help. The terminator as I understand it is not 0 but a CR. I attached a new VI based on how I understood your instructions which works better.
But this has two problems. First, since it only reads once at least one byte is present, the first byte seems to be dropped from the output. So if I type "hello" I get "ello".
Second, if I start typing and don't hit enter for more than 10 seconds it still times out. Is there a way around this?
Sorry, totally misread the term character.
You should be using a shift register to hold the string. I still don't see why you have the Set I/O Buffer Size in the Loop and I don't see how you could still be getting a timeout if the termination character is actually a CR. Perhaps this is incorrect. Also, why even have the extra property node for this? Just using the VISA Configure Serial Port is enough to set the term character.
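To make the shift-register idea concrete in a text language, here is a minimal Python sketch (hypothetical names) where the partial string is carried across loop iterations and each new chunk is appended rather than overwriting the previous read, which is exactly what the shift register does on the while loop:

```python
# Shift-register analogue: carry the partial message between iterations
# and append each chunk, so no bytes are lost while waiting for the
# termination character.
def accumulate_until_terminator(chunks, terminator="\r"):
    message = ""                      # shift register initial value
    for chunk in chunks:              # each iteration reads one chunk
        message += chunk              # append instead of overwrite
        if terminator in message:
            return message.split(terminator, 1)[0]
    return message                    # terminator never arrived

# Chunks as they might trickle in from the port:
print(accumulate_until_terminator(["h", "ell", "o\r"]))  # hello
```

If each read overwrote the previous result instead of appending, only the final chunk would survive, which is how characters get dropped.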
I was going off the example provided by NI and must have misunderstood what the serial buffer is there for. If I remove it from my code it works exactly as it should. I'll also get the property node stuff cleaned up, that also came from the example.
Thank you so much for the help, looks like I am good to go.
Actually, it looks like I spoke too soon; I could have sworn it was working. I am still having the problem that if I type a character and don't hit enter (the termination character), it will time out after the 10-second default timeout is passed. Any ideas?
I have no idea where you are typing from. Is this program sending data anyway without the CR you provide? If you are seeing bytes at the receiving end, that would be the only thing that makes sense. Bytes at the port but no termination character would give you a timeout when you are specifying a large number of bytes like you do. The solution would be to wire the number of available bytes to the VISA Read instead of using a constant.
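A rough Python sketch of that suggestion, using a plain byte buffer as a stand-in for the port: request exactly the bytes that are waiting instead of a large constant, so the read returns immediately rather than blocking until a timeout:

```python
# Analogue of wiring Bytes at Port into VISA Read's byte count:
# read exactly what is waiting, so the call never has to block.
def read_available(buf):
    count = len(buf)                  # Bytes at Port analogue
    chunk = bytes(buf[:count])        # read exactly that many bytes
    del buf[:count]
    return chunk

waiting = bytearray(b"meas")          # bytes at the port, no CR yet
print(read_available(waiting))        # b'meas' returned at once, no timeout
```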
Let me give a bit more detail about the application. I want to be able to get any number of characters as a string that someone inputs into the serial port. This will usually be a computer, so I can expect the commands to be valid, but I want to provide some additional redundancy in case a person at a terminal is writing commands. The commands might be "measure 110" or "set 23 34", so the number of bytes being sent will depend on the command. I used 1000 as the byte count on the read just to provide as much room as possible; setting it to 100 doesn't really make a difference in this application, the same thing happens. If someone starts typing in a terminal "measure 110" (for example) but forgets to hit enter for the termination character, the program will time out, since the error cluster is connected to the stop button of the loop. When this happens the entire program needs to be restarted. Not really ideal.
If I disconnect the error cluster from the stop button I noticed it works perfectly fine. After 10 seconds of the terminator character not being entered the buffer clears and the loop simply restarts. What I worry about is if there is some other fatal error (such as not being able to connect to the serial device) that the program will keep running. Is this something I even need to worry about?
Attached is a screenshot of my latest program which works great but like I said I'm just worried about not wiring the status from the error cluster to the stop button since that seems to be what I've always been taught to do. If I wire the byte count to the number of bytes property it will only collect the number of bytes I set, since this will vary based on the command that won't really work in this application.
First, if you wire the number of bytes to the VISA Read, it will read each and every byte that is sent. As long as you disable the termination character, the only thing that you will get is a warning.
If the termination character is enabled and this 'terminal' sends data without the user hitting enter, then perhaps you want to get a timeout error. There is no reason, however, for the program to stop if you properly handle the error. A state machine architecture would be better than your simple loop. You can check the error code and branch to different states depending on the code.
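As a sketch of that branching logic in Python (assuming, as is standard for VISA, that a timeout surfaces as error code -1073807339, VI_ERROR_TMO; the state names are hypothetical), timeouts loop back to reading while anything else goes to cleanup:

```python
# State-machine branch on the error code after a read:
# a timeout is benign (retry), any other error is fatal (shut down).
VISA_TIMEOUT = -1073807339            # VI_ERROR_TMO (assumed standard code)

def next_state(error_code):
    if error_code == 0:
        return "process command"      # good read: handle the message
    if error_code == VISA_TIMEOUT:
        return "read"                 # benign timeout: just try again
    return "close and stop"           # anything else is fatal

print(next_state(0))                  # process command
print(next_state(VISA_TIMEOUT))       # read
print(next_state(-1073807298))        # close and stop
```

This way the loop survives a forgotten enter key but still stops on a genuinely fatal error such as losing the serial device.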
I have the same problem as you. I can't read my first character. For example, I should read "hello", but I only see "ello". You didn't say how you solved this. Have you figured it out? Can you share the solution with me? Thank you very much.