
Trouble communicating with COM port

Hello All,

 

I am having an issue with erratic reading and logging of COM port data. 

 

The connection is as follows: my computer -> USB cable -> FT232R board -> device under test.

 

The device is set to output a message containing ADC values every 14 ms. A standard message looks like:

 

{"type": "log", "msg": "83948 -467237 -1891949 -3490623 1435677 1 0", "level": 20, "time_ms": 83948, "path": "REL_main.cc", "line": 82}

 

However, every once in a while I get a message that looks like:

 

{"type": "log", "msg": "84746 -467120 -1892005 -3490712 1435640 1 020, "time_ms": 91508, "path": "REL_main.cc", "line": 82}

 

or even 

 

{"type": "log", }

 

As you can see from the VI I have attached, I have tried to eliminate these "bad" messages by only storing data from messages whose byte count falls within the expected range. The range is large because the message contains a timestamp that grows over time.

 

Can someone help me understand why this is happening, and what the fix is? My guess is that my computer doesn't produce a stable enough clock signal, but of course I'm a rookie and I could be wrong. Please help. Thank you in advance.

 

 

Message 1 of 18

P.S. When I use a terminal emulator like Tera Term to communicate with the device, I do not have any issues recording the messages. Is there some kind of buffer Tera Term has that I have not implemented in my LabVIEW VI?

Message 2 of 18

Right now there are many possible reasons for unreliable logging of serial data, too many to usefully speculate about.

 

As a self-described rookie, here's a tip: divide and conquer.  And here's another tip: in graphical programming, neatness counts (for the sake of understanding and troubleshooting).  You don't get bonus points for making wires go zig-zagging and crossing one another all over the place.  Go, right now, into Tools->Options->Block Diagram and turn OFF the option for automatic wire routing.  It's truly a scourge that it's turned on by default.

 

So now, back to the divide and conquer thing.  Attached are some minimal mods I made.  You should recognize most of what's left as your code, just neatened up.  All the code does now is try to read from the serial port and log everything it gets.  This *divides* the problem so you can confirm whether or not you're *receiving* a sequence of bytes that looks correct.  If so, you can be more suspicious of what your code was doing to them afterward.

 

I also added error handling, including terminating the loop on an error.  Not ignoring possible errors is another important habit to develop.  I have a suspicion you're gonna see parsing errors from your "Scan from String" function.  It's awfully easy to get a format string very slightly wrong in a way that doesn't appear obvious.

 

I further accumulated an array full of the end result of all your parsing attempts.  You can look that over next to the raw file and see how well your parsing code did.

 

So again, divide and conquer.  Find ways to test out smaller parts of the overall program a little at a time.

 

 

-Kevin P

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 3 of 18

Instead of checking the length, parse the contents.  You have a pretty clear format there of what's acceptable.

 

Here's what I mean:

Kyle97330_0-1599097374241.png

 

You might need to refine it a bit more depending on the possible contents of the message.  For instance, if any data types or values have a comma in them, it would break this parser.
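The same idea can be sketched in text form. Here is a hedged Python analogue (not the attached LabVIEW snippet): since the device's messages are JSON, actually decoding each line and requiring the expected fields rejects truncated messages far more reliably than a byte-count range check. The field names are taken from the sample messages earlier in the thread.

```python
import json

def parse_message(line):
    """Return the parsed message dict, or None if the line is malformed.

    Instead of checking the byte count, parse the message and reject
    anything that does not decode or is missing an expected field.
    """
    try:
        msg = json.loads(line)
    except json.JSONDecodeError:
        return None
    # Require the fields a well-formed log message always carries.
    if not all(k in msg for k in ("type", "msg", "level", "time_ms")):
        return None
    return msg

good = '{"type": "log", "msg": "83948 -467237", "level": 20, "time_ms": 83948}'
bad = '{"type": "log", "msg": "84746 -467120 020, "time_ms": 91508}'  # truncated
```

With this approach, a truncated message like the second example simply fails to decode and is dropped, no matter what its length happens to be.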

Message 4 of 18

Hi damian,

 


@damianhoward wrote:

The device is set to output a message containing ADC values every 14 ms. A standard message looks like:

{"type": "log", "msg": "83948 -467237 -1891949 -3490623 1435677 1 0", "level": 20, "time_ms": 83948, "path": "REL_main.cc", "line": 82}

However, every once awhile I get a message that looks like 

{"type": "log", "msg": "84746 -467120 -1892005 -3490712 1435640 1 020, "time_ms": 91508, "path": "REL_main.cc", "line": 82}


When the received string is missing data (marked in bold red) in the middle of the string, then most probably it's not LabVIEW's fault…

You are using the default LF term char (0x0A): do your messages end with a LF?

 

You should also learn about the difference of using file references vs. file paths: if you would use the file reference for WriteTextFile then you would NOT need to collect your data with a feedback node…

 


@damianhoward wrote:

As you can see from the VI I have attached.


Saving the VI with some useful default data would help even more!

Stop the VI once you receive some messages and set all values on the front panel to default (Edit menu!). Then save the VI and attach it again…

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 5 of 18

I made a slight mod to the VI provided by Kevin P. I removed the parsing-error input from the compound arithmetic block that stops the loop. I did this because I was only able to get one iteration of the loop to execute before it stopped. This happened regardless of the state of the device (actively streaming or fresh reset).

 

After performing this operation, I got the VI to run, with the outcomes falling into two different buckets:

 

Bucket 1: I run the VI while the device is actively streaming. I get some output, but it eventually stops and errors out with error code -1073807252. I could not find what that meant when I searched Google. The modified VI I have attached (thanks, Kevin) has the values from one of these runs saved as default. I also included the text file with the raw output.

 

Bucket 2: I keep the device under test in reset mode (push button depressed). I start the VI, and before it times out I release the button and let the device output data. When I do this everything seems to run fine, at least for the 100 seconds I recorded.

 

Can someone shed some light on why everything seems to be okay when running from a reset, and why this method doesn't work if the device is actively streaming?

 

 

 

 

Message 6 of 18

@damianhoward wrote:

Bucket 1: I run the VI while the device is actively streaming. I get some output, but it eventually stops and errors out with error code -1073807252. I could not find what that meant when I searched Google. The modified VI I have attached (thanks, Kevin) has the values from one of these runs saved as default. I also included the text file with the raw output.


Hint: You can go to Help->Explain Error to get an explanation of your error.  In this case it is a buffer overrun error.  This happens when too much data comes in before you read it.  The first simple solution is to remove the wait inside of your loop.  You can do this since the loop will be limited by the message rate due to the VISA Read.  The other thing I would do is use the error wires to force the file to be opened/created before the serial port is configured (Data Flow!).
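The same loop structure can be illustrated in Python (a sketch only; with pyserial you would pass a real `serial.Serial` instance as `port`, with port name, baud rate, and timeout filled in for your setup). The point is that the blocking terminated read paces the loop by itself, so no explicit wait is needed:

```python
import io

def drain_stream(port, log_file):
    """Log every terminated message from `port` until a timeout.

    `port` is anything with a readline() that returns b"" on timeout.
    Note there is deliberately no sleep in the loop: the blocking read
    already paces it at the device's 14 ms message rate, and adding a
    wait only risks the receive-buffer overrun reported above
    (-1073807252).
    """
    count = 0
    while True:
        line = port.readline()       # blocks until '\n' or timeout
        if not line:                 # b"" means timeout: stop logging
            break
        log_file.write(line.decode("ascii", errors="replace"))
        count += 1
    return count

# Demonstration with an in-memory fake instead of real hardware:
fake_port = io.BytesIO(b'{"type": "log"}\n{"type": "log"}\n')
captured = io.StringIO()
messages = drain_stream(fake_port, captured)
```

Opening `log_file` before touching the port mirrors the data-flow advice above: the file is ready before the device can start filling the input buffer.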



There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 7 of 18

One problem I see is that you have an ever growing array of strings with that feedback node.  So the longer it runs, the more elements you have in the array.  As it grows, LabVIEW will slow down as it shuffles it around to find ever larger contiguous blocks of memory to store it.  If it takes too long to move, you'll get that error number which says there is a buffer overrun.  If you key that number into "Explain Error" under help, it will tell you what it means without turning to Google.
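In Python terms (a sketch of the fix, not the poster's VI), the cure is to write each message to disk as it arrives and keep only a bounded window in memory, so nothing grows without limit:

```python
import io
from collections import deque

# A fixed-size window replaces the ever-growing feedback-node array:
# once maxlen is reached, old entries fall off the front automatically.
recent = deque(maxlen=1000)

def log_message(line, log_file, window=recent):
    log_file.write(line + "\n")  # the durable copy goes straight to disk
    window.append(line)          # in-memory history stays bounded

# Demonstration: after 2000 messages, memory holds only the last 1000,
# while the file retains everything.
buf = io.StringIO()
for i in range(2000):
    log_message(str(i), buf)
```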

Message 8 of 18

@damianhoward wrote:

I am having an issue with erratic reading and logging of COM port data.

My guess is that my computer doesn't produce a stable enough clock signal, but of course I'm a rookie and I could be wrong.

Honestly, I refuse to read code during troubleshooting until one can confirm that the problem is not electrical in nature. This sounds like an electrical noise or grounding issue. Can you post your wiring diagram and a picture of your setup if possible? Or at least describe your electrical setup, including your (electrical-noise) environment. What is the cable run like? Are you running the data cable next to some "noisy" stuff? Any loose connections? Is everything grounded? Are you sharing a ground for your reading that is also the ground for a high-noise device? The more information about your setup the better, and then we can filter out what is and is not useful for problem solving.

 

EDIT: It is best to chase down any possible electrical gremlins before you try to fix your problem with code. It is at least 100 times easier to fix an electrical problem than to live with it and try to work around it in code.

Message 9 of 18

On the one hand, I apologize for the incomplete and quickie mods I left you with.  As has been pointed out, the 'Wait' is unnecessary and the growing string array from the feedback node is problematic as well.

 

On the other hand, logging all the raw strings gave you a file with enough clues to help you solve this.  Or at least the first major part of this.  

   The first portion of bytes is only *part* of one of the device's data frames. For your string parsing to work, you need to make sure you only try to parse full frames.

 

A subtle thing is that the *actual* termination character is not a line feed as you have it configured now. It's really hex 03, aka ETX or "end of text". The byte you see right after it all the time is hex 02, aka STX or "start of text". I can pretty much guarantee that the real message framing starts with an STX byte, then all the visible printable bytes, then a line feed, and finally an ETX byte.

   I would configure the serial port to use ETX (hex 03) as the line termination character.  And then the format string for parsing should get rid of the leading ETX char (while leaving the STX char that comes next).

   It's a kinda subtle thing, and even though you could probably "get away with" treating line feed as the termination char and including both ETX and STX in your parsing format string, I'd advise against it.  Your code's idea of a full message frame really ought to match the device's, and that'll mean starting with STX and ending with ETX.
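To illustrate the framing idea outside LabVIEW (a Python sketch under the STX/ETX assumption above, not the modified VI), matching the device's framing means extracting only complete STX…ETX frames and carrying any partial frame over to the next read:

```python
STX, ETX = b"\x02", b"\x03"

def extract_frames(buffer):
    """Split raw bytes into complete STX...ETX frames.

    Returns (frames, leftover): each frame is the payload between an
    STX and the next ETX, and `leftover` is any trailing partial frame
    to prepend to the next read, so no message is parsed half-received.
    """
    frames = []
    while True:
        start = buffer.find(STX)
        if start < 0:
            return frames, b""             # no frame start yet
        end = buffer.find(ETX, start)
        if end < 0:
            return frames, buffer[start:]  # partial frame: keep for later
        frames.append(buffer[start + 1:end])
        buffer = buffer[end + 1:]

# A stale leading ETX (as described above) is skipped automatically:
frames, rest = extract_frames(b"\x03\x02HELLO\n\x03\x02PART")
```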

 

BTW, this framing issue is probably the reason why your experiment worked when you held the device in reset until after starting your code. The very first attempt should have produced a parsing error on the first read because you should not be receiving an ETX as the first char. Thereafter, you would have been stopping the program after receiving a line feed but just before the corresponding ETX arrived from the device. So it'd sit there in VISA's serial buffer. On your next run, that ETX would already be there, so once you released the device reset, it would deliver a data frame starting with STX. Your VISA Read would return the old stale ETX followed by all the bytes from the new data frame except the trailing ETX, which would arrive and sit in the VISA buffer until your 2nd read, where the same thing would happen again.

   Like I said, subtle.
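The off-by-one-frame effect is easy to simulate. Assuming frames of the form STX + payload + LF + ETX (the framing hypothesized above), a reader that splits on LF hands back the previous frame's ETX at the front of every read after the first:

```python
# Hypothetical device output: STX + payload + LF + ETX, repeated.
device_output = b"\x02frame1\n\x03\x02frame2\n\x03"

# What successive LF-terminated reads would return:
reads = device_output.split(b"\n")

# The first read looks clean, but every later read starts with the
# stale ETX (0x03) left over from the previous frame -- exactly the
# one-frame lag described above.
```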

 

Attached are a 2nd set of mods, still intended as a troubleshooting aid, not as final code.  Of particular note is that I put the VISA Read in a For Loop to limit iterations and array sizes.  I also set ETX as the line termination char.

 

I'd also note that it doesn't really make sense that you parse some of your numbers as floating point then convert them into integer strings.  They should be consistent, but since I have no way to know which numbers should be which data type, I left that stuff alone.

 

 

-Kevin P

Message 10 of 18