Termination of the VI after about 20 seconds of accepting values from a serial port transmitting continuous data.

Solved!

Hi,

 

I am using the data acquisition unit built into a Hairball 2 unit installed with a Zilla Controller on an electric vehicle. The Hairball lets other devices communicate with the controller via an RS-232 serial port. Its DAQ option displays information such as speed, voltage, etc.

 

I use Tera Term, a terminal emulator, to start communication with the Hairball. After navigating through the various menus, I can access the DAQ feature, which provides a continuous stream of status data for the various components of the vehicle. For example, 5B 01 0B C8 03 53 02 39 27 OMFS is a sample line from the DAQ module.

 

When the serial port is configured to send out such data, I close Tera Term and run the LabVIEW VI (Serial Read and Write to .tdms only.vi), which automatically stops after around 25 seconds (thereby recording 25 readings). It gives error -1073807252, "VISA: (Hex 0xBFFF006C) An overrun error occurred during transfer. A character was not read from the hardware before the next character arrived."

 

I am unable to change the output from the DAQ device. In fact, the DAQ device has no controls whatsoever, so all the required adjustments need to be made in LabVIEW.

 

Is there any solution to this problem? What could be the likely causes?

 

Thanks in advance for your help.

 

Best Regards,

Akhil Kumar Meesala (Mr.)

Year 4 | Undergrad | Mechanical Engineering

National University of Singapore

Email: akhil@nus.edu.sg

Mobile: (+65) 9326 7069

Message 1 of 6

Hi Akhil,

 

"What could be the likely causes?"

Probably it's the 1000 ms Wait in your main loop.

Why do you wait for 1000 ms and additionally check for "Elapsed Time" with 1000 ms?

When your device sends a lot of data and you only check for data every 1000 ms, buffer overflows may occur...

Possible solution: separate loops for reading the serial data and writing to the file! (Of course with a proper design pattern, as shown in the LabVIEW examples!)
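Since a LabVIEW block diagram can't be pasted as text here, below is only a rough sketch of the same producer/consumer idea in Python using the pyserial package; the port name, baud rate, logging duration and file name are placeholders, not values from your setup.

```python
# Rough producer/consumer sketch (Python + pyserial), analogous to two LabVIEW
# loops connected by a queue. Port, baud rate and file name are placeholders.
import queue
import threading
import time

import serial  # pip install pyserial

PORT = "COM1"            # placeholder: whichever port the Hairball is on
BAUD = 9600              # placeholder: match the Hairball's serial settings
OUTFILE = "daq_log.txt"  # placeholder log file

line_queue = queue.Queue()
stop_event = threading.Event()

def producer():
    """Read lines from the serial port as fast as they arrive."""
    with serial.Serial(PORT, BAUD, timeout=1) as port:
        while not stop_event.is_set():
            line = port.readline()      # returns b"" if nothing arrived in 1 s
            if line:
                line_queue.put(line)

def consumer():
    """Write queued lines to the log file, so file I/O never blocks the reader."""
    with open(OUTFILE, "ab") as log:
        while not stop_event.is_set() or not line_queue.empty():
            try:
                log.write(line_queue.get(timeout=0.5))
            except queue.Empty:
                pass

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
try:
    time.sleep(60)   # log for about a minute, then stop
finally:
    stop_event.set()
    for t in threads:
        t.join()
```

The point is simply that one loop does nothing but read, while the other handles the (potentially slow) file writes through a queue, which is what the LabVIEW Producer/Consumer design pattern template provides.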

 

 

P.S.:

Why did you open a new thread instead of sticking with your original one? It can get confusing when help on the same topic is scattered over several threads...

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 6

Hey GladW,

 

Thanks for the prompt response. They were actually two different problems with the same program, so I thought of starting a new thread 🙂

 

Anyway, the DAQ device sends output at a steady 10 lines per second, and I can't change that. So are you saying that if I increase the rate at which I read the data (say, every 500 ms), the problem should go away?

 

Is there any way to "refresh" the buffer after every reading is taken? That way, it would at least not store the unwanted readings it is getting from the DAQ.

 

How can one increase the buffer?

 

Also would a faster computer running the program help?

 

I'm a bit new at this, so I'm still figuring out my way. 🙂

 

Thanks.

 

Best Regards,

Akhil Kumar Meesala (Mr.)

Year 4 | Undergrad | Mechanical Engineering

National University of Singapore

Email: akhil@nus.edu.sg

Mobile: (+65) 9326 7069

Message 3 of 6
Solution
Accepted by topic author Akhil Kumar

Hi Akhil,

 

I would really appreciate it if you typed my nick correctly - it shouldn't be so hard to copy & paste 5 letters 😉

 

"So are you trying to say that if I increase the rate at which I accept the data, (say every 500ms) the problem should go away?"

Yes. The more often you read from the buffer, the less likely you are to get a buffer overflow...
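As a plain-text illustration only (not your VI), a minimal polling loop in Python/pyserial that drains whatever is in the receive buffer every 50 ms, instead of reading once per second, might look like this; the port name, baud rate and newline terminator are assumptions:

```python
# Minimal polling sketch (Python + pyserial): instead of one read per 1000 ms,
# drain everything currently in the receive buffer on every short iteration.
import time
import serial

with serial.Serial("COM1", 9600, timeout=0) as port:   # timeout=0: non-blocking
    buffer = b""
    while True:
        buffer += port.read(port.in_waiting or 1)       # grab all pending bytes
        while b"\n" in buffer:                          # process complete lines
            line, buffer = buffer.split(b"\n", 1)
            print(line.decode(errors="replace"))
        time.sleep(0.05)   # a 50 ms poll is far faster than 10 lines/s arrive
```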

 

"Is there anyway to "refresh" the buffer after every reading taken? ... How can one increase the buffer?"

Have you looked at the functions in the VISA -> "VISA Advanced" -> "Bus/Interface Specific" palette? It's all in your face 😉
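In case a text example helps: the two operations being hinted at there - flushing the stale contents of the receive buffer and asking the driver for a bigger one - look roughly like this in Python with pyserial. This is not NI's API; the port, baud rate and buffer size are placeholders, and set_buffer_size is only available on Windows builds of pyserial.

```python
# Illustration only: flush the receive buffer and enlarge it (Python + pyserial).
# The LabVIEW equivalents live in the VISA Advanced palette
# (VISA Flush I/O Buffer and VISA Set I/O Buffer Size).
import serial

port = serial.Serial("COM1", 9600, timeout=1)   # placeholders

# Discard whatever stale data is sitting in the receive buffer right now.
port.reset_input_buffer()

# Ask the driver for a larger receive buffer (pyserial: Windows only).
port.set_buffer_size(rx_size=8192)

port.close()
```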

 

"Also would a faster computer running the program help?"

Definitely NO! Any computer nowadays is fast enough to handle standard RS232 serial communication...

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 4 of 6

Hey GerdW,

 

I am really sorry, I must have been half asleep when I typed that (twice!!).

 

Thank you for your input. I will look into the VISA properties.

 

Thanks again for all your help.

 

Best Regards,

Akhil Kumar Meesala (Mr.)

Year 4 | Undergrad | Mechanical Engineering

National University of Singapore

Email: akhil@nus.edu.sg

Mobile: (+65) 9326 7069

Message 5 of 6

Dear GerdW,

 

I am using the VISA Set I/O Buffer Size function to set the buffer size. However, I am not too sure about the meaning of the mask input, which designates which buffer size to set:

16 = I/O Receive Buffer
32 = I/O Transmit Buffer
48 = I/O Receive and Transmit Buffer

 

In my present case, I am only receiving values from the serial port and not transmitting any data to the DAQ device, so should the mask just be 16? I just want to confirm.
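If I read the help text correctly, the mask is just a pair of bit flags, so the three documented values relate like this (a tiny sanity check in Python, nothing LabVIEW-specific):

```python
# The mask values from the VISA Set I/O Buffer Size help are bit flags:
IO_RECEIVE_BUFFER = 16    # size of the receive (read) buffer
IO_TRANSMIT_BUFFER = 32   # size of the transmit (write) buffer

# Combining both flags gives the third documented value.
assert IO_RECEIVE_BUFFER | IO_TRANSMIT_BUFFER == 48   # receive and transmit

# Receiving only, as in this case, would use just the receive flag: 16.
```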

 

Also, how do we determine the required size of the buffer? Can I just put in an arbitrarily large value like "99999"? (I apologise for my ignorance when it comes to such things.)

 

Thank you for your help. There has been a malfunction in the car on which I conduct the experiments, so I cannot test the software at the moment.

 

Best Regards,

Akhil Kumar Meesala (Mr.)

Year 4 | Undergrad | Mechanical Engineering

National University of Singapore

Email: akhil@nus.edu.sg

Mobile: (+65) 9326 7069

Message 6 of 6