
How to best apply a FIFO buffer to serial port data?

Solved!

Hello

 

I think I am missing something very basic, so please forgive my ignorance. All I want to do is establish a simple FIFO buffer with a quality-control logic sequence before the data is displayed onscreen. I cannot figure out the best way to continually loop through the incoming data.

 

The attached VI should explain most of it. My instrument streams 25 bytes a second through the serial port. I have heard it's better to put the VISA Read outside the loop, but I cannot get my VI to continuously read new data if I do that (it stops after the specified buffer length).

 

Would properly using a feedback node be a possible solution?

 

Thanks for the help. I'm using LV2009.

Message 1 of 5

You already have a FIFO: the COM port buffer whose size you set to 4096.

 

Set the termination character for your COM port read to 0x04. The VISA Read will then read up to the 0x04 and return the bytes up to and including the 0x04, leaving any other bytes in the COM port FIFO.

 

Set the timeout on the COM port to something greater than the time it takes to send one full stream of data.

That could be the 1000 ms that you have in the first frame. Set the timeout to... I don't know... let's say 2000 ms.

 

Delete the first frame delay and go straight to a VISA Read with the bytes to read set to something far greater than what you are looking for.

You currently have 75 bytes to read wired. If the length of the data you are looking for is 25, then set the bytes to read to something greater... I don't know... let's say 50 bytes.

 

If the termination character is correct, the VISA Read will wait up to 2 s and up to 50 bytes looking for the first 0x04, and return only up to the 0x04, leaving everything else in the FIFO until you loop around to read again. You can then check whether you got the "0119" the first time around. If the bytes read were fewer than 25, then you jumped in in the middle of a message, missed the beginning, and most likely did not get the "0119". The next time around, and from then on, you should get the full 25 bytes with each read.
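The read-up-to-terminator logic described above can be sketched in plain Python (no hardware, no VISA). The 0x04 terminator, the "0119" header, and the 25-byte frame length come from this thread; the buffer contents below are made up for illustration.

```python
# Sketch of the framing logic: pop bytes up to and including the 0x04
# terminator (like a VISA Read with the termination character enabled),
# leaving the rest in the FIFO, then quality-check the frame.

TERM = 0x04
FRAME_LEN = 25
HEADER = b"0119"

def read_frame(fifo: bytearray) -> bytes:
    """Return one terminated frame from the FIFO, consuming it."""
    try:
        end = fifo.index(TERM) + 1
    except ValueError:
        return b""            # the real read would block until timeout
    frame, fifo[:] = bytes(fifo[:end]), fifo[end:]
    return frame

def frame_ok(frame: bytes) -> bool:
    """Quality check: expected length and expected header."""
    return len(frame) == FRAME_LEN and frame.startswith(HEADER)

# Simulate jumping in mid-message: a partial frame, then a full one.
full = HEADER + b"x" * (FRAME_LEN - len(HEADER) - 1) + bytes([TERM])
fifo = bytearray(full[10:] + full)   # start mid-stream

first = read_frame(fifo)    # short fragment: fails the check
second = read_frame(fifo)   # complete 25-byte frame: passes
print(frame_ok(first), frame_ok(second))
```

As described above, the first read after jumping in mid-stream returns a short fragment that fails the check, and every read after that returns a full, valid frame.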

Omar
Message 2 of 5
Solution
Accepted by topic author CodeMunkee
Message 3 of 5

Thanks for the clarification on the serial port buffer - very helpful. It seems the termination character 0x04 works and the subsequent logic sequence to check for a valid data packet also works. Unfortunately, reconfiguring this way resurfaces another dilemma - the one that I side-stepped by adding the 1-second delay.

 

My instrument is very slow (25 bytes of data per second) and LabVIEW is very fast. When I don't add a 1-second wait to the code, data streams to the screen so fast it's unreadable (via charts and graphs) and difficult to analyze. When I add the 1-second wait (or anything incrementally smaller), the wait seems to interfere with the valid data packets/buffer, resulting in a dropped data packet once every 10-20 seconds or so.

 

If I plug my instrument directly into HyperTerminal, I watch 25 bytes/sec come through. When I configure this loop in LabVIEW, I get data dropouts. Why is this so? Is there a better way to slow LabVIEW down? This piece of code seems to perform perfectly apart from operating too fast! (I bet you don't get that complaint very often.)
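The usual LabVIEW answer to this "fast reader, slow display" dilemma is a producer/consumer structure: one loop reads frames as fast as they arrive and enqueues them, and a second loop dequeues and displays at its own pace, so slowing the display no longer drops serial data. A plain-Python stand-in of the idea (the frame contents are invented; in LabVIEW this would be two while loops joined by Obtain Queue / Enqueue Element / Dequeue Element):

```python
# Producer/consumer sketch: the queue buffers the rate difference
# between the read loop and the display loop.
from queue import Queue

q: Queue[bytes] = Queue()

# Producer loop: runs at the instrument's rate, never delayed.
for i in range(5):
    q.put(b"0119" + bytes([i]))     # made-up frames for illustration

# Consumer loop: could sleep 1 s per iteration without losing anything,
# because frames wait in the queue instead of the serial buffer.
shown = []
while not q.empty():
    shown.append(q.get())

print(len(shown))   # all 5 frames survive
```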

 

Let me know if I should get more specific. Thanks!

Message 4 of 5

How many bytes are read each time from the VISA read? 25?

 

When you start your VI, has your instrument already been running and sending out data? If so, there could already be up to 4096 bytes of old data in the COM port buffer, which the LabVIEW VI will read in as fast as it can. Once it works its way through the old data, it should slow down and match the speed of your instrument.

 

Put a "VISA Flush I/O Buffer" VI before the while loop to clear out all old data.

 

You want LabVIEW to read and process the message as fast as it can and then wait on your instrument, either via the timeout setting or by checking Bytes at Port.

If you use a time delay set to 1 s to try to sync up with 25 bytes/sec, you are sure to miss some bytes every now and then.
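Both points above, flushing stale data before the loop and letting the blocking read pace the loop instead of a fixed 1 s wait, can be sketched in plain Python. Everything here (`FakePort`, `flush_input`, `read_frame`) is a made-up stand-in for illustration, not a real VISA API:

```python
import time

class FakePort:
    """Stand-in for a serial port; not a real VISA or pyserial object."""
    def __init__(self, stale: bytes):
        self.buffer = bytearray(stale)   # leftover data from before the run
    def flush_input(self):
        self.buffer.clear()              # plays the role of VISA Flush I/O Buffer
    def read_frame(self) -> bytes:
        time.sleep(0.01)                 # stands in for blocking until a frame arrives
        return b"0119" + b"x" * 21       # one full 25-byte frame

port = FakePort(stale=b"\x00" * 4096)    # buffer full of old data at startup
port.flush_input()                       # clear it BEFORE entering the loop

# The loop has no fixed delay: the blocking read itself paces it,
# so it runs exactly at the instrument's rate with no missed bytes.
frames = [port.read_frame() for _ in range(3)]
print(len(port.buffer), len(frames))
```

The design point is that the wait lives inside the read (bounded by the timeout), not beside it, so the loop can never be busy sleeping while a frame arrives.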

Omar
Message 5 of 5