02-16-2018 06:20 AM - edited 02-16-2018 06:22 AM
Hi shaun,
I observe that this error happens when the data in my table gets bigger.
That is the clue I was waiting for!
Displaying a lot of data in a table takes time - more than you might expect. This can slow down your loop and so lead to buffer overruns.
Solution: use a producer-consumer scheme to decouple the DAQ loop from the display loop…
I have never used a producer-consumer loop, so can you give me an example?
LabVIEW comes with a huge library of example VIs (in the Example Finder) and predefined projects (File menu -> "New…").
One of the example projects is a basic producer-consumer scheme!
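If it helps to see the pattern outside of LabVIEW: here is a minimal Python sketch of the same idea (the sample data and timing are placeholders - in LabVIEW you would use a queue plus two parallel while loops):

import queue
import threading
import time

data_queue = queue.Queue()            # plays the role of the LabVIEW queue

def producer():
    # Fast loop: acquire data and enqueue it - nothing else.
    for i in range(100):
        sample = i                    # stand-in for a VISA/DAQ read
        data_queue.put(sample)
        time.sleep(0.001)             # stand-in for the acquisition rate
    data_queue.put(None)              # sentinel: tell the consumer to stop

def consumer():
    # Slow loop: dequeue and "display" at its own pace.
    while True:
        sample = data_queue.get()
        if sample is None:
            break
        print(sample)                 # stand-in for updating the table

threading.Thread(target=producer).start()
consumer()

The point is that the producer never waits on the display: it only reads the hardware and enqueues, so no characters are lost even when the display loop falls behind.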
I have to change my code, right?
You have to change your code anyway to get rid of the error!
I am receiving data, then converting it to a hexadecimal string and parsing it accordingly. From that hexadecimal string I take a subset of the bytes and convert them to decimal, so my output is a mix of binary and hexadecimal…
It sounds wrong to me that you need to convert data bytes to hexadecimal-formatted strings just to get at their decimal values…
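For comparison, a small Python sketch (the 4-byte payload is made up): you can slice the raw bytes and read them as numbers directly, without the detour through a hex string:

raw = b'\x01\x02\x03\x04'                    # assumed 4-byte payload

# Detour via a hex string (what your description sounds like):
hex_str = raw.hex()                          # '01020304'
value_via_hex = int(hex_str[2:6], 16)        # characters 2..5 -> 0x0203

# Direct route: slice the bytes and interpret them as an integer:
value_direct = int.from_bytes(raw[1:3], byteorder='big')

assert value_via_hex == value_direct == 0x0203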
02-16-2018 07:33 AM
@shaunmarsh123 wrote:
I have never used a producer-consumer loop, so can you give me an example?
02-16-2018 08:08 AM - edited 02-16-2018 08:08 AM
Now that I found some time to look at your code...
This is one of the times that I am going to recommend you do NOT use a State Machine. It just adds complication. You can use simple case structures nested inside each other and see all of the code.
Since you have binary/hex data (not ASCII formatted), you should turn the Termination Character off. If one of the data bytes just happens to be 0xA (the default termination character), the VISA Read will stop there and you will not get all of your data.
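(Roughly the same idea in pyvisa, the Python VISA binding, since I can't paste a block diagram as text - the resource name and byte count below are placeholders, not from your setup:)

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource('ASRL3::INSTR')   # placeholder serial resource
inst.read_termination = None              # termination character OFF
# Read a fixed number of bytes, so a 0x0A data byte cannot end the
# read early:
payload = inst.read_bytes(11)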
I also really question your parsing of the data. It seems like you could use Unflatten From String to simplify that code A LOT.
So here is the example I put together for you. No local variables needed!
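(The attached VI is the actual example. For the parsing idea only, here is a rough Python analogue of what Unflatten From String does, using the struct module - the field layout below is an assumption for illustration, not your real protocol:)

import struct

payload = b'\x05\x01\x02\x00\x00\x03\x04'    # 7 assumed data bytes

# One call splits the flat bytes into typed fields:
# '>' = big-endian, 'B' = uint8, 'H' = uint16, 'I' = uint32
flag, count, value = struct.unpack('>BHI', payload)
print(flag, count, value)                    # 5 258 772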

02-16-2018 08:13 AM
How fast is this code running? How long does it run and how big does the table get before you get the error?
My impression of your code is that it is running fast and building up the table quickly. Is there any limit on how large you let your table grow? An ever-growing array of data will cause problems because the PC will need to allocate ever larger blocks of memory to hold it, and it takes time to shuffle that data around.
The solution is to not create ever-growing arrays. (And instead of using Insert Into Array, you should be using Build Array any time you know you are adding data to the beginning or end of the array.)
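A quick language-neutral illustration (Python here, with an assumed 3000-entry limit) of an ever-growing array versus a fixed-size rolling buffer:

from collections import deque

growing = []                     # grows forever: reallocation gets costly
bounded = deque(maxlen=3000)     # fixed size: oldest entry drops automatically

for sample in range(10_000):
    growing.append(sample)
    bounded.append(sample)

print(len(growing), len(bounded))    # 10000 3000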
02-16-2018 09:37 PM
I am working on your suggestion and will let you know how it goes.
02-16-2018 09:41 PM
The table gets about 3000 rows filled before I get this error.
So what should I do to avoid an ever-growing array of data? How can I fix the amount of data held in my array, so that when new data comes in the oldest entry gets deleted? Is there anything I can do to achieve this?
02-17-2018 05:41 AM
Hi crossrulz,
Your producer-consumer idea works fine.
It took me some time to completely shift my code over to it, but now it works fine without any errors.
Thanks 🙂
02-17-2018 10:25 AM
@shaunmarsh123 wrote:
The table gets about 3000 rows filled before I get this error.
So what should I do to avoid an ever-growing array of data? How can I fix the amount of data held in my array, so that when new data comes in the oldest entry gets deleted? Is there anything I can do to achieve this?
Of course. Just check the size of the array and delete the first row from the array when it gets too big.
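In LabVIEW terms: Array Size, then Delete From Array on row 0. The same logic as a Python sketch (MAX_ROWS is an assumed limit):

MAX_ROWS = 3000                 # assumed limit
table = []

def append_row(row):
    table.append(row)           # add the new data at the end
    if len(table) > MAX_ROWS:
        del table[0]            # drop the oldest row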
03-28-2018 04:58 AM
Hello CrossRulz,
I want to scan barcodes using a scanner. The scanner waits for barcodes to come in one by one; how can I use the example you showed in my case?
Whether I use a simple serial data read or your example (I stop the loop when the byte count reaches 11), I sometimes get the same error, "(Hex 0xBFFF006C) An overrun error occurred during transfer. A character was not read from the hardware before the next character arrived". How can I avoid this?
Thanks
James
03-28-2018 05:42 AM