I have a logging application that receives variable-length strings of roughly 50-150 characters over Ethernet, at a variable rate, sometimes in bursts of up to 1000/sec.
I have implemented a FIFO buffer as a subVI with two main modes (in addition to (0) initialization, etc.):
(1) write to buffer
(2) read all and clear buffer.
Mode 1 is used in a tight loop that listens on a UDP port and stuffs everything, unexamined, into the buffer.
An independent loop asynchronously reads the accumulated data for post-processing, filtering, and display of selected events. The current buffer size of 1024 entries is sufficient unless I run it on my old '486/66.
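Since LabVIEW block diagrams can't be pasted as text, here is a minimal Python sketch of the two modes described above (write; read-all-and-clear) with a producer and an asynchronous consumer in mind. The class and method names are my own invention for illustration, and the lock stands in for the serialization a non-reentrant subVI gives you for free:

```python
from collections import deque
from threading import Lock

class StringFifo:
    """Two-mode FIFO: (1) write one entry, (2) read all and clear."""

    def __init__(self, capacity=1024):
        # deque with maxlen silently drops the oldest entry on overflow
        self._buf = deque(maxlen=capacity)
        self._lock = Lock()  # a non-reentrant subVI serializes callers implicitly

    def write(self, entry: str) -> None:
        """Mode 1: called from the tight UDP-listener loop."""
        with self._lock:
            self._buf.append(entry)

    def read_all_and_clear(self) -> list:
        """Mode 2: called from the independent post-processing loop."""
        with self._lock:
            out = list(self._buf)
            self._buf.clear()
            return out

fifo = StringFifo(capacity=1024)
fifo.write("event A")
fifo.write("event B")
print(fifo.read_all_and_clear())  # ['event A', 'event B']
print(fifo.read_all_and_clear())  # []
```

The read-all-and-clear mode means the consumer drains the buffer in one call rather than popping entries one at a time, which keeps the producer loop's writes cheap.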
So far, the buffer is implemented as an array of strings, optimized for maximum "inplaceness" as I would do for a numeric buffer; but given the variable string lengths and the way LabVIEW handles and stores strings, I wonder if this is the correct approach.
(I've been running it for weeks without problems and there is no growth in allocated memory or anything else fishy, so LabVIEW seems to manage it well.)
Would it be better to implement the buffer as a preallocated 2D array of I8, rolling each string into successive columns (after a string-to-byte-array conversion) and storing the length in the first element?
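To make the alternative concrete, here is a rough Python sketch of that layout: a preallocated matrix of fixed-width byte rows (rows here standing in for the columns of a LabVIEW 2D array), with element 0 of each row holding the string length. The names, the 256-byte row width, and the length-fits-in-one-byte assumption are all mine, not anything from the original design:

```python
class ByteMatrixFifo:
    """Preallocated fixed-width byte rows; byte 0 of each row stores the entry length."""

    def __init__(self, rows=1024, width=256):
        # all memory allocated up front; writes only overwrite in place
        self._rows = [bytearray(width) for _ in range(rows)]
        self._capacity = rows
        self._width = width
        self._count = 0  # number of valid entries

    def write(self, s: str) -> bool:
        """Store one string; returns False if full or the entry doesn't fit."""
        data = s.encode("latin-1")  # assumes single-byte characters
        if self._count >= self._capacity or len(data) > self._width - 1:
            return False
        row = self._rows[self._count]
        row[0] = len(data)              # length prefix in element 0
        row[1:1 + len(data)] = data     # payload in the remaining elements
        self._count += 1
        return True

    def read_all_and_clear(self) -> list:
        """Decode all valid entries, then reset the count (rows are reused, not freed)."""
        out = [bytes(self._rows[i][1:1 + self._rows[i][0]]).decode("latin-1")
               for i in range(self._count)]
        self._count = 0
        return out
```

The trade-off this sketch exposes: every entry costs the full row width regardless of string length (256 bytes for a ~50-150 character message), in exchange for no per-write allocation; with a single length byte, entries are capped at 255 bytes.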
Are there any other alternatives that I haven't considered?