Best way to transfer an array of data via a network stream

This is in regard to behaviour that looked like a memory leak when transferring an array via network streams.

 

In document 6HK6CHU6 it is recommended that, when transferring 2D arrays via network streams, you create the stream with a 1D array as the element type and then do a multiple-element write.

 

If sending a 1D array, is it better to similarly initialize with a scalar as the element and do a multiple-element write, or to initialize with a 1D array and do a single-element write?

 

For example, if I want to transfer an array of 108,000 U32 values (i.e. 432 KB), is it really more efficient to do 108,000 individual element writes (with the multiple-element write VI)? Isn't there an overhead cost associated with that many individual writes vs. one write of an array of the same size?
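
For comparison outside of LabVIEW (network streams are an NI API, so this is only a rough analogue): a minimal Python sketch over a local socket pair showing the two shapes of the transfer. The chunk size and helper names are illustrative assumptions, not anything from the KB; the fixed per-call cost here stands in for whatever per-write overhead network streams have.

    import socket
    import struct
    import threading

    VALUES = list(range(108_000))          # 108,000 U32 values = 432 KB
    PAYLOAD = struct.pack(f"<{len(VALUES)}I", *VALUES)
    CHUNK = 4096                           # bytes per write; arbitrary assumption

    def drain(sock, total):
        # Receiver side: just read until the full payload has arrived.
        got = 0
        while got < total:
            data = sock.recv(65536)
            if not data:
                break
            got += len(data)

    a, b = socket.socketpair()
    t = threading.Thread(target=drain, args=(b, len(PAYLOAD)))
    t.start()

    # Option A: one write of the whole array (single large buffer).
    # a.sendall(PAYLOAD)

    # Option B: many smaller writes; each call pays a fixed overhead,
    # but no single write needs more than CHUNK bytes staged at once.
    for off in range(0, len(PAYLOAD), CHUNK):
        a.sendall(PAYLOAD[off:off + CHUNK])

    t.join()
    a.close()
    b.close()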

Message 1 of 4

aetc,

 

The knowledge base document just suggests breaking up the information when sending a large 2D array.  I suppose if your 1D array is large enough you may want to break it down into smaller pieces as well, but I'm sure there is a point of diminishing returns.  The KB does not specify a size limit at which you should begin breaking your data into smaller pieces.  I suspect this is network dependent, but I don't know for sure.

 

http://digital.ni.com/public.nsf/allkb/805F26E9E9752B2686257C670047DC17
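
For what the KB's row-wise approach looks like in rough analogue form (plain Python sockets; send_rows is a hypothetical helper, not the NI network stream API):

    import socket
    import struct

    def send_rows(sock: socket.socket, rows_2d) -> None:
        # Mirrors a stream whose element type is a 1D array: each row
        # is packed and written on its own, so the sender never holds
        # a second, flattened copy of the entire 2D array in memory.
        for row in rows_2d:
            sock.sendall(struct.pack(f"<{len(row)}I", *row))

The point is the memory shape, not the byte count: the total data sent is the same either way, but the largest temporary buffer is one row.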

 

- John

Message 2 of 4

Thanks for the input. I submitted this to NI and have gotten both yes and no responses: one that said to initialize with a scalar and do a multiple-element write, but also one that said the case in question was specifically about a 2D array, so for transferring a 1D array one could initialize with a 1D array and do a single-element write. In other words, there does not seem to be a clear reason to go either way.

Message 3 of 4

I guess the real question is "does your target crash from a lack of memory?" 

 

If it does, then I would look into splitting up the data.  You might even do some simple tests with random data to figure out how big is too big.
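
Along those lines, a minimal timing sketch with random data (Python socket pair as a stand-in for a real network stream; the chunk sizes are arbitrary assumptions) to see where smaller writes stop paying off:

    import os
    import socket
    import threading
    import time

    PAYLOAD = os.urandom(432_000)          # 432 KB of random data

    def drain(sock, total):
        # Read until the whole payload has been received.
        got = 0
        while got < total:
            data = sock.recv(65536)
            if not data:
                break
            got += len(data)

    for chunk in (1024, 4096, 65536, len(PAYLOAD)):
        a, b = socket.socketpair()
        t = threading.Thread(target=drain, args=(b, len(PAYLOAD)))
        t.start()
        start = time.perf_counter()
        for off in range(0, len(PAYLOAD), chunk):
            a.sendall(PAYLOAD[off:off + chunk])
        t.join()                            # include receive completion
        elapsed = time.perf_counter() - start
        a.close()
        b.close()
        print(f"chunk {chunk:>7} B: {elapsed * 1000:.2f} ms")

On a real network the crossover point will differ, which is the "network dependent" caveat from earlier in the thread; the same loop structure works for trying sizes against an actual remote target.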

Message 4 of 4