01-18-2007 05:23 PM
kehander wrote:
...
My question is, how can I avoid using Array Subset? I need to do an operation on an array in my program, remove some points from the array, and then perform the operation again.
...
(1) Well, "array subset" only needs to allocate space for the new subset array; it does not touch the original array. How big are your "subsets"? How are you using "array subset" to delete certain elements from an array? Can you explain?
(2) What kind of "operations" are performed?
How big are your arrays? Can you give a typical example? How many elements are removed at each iteration? Are they randomly distributed over the entire array, or do they form a contiguous range?
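Since LabVIEW block diagrams can't be pasted into a forum post, here is a rough Python analogy (my own sketch, not LabVIEW code) of the distinction above: a subset is a single allocation of just the kept region, while a delete copies everything except the removed element; neither touches the original array.

```python
def array_subset(arr, start, length):
    # Analogy of LabVIEW's Array Subset: one allocation of `length` elements;
    # the original array is left untouched.
    return arr[start:start + length]

def delete_from_array(arr, index):
    # Analogy of Delete From Array: copies every element except `index`
    # into a new array.
    return arr[:index] + arr[index + 1:]

data = list(range(10))
print(array_subset(data, 2, 5))    # -> [2, 3, 4, 5, 6]
print(delete_from_array(data, 3))  # -> [0, 1, 2, 4, 5, 6, 7, 8, 9]
print(data)                        # original is unchanged
```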
01-18-2007 11:27 PM
For array sizes of 50-100, don't worry about memory issues! It's all peanuts. 😉
For huge arrays it might be worth coding your own histogram function and doing the rejection test in a loop, where you would skip bad values and increment the correct bin for good values.
Which histogram function do you currently use? There are at least three: Histogram, General Histogram, and Create Histogram (Express).
If you used General Histogram, you could specify the upper and lower limits, so outliers beyond those limits would be dropped automatically. There is even an output with information about dropped values.
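A minimal sketch of the "rejection test in a loop" idea, written in Python since LabVIEW diagrams can't be posted as text. The function name and the (counts, dropped) return shape are my own choices, not any LabVIEW API; the point is the single pass that skips out-of-range values and counts how many were dropped.

```python
def histogram_with_limits(values, lower, upper, n_bins):
    """One pass over the data: bin in-range values, count rejected ones."""
    counts = [0] * n_bins
    dropped = 0
    width = (upper - lower) / n_bins
    for v in values:
        if v < lower or v >= upper:
            dropped += 1                        # rejection test: skip outlier
            continue
        counts[int((v - lower) / width)] += 1   # increment the correct bin
    return counts, dropped

print(histogram_with_limits([0.5, 1.5, 2.5, 9.9, -1.0], 0.0, 3.0, 3))
# -> ([1, 1, 1], 2)
```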
01-22-2007 01:29 PM
@altenbach wrote: For array sizes of 50-100, don't worry about memory issues! It's all peanuts. 😉
Which histogram function do you currently use? There are at least three: Histogram, General Histogram, and Create Histogram (Express).
01-22-2007 02:49 PM
I'm curious,
If there is a definite need to do an operation on an array that requires only a portion of that array, then why not use the Array Subset primitive? It seems like the proper tool for the job. Delete From Array was the function I had chosen for an earlier program, until I ran some timing statistics on just those two operations (speed only), and I have since replaced ALL Delete From Array functions with Array Subset functions where applicable. Please let me know if my judgement is in error.
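A hedged Python timing analogy (not a LabVIEW benchmark, and not the original poster's test) of why that timing result is plausible: taking one contiguous subset is a single copy of the kept region, while deleting elements one at a time re-copies the remaining tail on every deletion.

```python
import timeit

data = list(range(10_000))

def keep_subset(arr):
    # Analogy of Array Subset: one slice, one allocation of the kept region.
    return arr[1000:]

def delete_one_by_one(arr):
    # Analogy of repeated Delete From Array: each deletion shifts the
    # whole remaining tail down by one.
    out = list(arr)
    for _ in range(1000):
        del out[0]
    return out

assert keep_subset(data) == delete_one_by_one(data)  # same final array
print("subset:", timeit.timeit(lambda: keep_subset(data), number=100))
print("delete:", timeit.timeit(lambda: delete_one_by_one(data), number=100))
```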
There is a lot of discussion everywhere about data copies in memory (which I am currently studying: when, where, and how much). I find my mind wandering to the premature conclusion that although it is necessary to minimize data copies, why not accept the fact that it is going to happen and commit to programming around it, perhaps by chunking all data processing in much the same manner it was written by the acquisition device? Why do I feel the need to completely read the file, manipulate the file, create a massive burden on system resources, and make the user interface and computer unresponsive, all to put it in a graph or indicator which it doesn't fit into nicely?
I know I am just a LabVIEW adolescent, but I want to learn. The LabVIEW paper on managing large data sets is mandatory reading if one is concerned with memory issues. If there are any more links that someone thinks I should read, please post them; I am but a baby in this world. Please let me know if my judgement is in error.
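The chunking idea above can be sketched like this in Python (an illustration of the concept only, not LabVIEW code; the float64 file layout, chunk size, and running-max reduction are my own assumptions). Instead of loading the whole file, read one acquisition-sized chunk at a time, reduce it, and discard it, so memory use stays bounded no matter how big the file is.

```python
import struct

def process_in_chunks(path, samples_per_chunk=4096):
    """Stream a binary file of little-endian float64 samples, tracking the max."""
    max_seen = float("-inf")
    with open(path, "rb") as f:
        while True:
            raw = f.read(samples_per_chunk * 8)   # 8 bytes per float64
            if not raw:
                break
            chunk = struct.unpack(f"<{len(raw) // 8}d", raw)
            max_seen = max(max_seen, max(chunk))  # reduce, then discard chunk
    return max_seen
```

Any per-chunk reduction (histogram bins, min/max, mean accumulators) fits this same loop; only the reduction step changes.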
Chris Co