06-19-2012 12:38 PM
I have a 2D array of numbers. I need to compare each element of the array and determine whether it is greater than 0.8; if the number is greater than 0.8 I must keep it, otherwise (if the number is less than 0.8) I must replace the element with a zero ("0").
I would like your suggestions for this. Help! I thought of using Search 1D Array, but my array is 2D... Thanks.
06-19-2012 12:54 PM
The brute force method is to simply use nested FOR loops: you place one FOR loop inside another. Use auto-indexing and a Replace Array Element inside a Case structure in the inner loop. If the number is less than 0.8, replace it with 0; otherwise do nothing.
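LabVIEW diagrams can't be pasted as text, so here is a rough Python analogue of the nested-loop approach described above. The function name and the example values are mine; the 0.8 threshold is from the original post.

```python
def threshold_2d(arr, threshold=0.8):
    # Outer loop over rows, inner loop over columns (the "nested FOR loops")
    for i in range(len(arr)):
        for j in range(len(arr[i])):
            # Replace the element with 0 if it is below the threshold
            if arr[i][j] < threshold:
                arr[i][j] = 0
    return arr

data = [[1.0, 0.5], [0.9, 0.2]]
threshold_2d(data)  # data is now [[1.0, 0], [0.9, 0]]
```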
06-19-2012 01:44 PM
You can also use the array approach Mads suggested (the second image) in which you feed the entire array into a comparison function, convert the boolean output to 0 or 1, and multiply the original array by that array of 0s and 1s.
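As a sketch of the mask-and-multiply idea, here is a NumPy analogue: the comparison produces a Boolean array, which is converted to 0s and 1s and multiplied against the original. The function name is mine; keeping elements at or above 0.8 follows the thread's convention of zeroing only values below the threshold.

```python
import numpy as np

def threshold_mask(arr, threshold=0.8):
    a = np.asarray(arr, dtype=float)
    # Comparison yields a Boolean array; cast it to 0.0/1.0
    mask = (a >= threshold).astype(float)
    # Element-wise multiply zeros out everything below the threshold
    return a * mask

threshold_mask([[1.0, 0.5], [0.9, 0.2]])  # -> [[1.0, 0.0], [0.9, 0.0]]
```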
06-19-2012 11:52 PM
My advice would be similar to the brute force method mentioned with nested FOR loops, though I would tend to use the In Place Element structure, using the indices of the FOR loops to specify the index that needs checking. This also limits memory allocation if the 2D array is particularly large.
06-20-2012 12:06 AM
@tyk007 wrote:
My advice would be similar to the brute force method mentioned with nested FOR loops, though I would tend to use the In Place Element structure, using the indices of the FOR loops to specify the index that needs checking. This also limits memory allocation if the 2D array is particularly large.
While the In Place Element is a good suggestion, there is no benefit or advantage to using the loop index instead of auto-indexing. Is there a reason you believe this is a better approach?
06-20-2012 10:40 AM
Replace Array Subset also operates in place, so there's really no need to use up the extra diagram space with the In Place Element structure.
06-20-2012 03:12 PM
Some good points... I have looked at something similar before and discussed it with our local NI support. They led me to believe that there would be a small performance boost from using the In Place structure in this case, largely depending on the size of the array. I figure the array would need to be really large to see any real performance benefit, but the OP hasn't specified any values.
If I get time I might try out a few benchmarks and ascertain this for myself.
06-20-2012 03:52 PM
I posted without remembering, but now I recall: the In Place Element structure minimises additional memory allocation hits. If one has a large 2D array (say 5000x5000), this can make a huge difference to the memory footprint. A basic performance benchmark on my PC shows an immediate additional 256 MB heap allocation for Replace Subset, but a negligible (read: bytes) difference when using the In Place Element structure. The averaged execution times of the two methods are about the same, as everyone has suggested. If I increase the array size to 10000x10000, LabVIEW runs out of memory with the additional allocation for Replace Subset but happily runs with In Place.
I remember this being the reason I resolved to use the In Place Element structure for any large or unknown-size in-place modifications, despite its unwieldy appearance on a block diagram.
All this is beyond the OP's original intent, no doubt, but it might help someone's future forum searches.
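For readers outside LabVIEW, the in-place-versus-copy distinction has a direct NumPy analogue: masked assignment modifies the existing buffer, while building a new array (e.g. via `np.where`) allocates a second full-size copy. The function name and array sizes here are mine, chosen only to illustrate the idea.

```python
import numpy as np

def threshold_in_place(a, threshold=0.8):
    # np.where(a < threshold, 0, a) would allocate a whole new array;
    # Boolean-mask assignment overwrites the existing buffer instead,
    # which matters when the array is very large (e.g. 5000x5000 doubles).
    a[a < threshold] = 0
    return a

a = np.array([[1.0, 0.5], [0.9, 0.2]])
b = threshold_in_place(a)  # b is the same array object as a, no copy made
```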
06-20-2012 03:56 PM
Can you share the code that produces these results? Do you know in which version you tested it? Suggesting that an in-place element is more memory-efficient than replace array subset for replacing elements inside a for loop goes against years of experience and recommendations on this board. I wonder if there was something unusual about your test that was more efficient in that particular situation but does not apply more generally.
06-20-2012 04:16 PM
Yeah, I wish I could, but I can't upload anything to the forums due to some proxy restrictions, which is why I didn't originally. I don't think there was anything special about the test itself, but it does seem like NI is pushing the technique in general in the link below (unless I've read it completely wrong; please let me know if I have). It is still a case of choosing the best solution, as always. I don't think there is any gain for normal, low-memory data sets.
http://www.ni.com/white-paper/6211/en
I only did the Core 1/2/3 training last year, but our trainer was pushing the In Place Element structure hard then.
Sorry for effectively hijacking the thread; maybe this can be moved elsewhere, as the OP got their answer.