Well, I'm going to offer an educated guess here.
Numeric data types in LabVIEW occupy a fixed, known amount of memory. Strings, on the other hand, can vary in length. When performing operations on large numbers of strings (especially when creating arrays of strings), many more memory operations are required than for numeric data, because the compiler cannot know in advance how much memory needs to be allocated for each string. This results in a LOT of copying and moving in memory.
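You can't see LabVIEW's memory manager from the diagram, but the same effect is easy to illustrate in C (just a sketch of the idea, not LabVIEW's actual internals, and the function names are mine):

    #include <stdlib.h>
    #include <string.h>

    /* Fixed-size numerics: one allocation, size known up front. */
    double *make_numeric_array(size_t n)
    {
        return malloc(n * sizeof(double));  /* single, predictable allocation */
    }

    /* Variable-length strings: one heap block per element, and the
       total size cannot be computed before walking the data. */
    char **make_string_array(const char **src, size_t n)
    {
        char **arr = malloc(n * sizeof(char *));
        for (size_t i = 0; i < n; i++) {
            size_t len = strlen(src[i]) + 1;
            arr[i] = malloc(len);           /* separate allocation per string */
            memcpy(arr[i], src[i], len);
        }
        return arr;
    }

Growing or reshuffling the string array means repeating those per-element allocations and copies, which is where the time goes.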
It might just be that LabVIEW allocates this memory the first time (thus the 50 seconds) but doesn't release it immediately. Since that memory is still reserved for the following execution, the memory operations disappear and the execution time improves massively.
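If that's right, the behaviour would resemble a routine that keeps its working buffer between calls (again a hypothetical sketch, not how LabVIEW actually manages memory):

    #include <stdlib.h>

    /* First call pays for the allocation; later calls reuse the buffer,
       so the per-call cost drops dramatically. */
    double *get_work_buffer(size_t n)
    {
        static double *buf = NULL;
        static size_t  cap = 0;
        if (n > cap) {                         /* grow only when needed */
            free(buf);
            buf = malloc(n * sizeof(double));  /* expensive, done rarely */
            cap = buf ? n : 0;
        }
        return buf;
    }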
I might be wrong, but I observed something similar in the second-last coding challenge (Meta-word), where using a defined-length data type as an approximate representation of a string led to a huge increase in performance. I think the inability to pre-determine the size of a string array is crippling because of the extra memory operations required during processing.
Slightly off-topic, but I wish LV could make use of a limited-length string similar to what's available in other languages. I think this would nicely combine the flexibility of strings with the performance of fixed-length data types. A rough sketch of what I mean follows.
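In C terms, something like this (a hypothetical fixed-capacity string type; the 64-byte limit is arbitrary):

    #include <string.h>

    #define FIXSTR_CAP 64                      /* arbitrary compile-time limit */

    /* A limited-length string: fixed memory footprint, so an array of
       these can be allocated in one block like any numeric array. */
    typedef struct {
        char   data[FIXSTR_CAP];
        size_t len;
    } fixstr;

    static void fixstr_set(fixstr *s, const char *src)
    {
        size_t n = strlen(src);
        if (n > FIXSTR_CAP - 1)
            n = FIXSTR_CAP - 1;                /* truncate past the limit */
        memcpy(s->data, src, n);
        s->data[n] = '\0';
        s->len = n;
    }

You trade the ability to hold arbitrarily long text for a size the compiler knows in advance, which is exactly what the numeric types get for free.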
Hope this helps
Shane.
Using LV 6.1 and 8.2.1 on W2k (SP4) and WXP (SP2)