That is a very good question.
I faced the same situation some time ago (also 30 MB files) and had problems with how long it took LabVIEW to read the file. My task was to analyze the text file and look for specific text patterns. I decided to read small blocks and analyze them one after the other in a While loop, using a shift register for the position index.
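In text form, the block-reading loop looks roughly like this. This is a Python sketch rather than a LabVIEW diagram, just to show the idea; the pattern, block size, and function name are placeholders, and the carry buffer plays the role of the shift register so a match spanning a block boundary is not missed:

```python
PATTERN = b"ERROR"          # placeholder: the text pattern to look for
BLOCK_SIZE = 100 * 1024     # ~100 KB, the optimum I found on my machine

def count_matches(path, pattern=PATTERN, block_size=BLOCK_SIZE):
    count = 0
    carry = b""             # tail of the previous block, so patterns
                            # that straddle a block boundary still match
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:   # end of file reached
                break
            data = carry + block
            count += data.count(pattern)
            # keep the last len(pattern)-1 bytes for the next round
            carry = data[-(len(pattern) - 1):] if len(pattern) > 1 else b""
    return count
```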
However, since I did not know the optimal block size, I ran an experiment: I analyzed the same file using block sizes ranging from 0.5 KB to 20,000 KB, and I found something very interesting!
On my system there is a range of optimal block sizes: the time it took to analyze the file had a clear minimum around 50-500 KB. For smaller blocks, the time increases as the block size decreases; for blocks larger than the optimum, the time increases with the block size.
For my task, I went from about 200 seconds (blocks too small or too large) down to 3 seconds just by selecting a block size that was optimal for my computer (around 100 KB).
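If you want to repeat the experiment, a timing sweep like this one will show whether your machine has a similar sweet spot. Again a Python sketch: it reuses the count_matches function from above, and the list of sizes just mirrors the range I tried:

```python
import time

# Sweep block sizes from 0.5 KB up to 20,000 KB, timing each full pass.
SIZES_KB = [0.5, 1, 10, 50, 100, 500, 1000, 5000, 20000]

def benchmark(path):
    for kb in SIZES_KB:
        start = time.perf_counter()
        count_matches(path, block_size=int(kb * 1024))
        elapsed = time.perf_counter() - start
        print(f"{kb:>8} KB: {elapsed:.2f} s")
```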
If time is important for your application, play with this and see if your computer has an optimal block size too. It can make a big difference!! Hope this helps. /Mikael