03-10-2005 07:51 AM - last edited on 11-25-2025 12:20 PM by Content Cleaner
Dealing with that amount of data will require you to pay attention to data copies for best throughput. Check out Managing Large Data Sets in LabVIEW for some tips. You will definitely want to optimize your read and write chunk sizes (and they will probably be different). For Windows (FAT, FAT32, and NTFS file systems), the optimum write-to-disk chunk size is about 65,000 bytes, just under the 64 KB (65,536-byte) boundary. Your mileage may vary on RT systems.
If you need different chunk sizes for optimum performance, my approach would be to set up a ring buffer using a LabVIEW 2-style global (a functional global variable). Feed it with the data input at one chunk size, and stream to disk from it at a different chunk size.
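Since a LabVIEW block diagram can't be shown in a text post, here is a rough sketch of the same idea in Python. The `ChunkBuffer` class, the 4 kB acquisition chunk, and the simulated loop are all made-up names and numbers for illustration; only the ~65,000-byte disk chunk size comes from the discussion above. The point is just that the buffer decouples the producer's chunk size from the consumer's.

```python
class ChunkBuffer:
    """Byte FIFO standing in for the LabVIEW 2-style global ring buffer:
    data goes in at one chunk size and comes out at another."""

    def __init__(self):
        self._buf = bytearray()

    def write(self, chunk: bytes) -> None:
        # Producer side: append an acquisition chunk of any size.
        self._buf.extend(chunk)

    def read(self, size: int) -> bytes:
        # Consumer side: remove and return up to `size` bytes.
        out = bytes(self._buf[:size])
        del self._buf[:size]
        return out

    def __len__(self) -> int:
        return len(self._buf)


# Hypothetical sizes: acquire in 4 kB chunks, write to disk in
# ~65,000-byte chunks (just under the 64 KB boundary, per above).
ACQ_CHUNK = 4096
DISK_CHUNK = 65000

buf = ChunkBuffer()
written = []                             # stands in for File Write calls
for _ in range(64):                      # simulate 64 acquisitions
    buf.write(b"\x00" * ACQ_CHUNK)
    while len(buf) >= DISK_CHUNK:        # flush only full disk-size chunks
        written.append(buf.read(DISK_CHUNK))

total = sum(len(c) for c in written) + len(buf)
```

In the real application the producer and consumer would run in separate loops (in LabVIEW, two loops sharing the functional global), with the leftover bytes flushed once at shutdown.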
Good luck.