03-10-2009 04:22 AM
Dear NI,
I am acquiring 32 channels of strain data at about 1000 samples/sec. I want to store all 32 channels of dynamic data (all samples) for a 30-minute test. Which file format can be used for this data logging?
Afterwards, offline, I need to retrieve the data channel by channel and display it in a graph.
03-10-2009 04:28 AM
I would recommend the TDMS file format. Have a look at the following article:
03-10-2009 04:30 AM
Balaji,
since the throughput is not too high, you should ask yourself: which application do I want to use to represent/analyze the file later on?
Possible formats (suggested) are:
1. TDMS
2. Binary
3. Spreadsheet
Hope this helps,
Norbert
03-10-2009 04:57 AM
Dear NI,
Suppose I am using a DSA card; we know that it can run at 100 kSamples/s. In that case, what do you suggest for a dynamic logger?
03-10-2009 05:06 AM
Balaji,
the bottleneck in such cases, provided proper programming, is most often access to the hard drive. Normal hard drives support something up to 60..80 MB/s, depending on the amount of data already on the disk.
However, the amount of data written to disk per timeframe (per second) is given by the system, so you probably have to reduce the data before writing it to file.
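The required throughput can be checked with simple arithmetic. A minimal Python sketch, assuming double-precision (8-byte) samples and the 100 kS/s DSA case from the question (the exact byte width depends on your data type):

```python
# Check the required disk throughput against a typical hard-drive budget.
channels = 32
sample_rate = 100_000        # S/s per channel, the DSA case above
bytes_per_sample = 8         # assuming double-precision values

required = channels * sample_rate * bytes_per_sample  # bytes per second
print(required / 2**20)      # ~24.4 MB/s, within a 60..80 MB/s disk
```

So even the DSA case stays under typical disk throughput, as long as the writes are sequential and the format is compact.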
Spreadsheets (ASCII) generally waste a lot of space, because each number is represented by many characters.
Binary is the optimum: each number has a fixed data width and there are no delimiters.
TDMS adds meta information to the data. The data itself is stored in binary, so it is very efficient. If you do not add too much information, or add it only once at the beginning, TDMS is comparable to binary files in terms of performance. And since you do add that information, the data can be presented in a better way later on.
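To make the ASCII-vs-binary point concrete, here is a small Python sketch (illustrative values, not the poster's data) comparing the storage footprint of the same samples in both representations:

```python
import struct

# 1000 double-precision samples, stored two ways.
samples = [i * 0.001234 for i in range(1000)]

# ASCII: each number becomes many characters, plus a delimiter.
ascii_bytes = "\n".join(f"{s:.6f}" for s in samples).encode()

# Binary: each double is exactly 8 bytes, no delimiters.
binary_bytes = struct.pack(f"<{len(samples)}d", *samples)

print(len(ascii_bytes), len(binary_bytes))
# Binary is a fixed 8 bytes per sample; the ASCII form here needs
# roughly 9 bytes per sample, and grows further with more digits.
```

The gap widens with higher-precision formatting, which is why ASCII logging becomes painful at DAQ rates.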
Conclusion:
Use TDMS or binary files if you have to optimize the process of writing to files.
hope this helps,
Norbert
03-10-2009 05:41 AM
I tried to write 1000 samples/sec of 32-channel data using TDMS File Write. I ran this VI for about 15 mins and the file is nearly 2 GB.
But I am not able to read this file after writing; it shows the error "Not enough memory to complete this operation".
03-10-2009 05:53 AM
Could you post your current approach?
Norbert
03-10-2009 06:04 AM
03-10-2009 07:00 AM - edited 03-10-2009 07:03 AM
Balaji,
your loop timing is 100 ms, so you are writing 10 times the amount of data that you stated in your previous post. I'd say this is the reason for such a huge file.
hope this helps,
Norbert
[Edit]: The reason you are not able to read the file is that you are trying to load the whole file at once. That requires an array of about 2 GB, which is not possible if you did not activate large memory awareness. Even if you did, it is very unlikely to work, because arrays need contiguous memory, and having 2 GB of contiguous RAM available is very unlikely.
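The usual fix is to read the file in chunks instead of all at once. A generic Python sketch of the idea, using a plain binary file of doubles rather than TDMS (in LabVIEW the TDMS Read function similarly takes a count input so you can read a piece at a time):

```python
import os
import struct
import tempfile

CHANNELS = 32
CHUNK_SAMPLES = 1000                        # samples per channel per read

# Build a small demo file of interleaved doubles (3 chunks' worth).
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    for i in range(3 * CHUNK_SAMPLES * CHANNELS):
        f.write(struct.pack("<d", i * 0.001))

chunk_bytes = CHANNELS * CHUNK_SAMPLES * 8  # one chunk of doubles
total = 0
with open(path, "rb") as f:
    while True:
        raw = f.read(chunk_bytes)           # only one chunk held in memory
        if not raw:
            break
        values = struct.unpack(f"<{len(raw) // 8}d", raw)
        total += len(values)                # process/plot chunk, then discard
os.remove(path)
print(total)  # every sample visited without loading the whole file
```

For graphing, you would typically decimate each chunk before appending it to the display, so the plot never holds millions of points either.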