05-21-2014 03:38 PM
Hello all,
I am currently working to build a TDMS database that gains data from the network. I am trying to generate data with 2000 columns and 1 row, and my program keeps updating these data every second, writes all of them into a TDMS file, and then defragments the file. This idea works fine for small files; however, once the file exceeds 2 GB, an error appears and my code is forced to stop running. Are there any suggestions for dealing with this problem? I am really curious whether there is a way to defragment the file and control its size while writing data to it.
Thanks!
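One way to keep any single file from ever reaching 2 GB is to roll over to a new file once the current one passes a size cap. A minimal sketch in Python (not LabVIEW; the class name, file naming scheme, and 512 MB threshold are all made up for illustration):

```python
import os

class RollingWriter:
    """Append binary records to numbered files, starting a new file once
    the current one grows past max_bytes. Keeping every file under a
    modest cap means no single file ever needs a multi-gigabyte
    defragmentation pass."""

    def __init__(self, base_path, max_bytes=512 * 1024 * 1024):
        self.base_path = base_path
        self.max_bytes = max_bytes
        self.index = 0
        self._open_next()

    def _open_next(self):
        self.f = open(f"{self.base_path}_{self.index:04d}.bin", "ab")

    def write_record(self, data):
        # Start a new numbered file once the current one has passed the cap.
        if self.f.tell() > self.max_bytes:
            self.f.close()
            self.index += 1
            self._open_next()
        self.f.write(data)

    def close(self):
        self.f.close()
```

Each small file can then be defragmented (or post-processed) on its own, and the set of files together holds the full history.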
05-21-2014 08:33 PM - edited 05-21-2014 08:34 PM
There is no file-size requirement for TDMS Defragment. Judging from the error code, it looks like the TDMS file you produced has some problem. Can you check your VI and make sure the produced TDMS file is correctly closed before defragmentation? Alternatively, you can call TDMS Flush before closing to ensure the data is flushed to disk.
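The purpose of flushing before close is to get buffered data committed to disk so the file on disk is consistent before any maintenance pass. In a text-based language the same pattern might be sketched like this (Python here, purely to illustrate the idea; it is not how the TDMS Flush VI is implemented):

```python
import os

def flush_to_disk(f):
    """Push buffered writes out of the program and ask the OS to commit
    them to disk, so the file is consistent even if the program is
    killed before close()."""
    f.flush()             # empty Python's user-space write buffer
    os.fsync(f.fileno())  # ask the OS to write its cache to the disk
```

Doing this on every write is slow (as noted later in this thread); a common compromise is to flush periodically rather than per record.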
05-21-2014 09:22 PM
Hello deppSu,
Thanks for your reply. The reason I thought the error had something to do with size is that I have tried the same method on various file sizes, and the defragmentation time increases with size. Eventually it becomes impossible to finish defragging a file that is around 2 GB.
Also, the VI I use for defragmentation simply takes the file path and connects to the TDMS file, which should be closed correctly.
As for TDMS Flush, I have tried it, and it turns out to be very inefficient when writing data to the TDMS file.
Again, thank you very much for your advice. I am trying hard but have not yet figured out how to get both a small file size and good efficiency.
Sunming
05-22-2014 10:18 PM
Hi, one thing that came to my mind is that defragmenting is quite memory-consuming. Once your file reaches a large size like 2 GB, the actual memory consumption might exceed what your computer has. That might be the cause.
05-23-2014 08:12 AM
Yep, that could well be the reason. Is there any way to deal with that problem? Or could I keep the file size small while writing the TDMS file instead of using such a memory-consuming function?
Thanks,
05-25-2014 08:46 PM
To keep the file size small, you could probably try these links:
05-26-2014 03:19 AM
I have seen this one before,
I can't remember the context, but it was a big file and I don't think it was closed properly (power out during write).
As for simplifying the file, you could read the data out of the file one column at a time and store it in another file.
That has a similar effect to defragging and may help as a pre-frag.
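The copy-one-column-at-a-time idea can be sketched abstractly (Python, with an invented in-memory layout standing in for TDMS segments; this is not the real TDMS binary format). Reading a single channel from every fragmented segment and writing it as one contiguous run keeps peak memory at one channel's worth of data instead of the whole file:

```python
def copy_contiguously(fragmented_segments, channel_names):
    """fragmented_segments: list of dicts, each mapping a channel name to
    the slice of values that one interleaved segment holds for it.
    Returns each channel's data gathered into a single contiguous list,
    processed one channel at a time to keep memory use small."""
    contiguous = {}
    for name in channel_names:              # ONE channel in memory at a time
        values = []
        for segment in fragmented_segments:
            values.extend(segment.get(name, []))
        contiguous[name] = values           # one contiguous run per channel
    return contiguous
```

In the real workflow the output would be written straight into a fresh file channel by channel, which is what gives the defrag-like layout without loading everything at once.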
05-26-2014 08:08 AM
Hi Timmar,
Thank you for your good idea!
Sunming
05-26-2014 08:09 AM
Hi deppSu,
Those two links are really great. Thank you so much!
Sunming