04-25-2019 09:25 AM
I acquire data at a high sampling rate (1.5 GS/s) on 2 channels and write it to a TDMS file. Each TDMS file can reach up to 27.4 GB. I am not sure if this is the right forum to ask, but what computation software is best for analyzing such huge data files? I tried Python, and it takes forever to load the data into a pandas data frame and do some number crunching.
Any advice?
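One approach that avoids loading the whole 27 GB file at once is to stream the channel data in fixed-size chunks and keep only running statistics in memory (the npTDMS library can stream TDMS files the same way). The sketch below uses a raw int16 binary file and only the Python standard library; the file name and chunk size are illustrative assumptions, not anything from the original post.

```python
import array
import os
import tempfile

def chunked_stats(path, chunk_samples=1 << 20):
    """Stream a raw int16 file in fixed-size chunks; return (count, mean, min, max).

    Only one chunk is ever held in memory, so the same pattern scales to
    multi-gigabyte captures that would not fit in a pandas DataFrame.
    """
    total = 0
    n = 0
    lo, hi = None, None
    with open(path, "rb") as f:
        while True:
            buf = f.read(chunk_samples * 2)  # 2 bytes per int16 sample
            if not buf:
                break
            samples = array.array("h")       # signed 16-bit integers
            samples.frombytes(buf)
            total += sum(samples)
            n += len(samples)
            cmin, cmax = min(samples), max(samples)
            lo = cmin if lo is None else min(lo, cmin)
            hi = cmax if hi is None else max(hi, cmax)
    return n, total / n, lo, hi

# Small demonstration with a synthetic file (201 samples, symmetric around 0)
with tempfile.NamedTemporaryFile(delete=False, suffix=".bin") as tmp:
    data = array.array("h", range(-100, 101))
    tmp.write(data.tobytes())
    path = tmp.name

n, mean, lo, hi = chunked_stats(path, chunk_samples=64)
print(n, mean, lo, hi)   # 201 0.0 -100 100
os.remove(path)
```

For heavier number crunching, the same chunked loop works with NumPy arrays in place of `array.array`, which is usually much faster per chunk.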
04-25-2019 09:39 AM
04-29-2019 09:20 AM
Thank you!
I do not defragment the file during the code execution. Is it wise to include that in my VI, or should I do it manually after the full file has been saved to the hard drive?
There is a TDMS_index file associated with every *.tdms file, but the index file is very small compared to the TDMS file. Does this mean there is no need for further defragmentation?
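A heavily fragmented TDMS file consists of many small segments, each with its own header, so its .tdms_index file grows accordingly; a tiny index relative to the data file usually indicates few, large segments. As a rough, unofficial heuristic (this is an assumption on my part, not an NI-documented metric), one can compare the two file sizes:

```python
import os
import tempfile

def index_ratio(tdms_path):
    """Return size(.tdms_index) / size(.tdms) as a rough fragmentation hint.

    A ratio near zero suggests few, large segments (little fragmentation);
    this only compares file sizes and is a heuristic, not an NI metric.
    """
    return os.path.getsize(tdms_path + "_index") / os.path.getsize(tdms_path)

# Demonstration with two dummy files standing in for a real capture
tmpdir = tempfile.mkdtemp()
data_path = os.path.join(tmpdir, "capture.tdms")
with open(data_path, "wb") as f:
    f.write(b"\x00" * 10_000)          # pretend 10 kB of channel data
with open(data_path + "_index", "wb") as f:
    f.write(b"\x00" * 100)             # small index -> few segments

ratio = index_ratio(data_path)
print(ratio)  # 0.01
```

If the ratio stays very small, as described above, defragmenting after acquisition is unlikely to buy much read performance.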
04-29-2019 11:15 AM - edited 04-29-2019 11:17 AM