

convert large TDMS file to Excel - over the limit of 2^20 rows

Hello everyone, 

In short: how can I convert a large TDMS file to Excel without losing data due to Excel's limit of 2^20 = 1,048,576 rows per worksheet?

 

In detail: I record TDMS data in append mode, convert it to Excel, then import it into Matlab and process all the raw data there, because I have not found a direct TDMS-to-Matlab path.

 

I record different kinds of data that end up in 8 sheets in the TDMS file, which is generated from data acquisition on a DAQ. Each condition produces 20,000 points that are appended to 1 sheet of the TDMS file.

 

Here, I record 20,000 points for 70 different conditions, which leads to 20,000 * 70 = 1,400,000 rows in the TDMS file. The file size is 105 MB.

I use a LabVIEW VI to convert TDMS to Excel, and usually it works well... except in the present case, because Excel is limited to 2^20 = 1,048,576 rows per worksheet (since Excel 2007), which here corresponds to about 73 MB out of the 105 MB.

I have no difficulty opening the file on a MacBook Pro (16 GB LPDDR3, 3.5 GHz i7); it takes less than 5 minutes.

 

a) One solution is to reduce the number of points per condition from 20,000 to 2^20 / (number of conditions = 70) ≈ 14,980. But I don't want to lose data in my experiment.

 

b) Another solution is to split the TDMS into several sub-files. But then I would have to create a Matlab routine that works in every case (small file, large file), which I am not so happy about.
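In case it helps to make the idea concrete, here is a minimal sketch of that splitting in Python (an assumption on my side that Python with NumPy is available; the array is only a random placeholder for one TDMS group):

```python
import numpy as np

EXCEL_ROW_LIMIT = 2**20  # 1,048,576 rows per worksheet since Excel 2007

# Placeholder for the data of one TDMS group: 1,400,000 rows x 8 channels.
data = np.random.rand(1_400_000, 8)

# Write several files, each small enough to open in Excel
# (one row is reserved for the header line).
rows_per_file = EXCEL_ROW_LIMIT - 1
for i, start in enumerate(range(0, data.shape[0], rows_per_file)):
    chunk = data[start:start + rows_per_file]
    np.savetxt(f"part_{i:02d}.csv", chunk, delimiter=",",
               header=",".join(f"ch{c + 1}" for c in range(chunk.shape[1])),
               comments="")
```

The Matlab side would then have to loop over however many part files were produced, which is exactly the bookkeeping I would prefer to avoid.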

 

c) Another way would be the direct import of TDMS into Matlab, but I could not find any way to do it. Any idea?

 

d) An alternative to Excel that allows more than 2^20 rows. I found something like this: http://www.delimitware.com, but I don't know how to call it from LabVIEW to convert the TDMS. A second problem with this software is that it is not free, and the Pro version is licensed for 1 year only. Any alternative?
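If a plain CSV file counts as such an alternative, here is a rough sketch of what I have in mind, assuming for illustration that Python with the third-party npTDMS and pandas packages is available (the file name is a placeholder):

```python
from nptdms import TdmsFile  # third-party package: pip install npTDMS
import pandas as pd

# Read the whole TDMS file into memory and flatten it into one table
# with one column per channel; "measurement.tdms" is a placeholder name.
tdms_file = TdmsFile.read("measurement.tdms")
df = tdms_file.as_dataframe()

# CSV has no worksheet row limit, so Excel is skipped entirely.
df.to_csv("measurement.csv", index=False)
```

A CSV produced this way has no 2^20 row limit, and Matlab should be able to load it directly.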

 

e) The more I describe my problem, the closer I get to a solution! Since I append 20,000 points for each condition, I hit the row limit in Excel (not in TDMS). As I record at most 8 channels from the DAQ, I end up with 8 columns truncated to 1,048,576 rows in Excel (rather than the 1,400,000 rows in the TDMS). Why not avoid appending to the same columns and instead append a new set of 8 columns for each condition? In that case, I would have 70 conditions * 8 columns = 560 columns and 20,000 rows, which is fine for Excel. The maximum is 16,384 columns, which would allow up to 16,384 / 8 = 2,048 conditions if all channels are recorded.
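To make that layout concrete, a small sketch of the re-blocking I have in mind, assuming for illustration Python with NumPy and pandas and random placeholder data (CSV is used here only to keep the example simple):

```python
import numpy as np
import pandas as pd

n_conditions, n_points, n_channels = 70, 20_000, 8

# Placeholder for the appended acquisition: 70 * 20,000 rows by 8 channels.
long_data = np.random.rand(n_conditions * n_points, n_channels)

# Re-block the data: one set of 8 columns per condition, giving
# 20,000 rows by 70 * 8 = 560 columns (well under Excel's 16,384-column limit).
blocks = long_data.reshape(n_conditions, n_points, n_channels)
wide = np.hstack([blocks[c] for c in range(n_conditions)])

columns = [f"cond{c + 1}_ch{ch + 1}"
           for c in range(n_conditions) for ch in range(n_channels)]
pd.DataFrame(wide, columns=columns).to_csv("wide_layout.csv", index=False)
```

The same re-blocking could just as well be done in the LabVIEW VI before anything is written to Excel.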

 

So I think e) can solve my problem, but I would also be interested in feedback about importing TDMS into Matlab more directly. Are there any developments available?

 

I have already read some discussions on the forum about similar topics, so I am sorry if I have missed an existing answer, but I could not find anything about exceeding 2^20 rows.

Message 1 of 6

So when creating your Excel sheets, why enter the data as 1,400,000 rows by 1 column instead of 70,000 rows by 20 columns?

Redhawk
Test Engineer at Moog Inc.

Saying "Thanks that fixed it" or "Thanks that answers my question" and not giving a Kudo or Marked Solution, is like telling your waiter they did a great job and not leaving a tip. Please, tip your waiters.

Message 2 of 6

Thank you for your comment.

My LabVIEW program is fully automated so that I can run 70 experiments (or more) with different parameters for each experiment. Therefore, 1 column corresponds to 1 physical channel recorded from the DAQ.

If I split the data into 70,000 rows x 20 columns as you suggested, the columns no longer correspond to the recorded channels.

My suggestion e) is similar to your idea, except that your "20 columns" becomes the number of channels * number of experiments = 8 * 70 = 560 columns. The number of rows will be 20,000, and what is interesting is that I can take full advantage of the DAQ's acquisition capability (I think it is about 10^6 points max per channel).

Message 3 of 6

Export as .CSV instead. I'm pretty sure Matlab can read those. 🙂

 

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified LabVIEW Developer
Message 4 of 6

@arienai wrote:

Thank you for your comment.

My LabVIEW program is fully automated so that I can run 70 experiments (or more) with different parameters for each experiment. Therefore, 1 column corresponds to 1 physical channel recorded from the DAQ.

If I split the data into 70,000 rows x 20 columns as you suggested, the columns no longer correspond to the recorded channels.

My suggestion e) is similar to your idea, except that your "20 columns" becomes the number of channels * number of experiments = 8 * 70 = 560 columns. The number of rows will be 20,000, and what is interesting is that I can take full advantage of the DAQ's acquisition capability (I think it is about 10^6 points max per channel).


Ahh, I see. Yamaeda's suggestion might work; you could also stick with a single Excel workbook and split the data across multiple worksheets. The functionality is there, you just have to take advantage of it.
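For illustration, a minimal sketch of that multi-sheet split done outside LabVIEW, in Python with pandas (openpyxl assumed installed; the DataFrame is only a random placeholder for the converted data):

```python
import numpy as np
import pandas as pd

EXCEL_ROW_LIMIT = 2**20  # rows per worksheet

# Placeholder for the full 1,400,000-row by 8-channel record.
data = pd.DataFrame(np.random.rand(1_400_000, 8),
                    columns=[f"ch{i + 1}" for i in range(8)])

# One workbook, several sheets, each kept under the per-sheet row limit.
rows_per_sheet = EXCEL_ROW_LIMIT - 1  # leave room for the header row
with pd.ExcelWriter("split_workbook.xlsx") as writer:  # needs openpyxl
    for i, start in enumerate(range(0, len(data), rows_per_sheet)):
        data.iloc[start:start + rows_per_sheet].to_excel(
            writer, sheet_name=f"part_{i + 1}", index=False)
```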

Redhawk
Test Engineer at Moog Inc.

Saying "Thanks that fixed it" or "Thanks that answers my question" and not giving a Kudo or Marked Solution, is like telling your waiter they did a great job and not leaving a tip. Please, tip your waiters.

Message 5 of 6

@Yamaeda wrote:

Export as .CSV instead. I'm pretty sure Matlab can read those. 🙂

 


This was my thought as well.

 

Also, for your smaller TDMS files you don't have to bother with converting to Excel in LabVIEW. There is an Excel add-in (NI's TDM Excel Add-In) that reads TDMS files directly.

Message 6 of 6