
LabVIEW


Opening TDMS files larger than 500Mb in LabVIEW

Solved!
Go to solution

Hello

 

We are facing an issue opening TDMS files logged by a LabVIEW application, either in the TDMS Viewer (the VI in the Advanced TDMS palette) or with a newly created TDMS reader VI. The logging application is built using the TDMS functions in the File I/O palette.

 

When a TDMS file is anywhere above 500 MB, the application either hangs or crashes. At times it gives a memory-full error.

 

I have gone through the forum and found VIs that split large TDMS files into smaller files before opening them. The other option is to use NI DIAdem, which is what we presently do. We do not want to change the application to split the files into smaller sizes.

 

I want to understand why this happens at these file sizes. I can easily read TDMS files of 100 or 200 MB.

 

Does the system configuration play a role in this? How is it that DIAdem can open the file and LabVIEW cannot?

 

Could you please explain this?

 

Thank you

Regards
Freelance_LV
TestAutomation Consultant
Message 1 of 13

I would expect your LabVIEW to be 32-bit. Correct?

If so, you are running into limitations caused by memory fragmentation.

 

Norbert
----------------------------------------------------------------------------------------------------
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.
Message 2 of 13

Hi Norbert

 

Yes, I have Windows XP 32-bit.

 

I also guess that it could be a system configuration issue.

 

If I get Windows 7 installed on my PC and install the 64-bit version of LabVIEW, will it make a difference? Will I be able to open the files in LabVIEW easily?

Regards
Freelance_LV
TestAutomation Consultant
Message 3 of 13

Hi Freelance_LV,

 

Could you please let me know which LabVIEW version you are using? Also, a TDMS file normally comes as a pair of files on disk: the .tdms file itself and a .tdms_index file. How big are these two files, and are they both more than 500 MB? Finally, did you try the TDMS Defragment VI on this file, and did running it cause any problems?
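Since LabVIEW code is graphical and cannot be shown inline, here is a minimal Python sketch of the size check being asked for. The helper name and the `.tdms_index` naming convention (data file plus a companion index file) are taken from this thread; the function itself is just an illustration using the standard library.

```python
import os

def tdms_pair_sizes(tdms_path):
    """Return (data_size, index_size) in bytes for a .tdms file and its
    companion .tdms_index file (index size is 0 if no index file exists)."""
    index_path = tdms_path + "_index"   # e.g. log.tdms -> log.tdms_index
    data_size = os.path.getsize(tdms_path)
    index_size = os.path.getsize(index_path) if os.path.exists(index_path) else 0
    return data_size, index_size
```

Comparing the two sizes before and after defragmenting gives a quick feel for how fragmented the file is.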

 

Thanks,

Yongqing Ye

NI R&D

Message 4 of 13

Haven't tried this yet.

But from a theoretical point of view, loading a file (stream) is like creating a very large array, so running 64-bit LabVIEW to read the file should lift the limit well beyond common current file sizes.

 

DIAdem uses a little trick to circumvent this issue on 32-bit systems: it loads only parts of the file, leaving the rest simply indexed. You can implement this approach yourself using the TDMS functions. It is possible (not tested) that the DataFinder Toolkit already supplies that feature for LabVIEW.....
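The "index, then load on demand" idea Norbert describes can be sketched in a few lines. This is not DIAdem's or TDMS's actual implementation; it is a hypothetical Python illustration over a raw file of float64 samples, showing that a reader which seeks to the requested slice never needs the whole channel in memory.

```python
import struct

class ChunkedChannelReader:
    """Treat a raw file of little-endian float64 samples as one channel,
    but pull only the requested slice into memory instead of the whole file."""
    SAMPLE_SIZE = 8  # bytes per float64 sample

    def __init__(self, path, n_samples):
        self.path = path
        self.n_samples = n_samples  # known from an index, not by reading the data

    def read_slice(self, start, count):
        """Seek directly to the requested samples and read only those bytes."""
        with open(self.path, "rb") as f:
            f.seek(start * self.SAMPLE_SIZE)
            raw = f.read(count * self.SAMPLE_SIZE)
        n = len(raw) // self.SAMPLE_SIZE
        return list(struct.unpack("<%dd" % n, raw))
```

Each `read_slice` call allocates only `count` samples, so the peak memory use is set by the slice size, not the file size.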

 

Norbert

Message 5 of 13

Hello Yongqing Ye

 

I am using LabVIEW 2010 SP1. I have not tried the Defragment VI on the files. The TDMS file is about 580 MB.

 

I do not recollect the size of the TDMS index file, but I can check tomorrow morning when I reach the office and update you.

 

Thank you

Regards
Freelance_LV
TestAutomation Consultant
Message 6 of 13

Maybe you want to split up your large TDMS file?

 

Hope this helps


We have two ears and one mouth so that we can listen twice as much as we speak.

Epictetus

Antoine Chalons

Message 7 of 13

A side comment:

 

There is a difference between "opening the file" and "loading the data into memory".

 

I can open TDMS files up to 1 GB in size on my Win7 64 bit machine.  They open very quickly.

 

However, loading the data into memory is another issue.  To do that, I cache portions of the file and only display what is needed.

http://www.medicollector.com
Message 8 of 13

Hi

 

As TiTou mentioned, I have already looked at splitting large files. As I said in my first post, we do not want to change the application that reads the files unless there is no other option left.

 

As josborne mentioned, I can open the file and get the list of groups and channels. But when I load a channel onto a graph, I get a memory-full error and my application closes.

 

So far, we load each channel onto the graph in parts. The TDMS Viewer VI in the File palette lets us load a specific number of samples onto the graph (via the settings button at the bottom of the graph), and that is what we are presently doing. But this does not let us analyze the entire channel's data together; we have to analyze the data in parts.
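A common companion to loading in parts is decimating for display: a graph only has a few thousand pixels of width, so a huge channel can be drawn from a reduced copy. A minimal sketch (in Python, as an illustration only; this is not the TDMS Viewer's actual mechanism):

```python
def decimate_for_display(samples, max_points):
    """Keep at most max_points samples by taking every k-th value, so a
    huge channel can be drawn without holding a full-resolution copy."""
    if len(samples) <= max_points:
        return list(samples)
    step = -(-len(samples) // max_points)  # ceiling division
    return samples[::step]
```

Note that simple striding can hide short spikes between kept samples; a min/max-per-bucket reduction preserves peaks at the cost of slightly more work.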

 

We do not have DIAdem on all our test systems; only the development PC has it.

 

Following Mr. Yongqing Ye's suggestion, I tried defragmenting the file this morning, and it still cannot be loaded onto the graph in a single go.

 

The index file size before defragmenting was about 902 kB, and the data file is about 558 MB. After defragmenting, the index file size is 1 kB.

 

What I want to know is whether shifting to Windows 7 and 64-bit LabVIEW will solve this issue, without splitting the files every time and without using the TDMS Viewer VI.

Regards
Freelance_LV
TestAutomation Consultant
Message 9 of 13
Solution
Accepted by topic author Freelance_LV

@Freelance_LV wrote:

[..]

What I want to know is whether shifting to Windows 7 and 64-bit LabVIEW will solve this issue, without splitting the files every time and without using the TDMS Viewer VI.



Most likely, since the channel data is placed in one huge array and memory fragmentation makes that very difficult on a 32-bit OS with 32-bit LabVIEW. Another thing that might be an issue in 32-bit LabVIEW: showing the data as a whole creates a data copy. So the original channel data might fit in memory, but trying to display it blows the memory away.
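A back-of-envelope check makes the point concrete. The 2 GB figure below is the default user address space of a 32-bit Windows process, and the factor of two models the display copy Norbert describes; both are simplifying assumptions, and the real constraint is the largest *contiguous* free block, which fragmentation makes far smaller than the nominal total.

```python
USER_SPACE_32BIT = 2 * 1024**3  # default user address space of a 32-bit Windows process

def fits_in_32bit(channel_bytes, copies=2):
    """Would the channel data plus its display copy even fit nominally in
    32-bit user address space, ignoring fragmentation entirely?"""
    return channel_bytes * copies < USER_SPACE_32BIT
```

For the 558 MB file in this thread, data plus one copy is about 1.1 GB, which nominally fits in 2 GB; yet the load still fails, which is exactly why fragmentation (the largest free block, not the total) is the suspect.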

 

Another thing to check in a 32-bit application is the largest block of memory that is still free. Use a tool like VMMap to snapshot your application and inspect its memory fragmentation.

 

As I already wrote, a generic 64-bit application raises the limit to "insane" values, depending on the OS (such as Windows 7)....still, there are limits.

 

Norbert

 

EDIT: The Windows 7 link goes to the table of physical memory for the whole system, which does not reflect the virtual memory available to a single process. Scroll up to the table "Memory and Address Space Limits" for the virtual memory that can be assigned to a 64-bit process on Win7 x64: the value is 8 TB.....

Message 10 of 13