
Attempt to read large TDMS file causes lack of memory error

Hi, I am trying to read a TDMS file using the Read from Measurement File Express VI and am getting a message that there is not enough memory to complete the operation.  My file is 263 MB and contains approximately 10M samples across 4 channels.  How can I open these TDMS files and extract the data using LabVIEW 8.6?

 

Thanks,

Mike V.

Message 1 of 13

Hello Mike,

 

Thank you for contacting National Instruments!  I have seen issues similar to this a few times in the past.  Usually the best way to work around memory issues with TDMS files is to read them in segments.  The Read from Measurement File Express VI has a setting in which you can specify how many samples to pull from the TDMS file at a time. 

 

Another option would be to bypass the Read from Measurement File Express VI and build a program similar to the one shown in this KnowledgeBase article.  You can place the specific TDMS file VIs and use the count and offset inputs to break the read into pieces and avoid the out-of-memory error.  That article is not specific to the memory error, but it shows that when you open a TDMS file in subsets you can sometimes run into an error. 
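Since LabVIEW diagrams cannot be shown in text, here is a minimal Python sketch of the same count/offset chunking idea.  A plain binary file stands in for the TDMS file, and all names are illustrative; the point is that each read pulls only `chunk` samples, processes them, and discards them, so the whole channel never has to fit in memory at once:

```python
import os
import tempfile

import numpy as np

# Simulate a large single-channel measurement file (1M float64 samples).
# In the real case this would be the TDMS file; the count/offset pattern
# is the same one the TDMS read functions expose.
samples = np.arange(1_000_000, dtype=np.float64)
path = os.path.join(tempfile.mkdtemp(), "capture.bin")
samples.tofile(path)

chunk = 100_000          # samples per read (the "count" input)
total = 0.0
n = 0
with open(path, "rb") as f:
    while True:
        block = np.fromfile(f, dtype=np.float64, count=chunk)
        if block.size == 0:
            break
        total += block.sum()   # process each segment, then let it go
        n += block.size

mean = total / n
```

Only one 100k-sample block is alive at any moment, so peak memory stays small no matter how large the file grows.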

 

Please try these methods and verify that they work for your application.  Reply to this post if you are still having issues implementing them, and include more specific details about the error, such as the error number (if there is one) and the exact wording of the error description.  With that information I will be able to narrow down the cause.  I hope this information helps!  Have a great weekend!

Thanks!
Bob H
Applications Engineer
National Instruments
Message 2 of 13

Recently I have been struggling with OS memory management. 

 

I have run into the same problem reading a large TDMS file (>250 MB, multichannel DAQ, >1 MS/s for >10 s) under 32-bit LV 2009 on a PXI controller with 4 GB of memory running Windows 7.

 

The effective memory the OS recognizes is 3 GB, and the automatically managed virtual memory is 3 GB as well.

 

The OS task manager shows that only 1.1 GB of memory was allocated, but TDMS Read.vi gave a memory-full error.

 

In my application I can't use the option of reading the file in segments, because it holds a single spectrum.

 

1) First, can you explain why LV didn't use the remaining memory (~2 GB)?

In Task Manager I couldn't see any memory increase when I tried to read the TDMS file.  

 

2) If I move to 64-bit LV 2009, would this kind of memory trouble be solved?

 

As you may know, PXI controllers have a single memory slot.

3) Is there a way to expand the memory beyond 4 GB? 

 

labmaster.

 

Message Edited by labmaster on 05-30-2010 08:24 PM
Message 3 of 13

The tutorial Managing Large Data Sets in LabVIEW should answer most of your questions.  Data sets of this size need to be actively managed so they do not run out of memory.

 

Going to 64-bit LabVIEW will help with some of the issues, since you will be able to directly address the full memory of your computer.  For the moment, you will still be limited to I32-sized arrays (number of elements, not memory size).  However, with a 250 MB+ data set it is still possible to fill 4 GB of memory fairly easily.  Active management will take care of this issue, and also give you a bonus of increased performance.  You should be able to reduce the number of data copies to one or two, depending on your application, which means you do not really need a 64-bit system.
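The "reduce your number of data copies" advice translates to any language.  A hedged NumPy sketch of the contrast (the variable names are illustrative): in-place operators reuse the existing buffer, while ordinary arithmetic expressions allocate fresh ones, which is roughly what LabVIEW's in-place structures achieve on the diagram:

```python
import numpy as np

n = 1_000_000
data = np.zeros(n, dtype=np.float64)
addr_before = data.ctypes.data        # address of the underlying buffer

# Copy-heavy style: each arithmetic expression allocates a fresh
# n-element buffer (temporaries plus a new result array).
result_copy = (data * 2.0) + 1.0

# Copy-free style: in-place operators reuse the original buffer.
data *= 2.0
data += 1.0
addr_after = data.ctypes.data

reused = (addr_before == addr_after)  # True: no new allocation for `data`
```

With a 250 MB channel, each avoided copy saves a 250 MB contiguous allocation, which is exactly the resource that runs out first.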

 

The maximum memory of your PXI controller will depend on the controller model.  See your user manual for specifications. 

Message 4 of 13

The Express VIs are built to make the most common use cases straightforward to implement, but they are not optimized for high performance or for large amounts of data.  I'd recommend running the "TDMS File Viewer" VI from the File I/O->TDM Streaming palette and seeing whether it can open the file. If so, you can easily read data from the file using the functions on the "TDM Streaming" palette.

 

Herbert

Message 5 of 13

>However, with a 250MBytes+ sized data set, it is still possible to fill a 4GByte memory fairly easily.

 

That's my core question: Task Manager showed that LV 2009 didn't use the full memory.

I read the knowledgebase document. 

 

I transferred the example data and routine to an NI engineer.

According to that kind NI engineer, most specialists recommend using DIAdem for large data management.

Despite this alternative, I don't understand why DIAdem can handle the large data but LV can't.

(Using DIAdem, I would have to change my whole framework, at extra cost.)

 

Anyway, at this moment I believe LV has a big problem with memory management.

Let's wait for his response.

I will post updates on my progress, but if you have an idea, please let me know so I can try more myself. 

 

labmaster.

*) If anyone wants to try my examples, please leave your contact address.

Message Edited by labmaster on 06-02-2010 11:52 PM
Message 6 of 13

labmaster wrote:

That's my core question. The task manager showed LV2009 didn't use the full memory. [...]

[Set Sarcasm = True] 

 

Having trouble handling a graphical language? Then switch over to a text-based language that has little or nothing in common, costs more, and is more limiting.

 

[Set Sarcasm = False]

 

I believe that almost every memory-related issue is covered by one or more of the tags in this tag cloud related to LabVIEW_Performance.

 

The problem in a nutshell:

 

Windows uses virtual memory and by default exposes only 2 GB of address space to our code.

 

LV manages your data buffers for you, and it keeps each buffer in a contiguous memory space.

 

If your code (or the Express VI code, which is often not the most efficiently written) requires another buffer and LV can't find a contiguous block big enough, you get an "out of memory" error.

 

So to get the most out of your machine you have to work smarter, and the words "smarter" and "Express VI" normally don't appear in the same sentence.

 

Convert the Express VI to normal VIs, then play "whack-a-mole" with the buffer copies, eliminating them one at a time (see the tag cloud).
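The whack-a-mole target is usually a buffer that silently regrows.  A hedged Python sketch of the same anti-pattern (names are illustrative): appending to an array inside a loop reallocates and copies the whole thing every iteration, whereas preallocating once keeps a single contiguous buffer — the direct analogue of Build Array in a loop versus Initialize Array plus Replace Array Subset:

```python
import numpy as np

n_blocks, block_len = 200, 1_000

# Anti-pattern: concatenating inside the loop reallocates and copies
# the entire array on every iteration -- a hidden buffer copy.
grown = np.empty(0, dtype=np.float64)
for i in range(n_blocks):
    grown = np.concatenate([grown, np.full(block_len, float(i))])

# Preallocate once and fill in place: one contiguous buffer, no copies.
prealloc = np.empty(n_blocks * block_len, dtype=np.float64)
for i in range(n_blocks):
    prealloc[i * block_len:(i + 1) * block_len] = float(i)

match = bool(np.array_equal(grown, prealloc))
```

Both produce identical data; only the second keeps memory traffic (and fragmentation pressure) flat as the data set grows.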

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 7 of 13

Ben,

 

Thanks for the comments.

Whatever the programming language or OS, I believe each language can use the memory the OS allocates to it.

You mentioned virtual memory (disk swapping), but to my knowledge the OS only falls back on virtual memory once physical RAM is out of space.

 

I think my problem is a little different.

When I opened the TDMS file (>250 MB), the OS didn't fully use the allocated RAM (no memory spike to 2 or 3 GB in Task Manager).

Even with the LV TDMS File Viewer, the engineer and I confirmed the same problem.

(Though I can accept the contiguous-memory-block explanation when I write the TDMS file.)

 

However, if you meant that LV can use only 2 GB of RAM, what would you expect with 64-bit LV 2009?

That was one of my original questions.

 

Best,

 

labmaster.

Message 8 of 13

I would not expect LabVIEW to use the full memory for the following reasons:

 

  1. Like most other programming languages, LabVIEW requires a contiguous space for an array.  Depending on the fragmentation state of the memory, there may not be enough space to fit another 250MB data set.  LabVIEW itself fragments memory due to the DLLs it loads.  On a 32-bit OS, the largest contiguous data space available is about 800MB for LV 8.0 and above (about 1.1GB for 7.1 and earlier).
  2. In a 32-bit OS, Windows reserves anything over the bottom 2GB for system code.  This limits you to a maximum of 2GB usage.
  3. In addition (once again, 32-bit only), the space from 1.5 to 2.0GB is usually reserved for system DLLs, as well.  This limits you to a maximum of about 1.5GB usage.

 

While "it is still possible to fill a 4GByte memory fairly easily" if you do not know what to do, it is also possible to handle a 250MB data set with LabVIEW.  I must disagree with my kind colleague on this one.  DIAdem is an analysis and presentation package; it is not a programming environment.  Handling large data sets is a challenge in any language.  The solutions are simply different in LabVIEW than in C/C++/C#/Java/...  I handled 128MB data sets in LabVIEW 6.0 on a machine with 512MB of RAM nearly ten years ago.  LabVIEW has progressed quite a bit since then and made the process much easier, although the basics have not changed.

 

  1. Your data should be encapsulated in a reference object to avoid trivial data copies.  In order of preference these are i) Data Value Reference, ii) Single-Element Queue, iii) LabVIEW 2 style global (AKA functional global or action engine).
  2. Use the buffer viewer to find where you are making copies and eliminate them, if possible.
  3. Use data chunking techniques for transport and analysis if you cannot eliminate copies.
  4. Use Task Manager to ensure you found where all the copies are being made.
  5. If all else fails, use the disk as a buffer (the National Instruments waveform editors use this technique to allow editing of waveforms of arbitrary size).
The major challenge with LabVIEW is that it abstracts memory management away from the user.  This makes LabVIEW much easier to use, but results in confusion when the memory management makes more copies than "necessary" and your data sets are large.  It is possible to manipulate the memory manager using the techniques given in the tutorial and above.  There is no reason you cannot do what you wish to do in LabVIEW.  If you run into specific issues, please ask.  Or post your code or the critical pieces of it so we can make suggestions.
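Technique 5 above, using the disk as a buffer, can be sketched in a few lines of Python with a memory-mapped file (file name and sizes are illustrative): the OS pages data in and out on demand, so only the region you actually touch needs to be resident in RAM:

```python
import os
import tempfile

import numpy as np

path = os.path.join(tempfile.mkdtemp(), "buffer.dat")
n = 1_000_000

# Disk-backed array: only the touched pages occupy physical memory.
buf = np.memmap(path, dtype=np.float64, mode="w+", shape=(n,))
buf[:] = 0.0
buf[10:20] = 7.0          # edit a small window of the "waveform" in place
buf.flush()               # push the changes out to the file

# Reopen read-only, as a separate consumer would.
view = np.memmap(path, dtype=np.float64, mode="r", shape=(n,))
window_sum = float(view[0:30].sum())
```

This is the same idea the waveform editors mentioned above rely on: the full data set lives on disk, and editing works through a small in-memory window.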

 

Message 9 of 13

DFGray wrote:

I would not expect LabVIEW to use the full memory for the following reasons: [...]

 


 

The best method I have found for using the maximum amount of memory is a separate queue carrying many small updates. Small contiguous blocks are much easier to find, and queues can transfer the data in place, so when used properly they have given me the best results.
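The many-small-blocks-through-a-queue pattern has a direct analogue in text languages.  A hedged Python sketch (names are illustrative): a producer hands small chunks to a consumer through a bounded queue; each `put` transfers a reference rather than copying the chunk, and no single allocation ever needs to be large:

```python
import queue
import threading

CHUNK = 10_000       # many small blocks are easier to place than one big one
N_CHUNKS = 50
q = queue.Queue(maxsize=8)   # bounded, so the producer can't run ahead

def producer():
    for i in range(N_CHUNKS):
        # Each put hands over a reference: the chunk itself is not copied.
        q.put([float(i)] * CHUNK)
    q.put(None)              # sentinel: no more data

totals = []

def consumer():
    while True:
        block = q.get()
        if block is None:
            break
        totals.append(sum(block))  # process, then drop the chunk

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

grand_total = sum(totals)
```

Bounding the queue is the key design choice: it caps how many chunks exist at once, so memory use stays flat even if the producer is faster than the consumer.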

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 10 of 13