LabVIEW


Saving large amounts of data in a single file, and as it is collected

Hello again, CatDoe, and thank you for appending Smplepracticeexcelsave.vi.  Here are some (nested) comments.

  • If speed and reliability and data size are issues, you should not use the Report Generation Toolkit to save your file as an Excel file.  Go with straight text.
    • The RGT requires that you open (and close!) Excel, which definitely takes time and memory space.  There is also RGT processing to consider.
    • I definitely recommend "Excel Easy Table" over "Append Table to Report" -- it is simpler, more flexible, and probably faster.
    • Handle the Header yourself with Excel Easy Table.
    • If you are going to append data to an existing Excel Table, you'll want to use "Excel Get Last Row".
  • Consider using Write Delimited Spreadsheet instead.  You are writing 1D and 2D Arrays of Strings, which is perfect for this function.  If you write, say, 1000 lines having 19 columns of data, starting from an empty file, I would guess you could write (appending to the file) at a rate of several Hz.  But why guess?  I'll try to code this up and attach a demo for you to consider.
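Since LabVIEW diagrams can't be shown inline here, a rough Python sketch of what Write Delimited Spreadsheet does in append mode may help (file name and data are made up for illustration):

```python
# Sketch of appending a 2D array of strings as delimited text,
# roughly what LabVIEW's Write Delimited Spreadsheet does when
# appending to an existing file. Names and data are illustrative.

def append_rows(path, rows, delimiter="\t"):
    """Append a 2D list of strings to a delimited text file."""
    with open(path, "a") as f:          # "a" = append, create if missing
        for row in rows:
            f.write(delimiter.join(row) + "\n")

# Example: append 1000 rows of 19 string columns in one call.
data = [[f"r{r}c{c}" for c in range(19)] for r in range(1000)]
append_rows("log.txt", data)
```

Because each call opens and closes the file, the data already on disk survives a program crash between appends -- which matches the "lose at most one row" requirement.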

Bob Schor

Message 11 of 24

 

  1. What rate are you collecting data?
  2. What rate do you want to save data?
  3. How much data are you saving (per save)?

Appending might seem "inefficient", but with the proper program architecture that is irrelevant.  

========================
=== Engineer Ambiguously ===
========================
Message 12 of 24

Thank you!

Message 13 of 24

Hello RTSLVU,

 

1. Unfortunately, I do not know that information. I could collect a single row of data or over 30,000 in a day. It simply depends on the operator. The speed of individual elements is instrument dependent and varies over each collection period.

 

2. At this time, data is collected during various times in my message handling loop and stored in a cluster. Then the cluster data array is sent to the file. Data cannot be collected in a single sweep as multiple instruments and steps are involved.

 

3. Right now I save a data array at the end of the program run (old method) or append a single row to a file (current method). This is not ideal, as a program crash leads to data loss, which is unacceptable in this system's use. Therefore, I need some way to keep most of my data. Losing a single row of data is "acceptable" if absolutely necessary, but more than that is unforgivable for this system.

Message 14 of 24

As promised, here is a demo I whipped up.

DEMO Text Logger.png

This starts by defining the output file, here "Text Logger.txt", in LabVIEW's Default Data Directory (usually the LabVIEW Data folder).  The first For loop creates an Array of 19 Strings, "Col 1", "Col 2", etc. and writes it (without appending, so it creates a New File) for the Header Row.  The second For loop does 100 "Appends" of 1000 Rows of 19 Column data (from the Random Number generator), saved with three decimals of precision (so something like "0.123"), and then writes this 1000 x 19 2D Array to the Log file. 

 

When I run this, I get a file of 11,231 kB, a text file I can open and see that it has 19 columns and 100,001 rows (don't forget to count the Header!).  How long did this take?  Would you believe 20 seconds?  Nope, less than 2 seconds.  Fast enough for you?  This means you can save data from 19 channels sampled at 1 kHz in 0.02 seconds, not too bad ...
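For anyone who can't open the PNG, here is an approximate Python transcription of the demo's logic (the file name comes from the post; everything else is an assumption about the diagram):

```python
import random

# Approximate text-language sketch of the DEMO Text Logger diagram.
PATH = "Text Logger.txt"            # file name from the post
COLS, ROWS_PER_APPEND, APPENDS = 19, 1000, 100

# First For loop: header row "Col 1" ... "Col 19", written to a new file.
with open(PATH, "w") as f:
    f.write("\t".join(f"Col {i + 1}" for i in range(COLS)) + "\n")

# Second For loop: 100 appends of a 1000 x 19 array of random
# numbers formatted with three decimals (e.g. "0.123").
for _ in range(APPENDS):
    block = "\n".join(
        "\t".join(f"{random.random():.3f}" for _ in range(COLS))
        for _ in range(ROWS_PER_APPEND)
    )
    with open(PATH, "a") as f:      # open / append / close each batch
        f.write(block + "\n")
```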

 

Bob Schor 

Message 15 of 24

@CatDoe wrote:

Hello RTSLVU,

 

1. Unfortunately, I do not know that information. I could collect a single row of data or over 30,000 in a day. It simply depends on the operator. The speed of individual elements is instrument dependent and varies over each collection period.

 

2. At this time, data is collected during various times in my message handling loop and stored in a cluster. Then the cluster data array is sent to the file. Data cannot be collected in a single sweep as multiple instruments and steps are involved.

 

3. Right now I save a data array at the end of the program run (old method) or append a single row to a file (current method). This is not ideal, as a program crash leads to data loss, which is unacceptable in this system's use. Therefore, I need some way to keep most of my data. Losing a single row of data is "acceptable" if absolutely necessary, but more than that is unforgivable for this system.


1. So the shortest interval at which you can reliably collect data would be the update rate of the slowest instrument. Any idea what that is?

 

2. Having multiple instruments is irrelevant. If there are "test steps" to perform between measurements, then those also determine the shortest interval at which you can save data.  

 

3. Sounds like you have an inefficient program architecture to begin with. I don't think trying to kludge in a "faster file format" is going to fix your overall issues. 

 

We really need to see your entire program, but I think a lot of your issues could be better handled by a Channeled Message Handler. Consider each of your instruments running in its own loop, continuously collecting data as fast as it can, with the proper Channel Wire acting as a "lossy queue" from each instrument loop. Those Channel Wires will always contain the most recent data, and you can combine and save that data in another loop, at the required interval, one line at a time.
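Channel Wires are a LabVIEW construct, but the "lossy queue" idea itself can be sketched in Python with a one-element queue that always holds the newest reading (all names here are hypothetical):

```python
import queue

# Sketch of a "lossy queue": a one-slot mailbox that each instrument
# loop overwrites with its newest reading, so the logging loop always
# sees the most recent value. Assumes a single producer per queue
# (as with one Channel Wire per instrument loop).

def lossy_put(q, item):
    """Put item, discarding the stale reading if the slot is full."""
    try:
        q.put_nowait(item)
    except queue.Full:
        try:
            q.get_nowait()          # drop the old, unread value
        except queue.Empty:
            pass
        q.put_nowait(item)

latest = queue.Queue(maxsize=1)
lossy_put(latest, 1.23)             # instrument loop writes...
lossy_put(latest, 4.56)             # ...and overwrites unread data
```

The logging loop then calls `latest.get()` at its own interval and always receives the newest reading; intermediate values are deliberately lost, just as with a lossy Channel Wire.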

========================
=== Engineer Ambiguously ===
========================
Message 16 of 24

@Bob_Schor wrote:

As promised, here is a demo I whipped up.

DEMO Text Logger.png

When I run this, I get a file of 11,231 kB, a text file I can open and see that it has 19 columns and 100,001 rows (don't forget to count the Header!).  How long did this take?  Would you believe 20 seconds?  Nope, less than 2 seconds.  Fast enough for you?  This means you can save data from 19 channels sampled at 1 kHz in 0.02 seconds, not too bad ...


You can do a lot better by avoiding Write Delimited Spreadsheet, which opens and closes the file each time it is called.  Instead, use the file I/O primitives (Open/Create/Replace File to create the file, Write Text File to write the data, and Close File to close it) so that the file is only opened and closed once.  Use Array To Spreadsheet String to format the data.
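In a text language the open-once/write-many/close-once pattern looks something like this (a Python sketch; the helper is only a rough stand-in for LabVIEW's Array To Spreadsheet String, and the file name and batch sizes are illustrative):

```python
# Open once, write many times, close once -- the pattern described
# with Open/Create/Replace File, Write Text File, and Close File.

def to_spreadsheet_string(rows, delimiter="\t"):
    """Format a 2D list of strings as delimited text
    (rough stand-in for Array To Spreadsheet String)."""
    return "".join(delimiter.join(r) + "\n" for r in rows)

with open("fast_log.txt", "w") as f:    # opened exactly once
    for batch in range(100):            # many writes, no reopen
        rows = [[f"{batch}.{c}" for c in range(19)] for _ in range(10)]
        f.write(to_spreadsheet_string(rows))
# the with-block closes the file exactly once on exit
```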



There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 17 of 24

Thank you for suggesting the channel handler. In this case I am restricted to making the most out of the message queues. I cannot share the whole program due to privileged information. 

Message 18 of 24

Thank you for the demo, Bob_Schor. I will use this as a starting point and see if I run into any more questions. 

Message 19 of 24

The "Demo" was intended as a "gentle introduction" for @CatDoe, who had what appeared to be a periodic need to save data of some unknown size.  LabVIEW's Write Delimited Spreadsheet does pretty much what @crossrulz suggests -- after collecting 1000 "data records", they are written, all at once, to an existing file by first going to the end of the file and then writing the data.  It is a feature of this demo that the file is opened, written to, and then closed, the assumption being that data are generated at a much slower rate than the writing speed: the program spends about 2% of its time opening, appending, and closing, and the other 98% waiting with the file safely closed.  Sometimes "simpler" isn't so bad ...

 

Bob Schor

Message 20 of 24