
LabVIEW


displaying large amounts of data -- strategies

Hi all!  I am a new LabVIEW user (previously a VEE user) and am tackling something new to me: data acquisition.
 
I was wondering what other people have done to display and visually analyze large amounts of data, either within LabVIEW or using some other application.  Here is my data acquisition scenario:
1) I am collecting digitized voltages on approximately 50 channels at 1 kS/s, reading them in blocks of 100 samples and writing to one set of ASCII files in tab-separated columns (one column per data channel), time-stamping each block read
2) I am collecting accelerometer data on about 5 channels at 10 kS/s, also reading in blocks of 100 samples and writing to a different set of ASCII files in tab-separated columns as above
3) I am also collecting thermocouple data on a few channels (<5) at 1 S/s, stored in a third set of ASCII files in tab-separated columns with each reading timestamped
4) As you've likely noticed, items #1 & #2 grow rather quickly, so those files are closed out at 64,000 samples and new data files are opened
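For what it's worth, the rotation scheme in item #4 can be sketched in a few lines; this is just an illustration in Python (the class name, file-naming pattern, and the choice to repeat the block timestamp in the first column are all made up, not part of the original setup):

```python
SAMPLES_PER_FILE = 64_000  # rotation threshold from the scenario above

class RotatingAsciiLog:
    """Writes tab-separated blocks, opening a new file every 64,000 samples."""

    def __init__(self, prefix):
        self.prefix = prefix
        self.file_index = 0
        self.samples_written = 0
        self.fh = None
        self._open_next()

    def _open_next(self):
        if self.fh:
            self.fh.close()
        # e.g. volts_0000.txt, volts_0001.txt, ...
        self.fh = open(f"{self.prefix}_{self.file_index:04d}.txt", "w")
        self.file_index += 1
        self.samples_written = 0

    def write_block(self, timestamp, rows):
        # rows: list of samples, each a list of per-channel values;
        # the block-read timestamp is repeated as the first column
        for row in rows:
            self.fh.write(f"{timestamp:.3f}\t"
                          + "\t".join(f"{v:.6f}" for v in row) + "\n")
        self.samples_written += len(rows)
        if self.samples_written >= SAMPLES_PER_FILE:
            self._open_next()
```

The same pattern applies whether the writer lives in a LabVIEW loop or elsewhere: count samples, not blocks, so the threshold is exact.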
 
What I'd like to do is to be able to select several channels of overlaid data to view at a time, closely time-aligned (I know there will be some skew, and that is acceptable), and to be able to scroll through the data looking for "events" (of unknown characteristics) centered on the times of UUT failures.  I'll also have to be able to extract data from newer/older data files, since files are closed out when they reach a certain size.
 
Any ideas for dealing with this type of situation would be greatly appreciated!  (What a way to learn LabVIEW!)
Message 1 of 5
(7,997 Views)

You didn't say how long your acquisition will be, but assuming each measurement of the first two kinds is just a DBL, you get about 400 KB/s. That's manageable straight up if your acquisition is not too long and if you don't create data copies (which requires you to know what you're doing). If the acquisition includes a timestamp for each point, that becomes around 1 MB/s.

Some relevant points:

The Managing Large Data Sets tutorial might help you.

If you know anything about working with databases, that might be a good way to keep the data, although it will require some additional work.

You might be able to use formats like TDMS to do this. I don't have any experience with them, so I can't say.


___________________
Try to take over the world!
Message 2 of 5
(7,983 Views)
Good point on the duration of data acquisition.  It is somewhat of an unknown (another benefit of writing multiple data files: if no UUT failure occurs, older data files can be deleted).  As the scope of this wasn't well defined prior to developing a DAQ system, I've allowed for storing a LOT (4 TB) of data.  I do expect that no single test will go over 90 minutes, and most will probably end in a failure in something more like 20-40 minutes.
Message 3 of 5
(7,974 Views)

Without looking into the other options, this is what I would do. It's quite possible there are better options.

This assumes that the reviewing part is done after the logging is done, although it doesn't have to be.

Accumulate the data in a circular buffer so that you don't have memory issues. When the buffer is full, you write its contents to the file. This can be implemented, for example, as an action engine. I'm attaching a simplistic example of how to create a 1D circular buffer. Yours might need to be more clever and include a running index (so that you can locate data after the buffer has wrapped), but it shows the basic concept.
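In text form, the circular-buffer idea looks roughly like this Python sketch (the names and sizes are placeholders; in LabVIEW the buffer would live in an action engine's shift register rather than an object):

```python
from collections import deque

class CircularBuffer:
    """Fixed-size sample buffer that flushes to a sink callback when full,
    mirroring the action-engine idea: one state holder, several actions."""

    def __init__(self, size, flush):
        self.size = size
        self.buf = deque()
        self.flush = flush   # callable that writes one full buffer to disk
        self.total = 0       # running index, so data can be located later

    def append(self, samples):
        for s in samples:
            self.buf.append(s)
            self.total += 1
            if len(self.buf) == self.size:
                self.flush(list(self.buf))
                self.buf.clear()
```

A call site would pass the file-writing routine as `flush`, so acquisition code never blocks on partially filled buffers.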

When you want to review, you start by asking the user to select which sensors to display. Once they do that, you go over all the files, one at a time, and load the data into another action engine while decimating it. That way, you can keep using that data to display the entire graph. Whenever the user zooms in (using buttons you place on the front panel), you load the data from the few relevant files and display it in the graph in full detail.
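One detail worth mentioning: if the decimation just takes every Nth point, short "events" can vanish from the overview. Keeping the min and max of each bucket preserves them. A rough sketch, with made-up names:

```python
def decimate_minmax(samples, target_points):
    """Reduce a long 1-D record to roughly target_points by keeping the
    min and max of each bucket, so brief spikes survive the decimation."""
    n = len(samples)
    bucket = max(1, n // target_points)
    out = []
    for i in range(0, n, bucket):
        chunk = samples[i:i + bucket]
        out.append(min(chunk))
        out.append(max(chunk))
    return out
```

Graphing the decimated record gives a faithful envelope of the full data; zooming in then reloads the raw points for just the visible window.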

As I said, there might be better options, but these are just some quick thoughts.


___________________
Try to take over the world!
Message 4 of 5
(7,955 Views)
Your disk storage strategy should really be determined by what you plan to do with it after you store it.  Given the amounts of data, I would recommend something other than a spreadsheet for analysis.  Spreadsheets start to get unwieldy after about 2000 points.  Being a staunch NI employee, I can shamelessly recommend LabVIEW for post analysis (or on-the-fly analysis, for that matter).  You can also use an analysis package such as Mathematica, MathCAD, or DIAdem.  You should save your data in a format that can be easily read by the analysis package of your choice.

Converting to ASCII is relatively slow.  Given a 1 MByte/s data stream, you may have problems streaming to disk, depending on the age of your system.  Opening new files frequently will also slow you down.  To avoid this, I would recommend a hierarchical, binary file format.  There are two "easy" possibilities in LabVIEW: TDMS and HDF5 (or NI-HWS, which uses HDF5).  TDMS is NI's new binary format designed for streaming huge amounts of parallel data.  You can get import filters for Excel and Matlab on the NI website.  TDMS is fully supported by DIAdem.  You will need LabVIEW 8.2 or later to use TDMS.
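TDMS itself is written through the LabVIEW or DIAdem APIs, but the core advantage (streaming raw binary instead of formatted text) can be sketched generically. The block layout below is invented for illustration, not the TDMS format:

```python
import struct

def write_block_binary(fh, timestamp, block):
    """Append one timestamped block of float64 samples in a flat layout:
    [timestamp: f64][count: u32][values: f64 * count]. Binary writes skip
    number-to-text formatting and are roughly half the size of ASCII."""
    flat = [v for row in block for v in row]
    fh.write(struct.pack("<dI", timestamp, len(flat)))
    fh.write(struct.pack(f"<{len(flat)}d", *flat))

def read_blocks_binary(fh):
    """Yield (timestamp, values) tuples back out of the same layout."""
    header = struct.Struct("<dI")
    while True:
        raw = fh.read(header.size)
        if not raw:
            return
        ts, n = header.unpack(raw)
        yield ts, struct.unpack(f"<{n}d", fh.read(8 * n))
```

A real format like TDMS adds channel names, grouping, and indexing on top of this idea, which is exactly what makes later channel-by-channel extraction practical.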

HDF5 is an open file format maintained by the NCSA.  NI does not support it very well, but the available toolkit will do what you need.  Unfortunately, it is an older version of HDF5 (1.4.4).  In addition, HDF5 has a very steep learning curve.  I feel it is worth it, but you may not have the time.  HDF5 is supported by most major analysis packages.  Note that there have been at least three other HDF5 implementations for LabVIEW that I know of.  A quick web search should turn them up, if you are interested.
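As a taste of what the HDF5 model looks like outside LabVIEW, here is a small sketch using the Python h5py binding; the dataset path and sizes are illustrative, and the LabVIEW toolkit's VIs expose the same concepts differently:

```python
import h5py

def append_samples(path, channel, samples):
    """Append a block of samples to an extendable HDF5 dataset,
    creating the dataset (and any parent groups) on first use."""
    with h5py.File(path, "a") as f:
        if channel not in f:
            # maxshape=(None,) makes the dataset resizable for streaming
            f.create_dataset(channel, data=samples, dtype="f8",
                             maxshape=(None,), chunks=True)
        else:
            ds = f[channel]
            old = ds.shape[0]
            ds.resize((old + len(samples),))
            ds[old:] = samples
```

The hierarchical paths (e.g. one group per sensor type, one dataset per channel) map naturally onto the multi-file, multi-rate scenario described at the top of the thread.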

Good luck.  Be sure to read the large data tutorial (it is a bit advanced for a beginning LabVIEW user so let us know if you have any questions).
Message 5 of 5
(7,929 Views)