LabVIEW


Writing many small waveforms to an HWS file

Hello,

I am writing many small waveforms (about 100 samples long) to disk and have been using the niHWS VIs for this purpose. You can learn more about what I am trying to do with my data acquisition from this post, if you're interested.

I am using the low-level write VIs as they are used in the example "niScope EX Save to File - HWS Low Level - Multi Channel.vi". I change the group name for each triggered waveform; if I have multiple channels enabled, I put each set of triggered waveforms in a single group. The problem is the size of the file that is written. For example, I acquire 1000 waveforms that are 100 samples long, and with my digitizer (PXI-5105) I expect each waveform to be about 640 bytes, so 1000 waveforms should come to about 640 kB; the scaling and timing information for each waveform might put the file at 1 MB, very conservatively. Instead I get file sizes of 74 MB with the file compression set to 0 in "niHWS New Wfm Reference.vi", or 12 MB with a compression setting of 9. So my file sizes are at least 12 times too big, even with full compression. Eventually I will want to put hundreds of thousands or possibly millions of waveforms in a file, and you can imagine that file sizes like these become a problem, both for disk space and for the time spent waiting for the system to write them.

I believe the HWS VIs are optimized for longer waveforms; they probably allocate a minimum length for each group in the file. Is there any setting to change the size that is allocated for each group? Or should I just come up with a file organization of my own and write the files using the binary file VIs?

Thanks for your help.
Message 1 of 4
Hi ESD,

Nice to hear from you again 🙂

HWS is a rather rigid binary format, so I think you would benefit from changing file types. In HWS, each waveform in a group requires a minimum of 72 kB, and these chunks grow in 64 kB increments. A single chunk can hold up to 32,500 2-byte samples before another is added. Since your waveforms are only 100 samples long, you're well under the first tier. That also accounts for the file sizes you measured: 1000 waveforms at 72 kB apiece is about 72 MB, right in line with the 74 MB uncompressed file you saw.

You could use the HWS format more efficiently if you concatenated waveforms before writing them to file: 325 of your 100-sample records would fit in a single chunk, and then you would start a new group. Reading this information back will take some intelligence, since you'd have to know how the records were packed when they were written; see the sketch below.
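To make the bookkeeping concrete, here is a rough sketch in Python (the function names and the 2-byte-sample assumption are mine, not part of the HWS API); in LabVIEW the packing itself would just be a Build Array or Reshape Array before the niHWS write:

```python
import numpy as np

RECORD_LEN = 100          # samples per triggered waveform
RECORDS_PER_GROUP = 325   # 325 * 100 samples * 2 bytes = 65,000 bytes = one chunk

def pack_group(records):
    """Concatenate up to 325 equal-length int16 records into one flat array."""
    assert len(records) <= RECORDS_PER_GROUP
    assert all(len(r) == RECORD_LEN for r in records)
    return np.concatenate(records).astype(np.int16)

def unpack_record(group_data, index):
    """Recover record `index` on read-back; fixed-length records make offsets trivial."""
    start = index * RECORD_LEN
    return np.asarray(group_data)[start:start + RECORD_LEN]
```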

Alternatively, you could write a pure binary file and do similar parsing on read. This option will give you the most efficient use of disk space.
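For the pure binary route, a minimal sketch along the same lines (again with hypothetical names, assuming fixed-length records of 2-byte samples so every offset can be computed rather than stored):

```python
import numpy as np

RECORD_LEN = 100                    # samples per waveform
ITEM = np.dtype(np.int16).itemsize  # 2 bytes per sample

def append_records(path, records):
    """Append fixed-length int16 records to a raw binary file."""
    with open(path, "ab") as f:
        for r in records:
            np.asarray(r, dtype=np.int16).tofile(f)

def read_record(path, index):
    """Seek straight to record `index`; no per-record header is needed."""
    with open(path, "rb") as f:
        f.seek(index * RECORD_LEN * ITEM)
        return np.fromfile(f, dtype=np.int16, count=RECORD_LEN)
```

With this layout, a million 100-sample records is about 200 MB of raw samples and nothing else; scaling coefficients and timestamps can live in a small fixed-size header or a side file if you need them.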

Finally, if you'll be looking at millions of data points, you'll need to choose a way to view and process them. Excel supports 64k rows (where k is 1024), so you may need to roll your own data manipulation as well. NI does offer a high-volume data processor, DIAdem [1], that can support up to 2 billion data points. It uses data plugins to parse custom binary files, so if you choose to write to binary files, you will be able to import your data quite simply.

[1] DIAdem
http://www.ni.com/diadem/
Joe Friedchicken
NI Configuration Based Software
Message 2 of 4
Joe F.,

Thanks for the information. I figured that I would be better off just writing a binary file. As for viewing and analyzing my data, I plan to write my own tools, and the number of data points will not be a problem. The framework I use for building the analysis program has tools for reading HDF files, so the HWS VIs would have been convenient in this respect.
Message 3 of 4
As mentioned earlier, there is an underlying assumption that HWS will be used for large waveforms. Part of this optimization is that the minimum waveform size is set to 65,000 bytes (this is the compression chunk size). That optimizes write speed to disk, but it leaves huge amounts of unused space if you don't fill it. Unfortunately, although you can query this parameter, you cannot change it after creation, and HWS does not expose it. So, if you want to use HWS, you should probably concatenate multiple waveforms to bring each write up to about 65,000 bytes. This will improve your disk write time as well.
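For illustration, a minimal buffering sketch in Python (the class and callback names are made up; in LabVIEW this would be an array carried in a shift register and flushed to the niHWS write VI once it is full):

```python
CHUNK_TARGET = 65000  # bytes, matching the HWS compression chunk size

class ChunkBuffer:
    """Accumulate small waveforms and flush ~65,000-byte blocks at a time."""

    def __init__(self, write_block):
        self.parts, self.size = [], 0
        self.write_block = write_block  # e.g. a wrapper around the actual file write

    def add(self, record_bytes):
        self.parts.append(record_bytes)
        self.size += len(record_bytes)
        if self.size >= CHUNK_TARGET:
            self.flush()

    def flush(self):
        if self.parts:
            self.write_block(b"".join(self.parts))
            self.parts, self.size = [], 0
```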
Message 4 of 4