LabVIEW


hdf5 dll documentation?

Hello
I downloaded the sfpFile package for HDF5 support in LabVIEW, and now I'm very lost. The online documentation unfortunately only covers the pre-made functions in instr.lib, and I'm having a hard time figuring out what the function calls in the DLL actually do. Is there any documentation that fills the gap between the NI-provided docs and the NCSA HDF docs?
adrian
Message 1 of 18
Might as well post my whole problem:
I'm trying to create multiple 1D datasets (one for each channel I record), then append to these 1D sets as the measurement progresses. So if my very limited experience with HDF is right, these should be my steps:

0. Create new file
1. Create space (H5Screate_simple)
2. Create datasets (H5Pcreate) << or what exactly does this do?
3. Set chunk size (H5Pset_chunk)
4. Set attributes/type (H5Dcreate)

And then:

1. extend dataset (H5Dextend) and get space (H5Dget_space)
2. Select hyperslab (who thought of these names?) << what is the "DU64 block" input there?
3. Write 1D U16 array

I know I'm missing tons of dataspace and file closes, but I'm not too sure where to stick them...
Can anyone help me a bit with this?
thanks
adrian
Message 2 of 18
I would strongly suggest you look at NI-HWS.  It was designed to solve the exact problem you are facing (multiple 1D waveforms in a single file).  It is the next generation after sfpFile and is thus HDF5 without the hassle.  It is available on the Driver CD that came with your copy of LabVIEW (unless your copy of LabVIEW is over two years old).

If you want to use the raw HDF5 from sfpFile, there are two layers to the API.  The Advanced VIs are a direct exposure of the HDF5 API in LabVIEW (or as close as can be accomplished).  The Intermediate VIs are common functions built from the Advanced VIs.  They are very similar to the HDF5 Lite API, but were designed before HDF5 Lite was available.  For most of what you are doing, you will want the Intermediate VIs.

From your post, it appears you are about as confused as I was when I first started using HDF5.  It is not easy for beginners and you need to figure out several low-level concepts to get it to work right.  Once you get the low-level concepts down, things become almost trivial and you realize how much power the API really has.  The sfpFile VIs make poor examples, since they implement a fairly complex file structure (based on the SCPI-DIF format).


Message 3 of 18
Here is a short overview of what you will need to get started.  HDF5 is composed of several subsystems.  The ones you will need to worry about are File, Dataset, Dataspace, Datatype, and Property.

All of these subsystems must be created and closed.  Even though the root API is in C, it has a lot of object-oriented character: if you create a file, you have to close it; if you create a Property, you need to close it when you are done with it.  Unclosed references are the single most frustrating part of working with HDF5.  They keep the run-time running and files open.  A file does not close until all references into it are closed.  You can close the file reference itself, but if you still hold a reference to, for example, a group in that file, the file stays open.  Since the run-time is a separate process, it stays running, and the file stays open, even if you exit the calling application.  The cure for this is to run H5close, which shuts down everything and is invaluable during development.  Don't do it in shipping code, though, as it will close the HDF5 run-time for every application using it.
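A quick sketch of this reference-counting behavior, using Python's h5py bindings rather than the LabVIEW VIs (the file and group names here are made up; the low-level calls map one-to-one onto the H5F*/H5G* functions the VIs wrap):

```python
import h5py
from h5py import h5f, h5g

fid = h5f.create(b"refs_demo.h5", h5f.ACC_TRUNC)   # H5Fcreate
gid = h5g.create(fid, b"run1")                     # H5Gcreate

fid.close()         # H5Fclose: the file *reference* is closed...
assert gid.valid    # ...but the group reference keeps the file itself open

del gid             # once the last reference is released, the file truly closes
```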

Now for a quick run-down of the subsystems you need.  Property is used by all of the others to govern how things are created and used.  For example, if you want compression on your data, you set it through the property input of Dataset.  Properties must be created, and closed when you are done with them.

File is the subsystem which governs opening and closing the "file".  Properties for file include things such as buffered/unbuffered and RAM disk vs. normal.  For basic use, you won't need to set any properties.

Datatype governs the strict typing of your data.  There are a large number of predefined data types.  These are sufficient for normal use.  You don't have to close the predefined types.

Dataspace governs the dimensionality of your data.  This includes both the current size and the maximum size.  Since you are streaming, you will want to set the maximum size to infinity.

Dataset is the generic data object.  This can be anything from a simple scalar to a 100-dimensional dataset with billions of data members.  To create a dataset, you need to set its properties (chunking and compression), its dataspace in memory and on disk, and its datatype.

Note that the Intermediate VIs take care of a lot of the overhead for this.
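Not in LabVIEW, but the same creation sequence can be sketched through Python's h5py low-level bindings, whose names map one-to-one onto the DLL calls (the file and dataset names are made up):

```python
import h5py
from h5py import h5f, h5s, h5t, h5p, h5d

fid = h5f.create(b"stream_demo.h5", h5f.ACC_TRUNC)          # H5Fcreate
# Dataspace: current size 0, maximum size unlimited, since we will extend it
space = h5s.create_simple((0,), (h5s.UNLIMITED,))           # H5Screate_simple
# Property list: an extendible dataset must be chunked
dcpl = h5p.create(h5p.DATASET_CREATE)                       # H5Pcreate
dcpl.set_chunk((1024,))                                     # H5Pset_chunk
# Dataset: ties together the datatype (U16), dataspace, and properties
dset = h5d.create(fid, b"ch0", h5t.STD_U16LE, space, dcpl)  # H5Dcreate
# Close everything we opened (the predefined datatype needs no close)
del dset, dcpl, space
fid.close()
```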

To stream, at each iteration, you need to do the following:
  1. Extend the Dataset so it is large enough for your new data
  2. Read the new dataspace of the dataset (get the reference)
  3. Select the hyperslab of the dataspace you will be writing into.  It is called a hyperslab because it can be multi-dimensional and non-contiguous.  You don't need that power, but it is there.  It is really just a subset of the data.  Use this dataspace as the file dataspace when you write to the dataset.
  4. Create a new dataspace (or faster, resize an existing one) that corresponds to the size of the data you will be writing to the dataset.
  5. Now write the data.  The main inputs are your data, the datatype (which you should have from when you first created the dataset), the file dataspace, and the memory dataspace.
  6. Close anything you opened.
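The six steps above, sketched with h5py under made-up names (the high-level create_dataset call stands in for the creation sequence discussed earlier):

```python
import numpy as np
import h5py
from h5py import h5s

f = h5py.File("append_demo.h5", "w")
dset = f.create_dataset("ch0", shape=(0,), maxshape=(None,),
                        chunks=(1024,), dtype="u2")
did = dset.id                                      # raw dataset reference

for _ in range(3):                                 # one pass per acquisition
    data = np.arange(100, dtype=np.uint16)
    old = did.shape[0]
    did.set_extent((old + data.size,))             # 1. H5Dextend (H5Dset_extent in newer HDF5)
    fspace = did.get_space()                       # 2. H5Dget_space on the extended dataset
    fspace.select_hyperslab((old,), (data.size,))  # 3. H5Sselect_hyperslab
    mspace = h5s.create_simple((data.size,))       # 4. memory dataspace sized to the new data
    did.write(mspace, fspace, data)                # 5. H5Dwrite
                                                   # 6. dataspaces are released each iteration

del mspace, fspace, did, dset                      # release before closing the file
f.close()
```

As an aside, the hyperslab's block input defaults to one element per block, so start and count alone select a simple contiguous range.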
This may not make lots of sense at the moment, but hang in there.  You may want to work through the HDF5 tutorial, using the LabVIEW VIs (some are a bit different due to LV interface differences).  If you are still having trouble, let me know.  The learning curve is very steep and I may have just confused you more than helped at this point.  Good luck.
Message 4 of 18
Thanks a lot for your help!
I have a halfway-working program now; I just have to kill a few bugs and it should work well enough for what I intend to do.
Message 5 of 18
OK, so the program runs now, but I need the DLL function call that lets me set attributes of the datasets.
thanks
Message 6 of 18
Use the intermediate H5A calls (Attribute).  The input location should be the dataset reference.  You should be able to store just about anything you want.  If you want to store the timestamp, you will need to modify the VIs, since they were created before the timestamp data type was officially released.  Just replace the cluster of four integers with a timestamp constant and you will be OK.  You will need to change the intermediate and advanced VIs (four in all, I think).
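For reference, here is what the attribute calls do, sketched in h5py with invented names: the high-level attrs interface wraps H5Acreate/H5Awrite with the dataset as the loc_id.  (h5py has no native timestamp type either; a string stands in for it, much like the modified VIs.)

```python
import numpy as np
import h5py

with h5py.File("attr_demo.h5", "w") as f:
    dset = f.create_dataset("ch0", data=np.zeros(8, dtype="u2"))
    # Each assignment creates one attribute on the dataset object
    dset.attrs["sample_rate_hz"] = 50000.0     # H5Acreate + H5Awrite
    dset.attrs["units"] = "counts"
    dset.attrs["t0"] = "2005-06-01T12:00:00"   # timestamp stored as a string
```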
Message 7 of 18
OK, got that, but how do I set the file attributes (since the file only has a file_id and no loc_id)?
Message 8 of 18
^^ I think what I meant by that was: how do I set the user block?
Message 9 of 18
For Attributes, loc_id can be any location: a dataset ID, file ID, or group ID.  Attributes are associated with an object, and the loc_id merely identifies that object, whether it is a file, dataset, or group.

Which brings me to another thing you may want to know about, but which I didn't mention earlier since I was already deluging you with info: groups.  Groups give you the hierarchy in HDF5.  You can think of them as file-system folders inside a file (although that is not quite correct).  Groups are used to logically organize data in a file.  For example, you may wish to place all the waveforms from one data run in one group (names are totally arbitrary) and those from the next data run in a different group.  Attributes can be attached to groups as well.
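A sketch of that layout in h5py (run and channel names invented):

```python
import h5py

with h5py.File("runs_demo.h5", "w") as f:
    run1 = f.create_group("run1")              # H5Gcreate
    run2 = f.create_group("run2")
    # Waveforms from each data run live in that run's group
    run1.create_dataset("ch0", shape=(0,), maxshape=(None,),
                        chunks=(1024,), dtype="u2")
    # Attributes attach to groups exactly as they do to datasets
    run1.attrs["operator"] = "adrian"
```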

Finally, if you haven't installed HDFView yet, you should at your earliest convenience.  It is an HDF file browser and is invaluable for debugging your output.  Get it from the HDF5 website.  Note that HDF5 does a lot of buffering, so unless you have done a flush or closed all references to a file, your items may not show up in the file yet.
Message 10 of 18