10-17-2011 07:18 AM
Dear LabVIEW Forum,
I am currently working on a project that saves its data in the HDF5 file format.
(TDMS is not an option for me.)
I am calling the HDF5 C DLL via the Call Library Function Node.
Basically everything works fine, but sometimes the library calls fail, and I cannot tell when or why.
The only thing I know is that once it stops working, it never works again until I restart the LabVIEW IDE.
If it works, it keeps working until I restart the IDE, and sometimes after the restart I get only -1 as return values.
I attached a small project that demonstrates the error.
To reproduce the error, one has to restart the whole IDE and delete the created files until the error occurs.
Create File and Create Group always work. The errors occur from Create plist onward.
If the front panel of one of the failing subVIs is opened and the subVI is started manually with the Run button, it always works fine.
I tried reloading the called VIs for each call -- same errors.
I compiled the VIs into an EXE -- same errors.
Thank you for your help.
10-17-2011 07:32 AM
Hi talorion,
Could you please explain the reasons why you chose HDF5 instead of TDMS? I'm quite interested in the reasons.
Thanks!
10-17-2011 07:46 AM
Thank you for your reply,
I chose HDF5 because:
- we already have a huge database (I mean many HDF5 files) and need to stay compatible with it.
- our current toolchain is also based on HDF5.
- HDF5 is a data format known for its performance.
- TDMS has improved a lot since its first release, but it simply cannot handle everything HDF5 can handle.
For further information about TDMS vs. HDF5, see:
http://forums.ni.com/t5/LabVIEW-Idea-Exchange/Suggest-that-LabVIEW-support-HDF5/idi-p/1206363
I am afraid I have no choice but to use HDF5.
10-17-2011 08:01 AM
Thank you very much for the quick reply!
Generally speaking, the HDF5 file format is open, and quite a few third-party tools support it (like the Java-based viewer mentioned in that post). However, as Herbert Engels mentioned, TDMS sometimes also delivers better performance in the benchmarks he described. I'm curious: do you have any particular use cases that only HDF5 can handle and TDMS cannot, such as your performance requirements, or the restriction to the three levels (file/group/channel) in TDMS?
Thank you again for your time!
10-17-2011 08:25 AM
Dear guys
Sorry for my interruption, but I don't think this is a discussion about the advantages of HDF5 or TDMS.
I am also trying to use HDF5 without a wrapper and am relatively successful, but I have similar problems.
It seems that it could work, and perhaps a tiny, simple trick would make it 100% solid.
Hopefully there is somebody out there who has a hint.
Cheers
10-17-2011 08:48 AM - edited 10-17-2011 08:49 AM
The symptoms you describe hint in the direction of uninitialized output buffers passed to those function calls. I haven't looked at the HDF5 API in a long time, so I don't remember the details of how this API works or what data types it accepts. But it almost certainly follows standard C programming conventions, which means that when you call a function that writes information into a buffer, you have to make sure that buffer is properly preallocated before calling the function. A buffer one or more bytes too small can often work for quite some time and then suddenly crash, raise exceptions, or produce computation errors.
Why it suddenly misbehaves at some point can be very random. Your mention that opened front panels seem to make it always work is an additional hint in this direction. They increase the memory footprint of your LabVIEW program and make it more likely that a buffer overrun won't fail immediately, though given an application that runs long enough, it will eventually fail in some way too. There is nothing LabVIEW can do to prevent these buffer overruns other than trying to detect them and reporting an error (indicated in the Call Library Node error cluster). Depending on your CLN debugging level, this check is performed stringently, casually, or not at all.
If your API returns its information only in simple arrays, you can make it work with direct Call Library Node calls, but you need to be very wary about any output buffers those functions require. If, however, your API requires complex data types (structures containing arrays and the like), then trying to get this done without a wrapper DLL is mostly an exercise in vain. The knowledge of how a C compiler lays out memory for such data types, which you would need for that, is a lot more complicated than just writing a C wrapper DLL. By trying to avoid the wrapper DLL you in fact make the problem more complicated, and sometimes basically impossible.
10-17-2011 08:15 PM
Unless the HDF5 DLL has changed dramatically in the last few years, many of the enums are not constants, but macros. LabVIEW cannot access DLL constants, so cannot read these. If you hard code values from them into your LabVIEW code for something like setting parameter lists or picking data types, it will sometimes work and sometimes not work. The fix is to write a DLL wrapper which converts these macros to integers using a switch statement, then call the conversion function from LabVIEW when you need the values.
I would like to second Rolf's advice, as well. When I first tried to interface to the HDF5 DLL a decade ago, I did not realize that many of the parameters were 64-bit integers. I used 32-bit integers. They actually worked most of the time... until the application crashed. At the time, LabVIEW did not support 64-bit integers. You can find my solution here (see the end of the page).
When I fixed both of these issues, I got a stable system.
Good luck. Let us know if this helps.
10-19-2011 12:38 AM
Dear Guys
Thank you for all your very impressive input, as well as the link to the "NI large file tutorial".
For me it seems to be solved now.
A lot of testing yesterday showed no more problems.
I have done the following things:
On the one hand, I used 64-bit numbers for the dimensions to create the dataspace, and I also passed these values as "Constant" to the Call Library Function Node.
Now I get the references every time and can create the datasets without any errors.
So far mainly writing procedures have been exercised, but basic reading also worked very well (though it was not tested as intensively).
Thank you again
Best regards
Nottilie