LabVIEW


Error when reading back a binary file

Hello,

I am writing out a binary file consisting of U32 arrays, packed strings, and clusters, and I get error 116 when reading it back. As long as I am synchronized on the correct data position, why should LabVIEW care what I write to the file? In the example I write a simple 16-element array and a timestamp cluster. When reading back the array I get the error. What is the correct way to handle mixed structures? Thanks.
Message 1 of 10
Can you also show how you read it?
Message 2 of 10
Please see attached. Thanks for taking the time to look at this.
Message 3 of 10
David,

You can solve the problem in various ways. The easiest might be to modify the writer VI by wiring a True constant into the "header (F)" input of your first Write File node. That forces LabVIEW to include a four-byte header that specifies the length of the array you are writing.

When you wire up the reader code the way you did, the length of the array you wire to the "byte stream type" input is irrelevant; by wiring nothing into the "count" input, you are implicitly telling LabVIEW to look for a header to determine the length of the array you are asking it to read. Because you didn't write any header, LabVIEW interprets the first four bytes of the array itself as a length. That length doesn't match the actual amount of data present in the data file, so LabVIEW generates your error.
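LabVIEW is graphical, so the diagram can't be pasted here, but the byte layout John describes can be sketched in Python (the function names are illustrative, not LabVIEW APIs; LabVIEW writes binary data big-endian by default, and the header is simply a 4-byte element count in front of the payload):

```python
import io
import struct

def write_u32_array_with_header(f, values):
    """Write a 4-byte big-endian length header followed by the U32 data,
    mimicking what Write File does when its 'header' input is True."""
    f.write(struct.pack(">I", len(values)))            # 4-byte element count
    f.write(struct.pack(f">{len(values)}I", *values))  # big-endian U32 payload

def read_u32_array_with_header(f):
    """Read the length header first, then exactly that many U32 values."""
    (count,) = struct.unpack(">I", f.read(4))
    return list(struct.unpack(f">{count}I", f.read(4 * count)))

# Round-trip a 16-element array through an in-memory "file"
buf = io.BytesIO()
write_u32_array_with_header(buf, list(range(16)))
buf.seek(0)
assert read_u32_array_with_header(buf) == list(range(16))
```

Without the header, a reader that expects one would misinterpret the first array element as a length, which is exactly the error described above.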

An alternative solution would be to leave the writer unchanged and modify the reader VI: wire an I32 constant into "byte stream type" and wire a 16 into "count". The choice depends on what you prefer and whether these binary files need to be compatible with some other program.
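For comparison, here is the fixed-count alternative in the same kind of Python sketch (illustrative, not LabVIEW code): the file holds raw data with no header, so the reader must know the element count (here 16) from outside the file.

```python
import io
import struct

data = list(range(16))

# Writer: raw big-endian U32 data only, no length header
buf = io.BytesIO()
buf.write(struct.pack(">16I", *data))

# Reader: the count (16) must be supplied explicitly, because the
# file itself carries no length information
buf.seek(0)
read_back = list(struct.unpack(">16I", buf.read(16 * 4)))
assert read_back == data
```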

As an aside: you can dispense with your case that checks whether the data file already exists. Just change the function input of the Open/Create/Replace to "create or replace" and wire a False constant into its "advisory dialog" input.

Hope that's clear,
John
Message 4 of 10
John,

Thanks for your explanation. One question: when using the Write File VI with header=TRUE, does every write to the file get a header written? Thank you.
Message 5 of 10
No, including a header in one Write File call does not force all subsequent calls to include a header.

But keep in mind that a header is only possible when you are writing a dynamic data type (array, string, cluster that contains arrays or strings, etc.). Otherwise, there's no ambiguity about how many bytes Read File will subsequently need to read back out, and no header is needed or used.
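The distinction can be illustrated with a small Python sketch (again, just a model of the byte layout, not LabVIEW code): fixed-size data such as a U32 or a DBL has a size known from its type alone, so no header is needed, while variable-size data such as a string is disambiguated by a length header.

```python
import io
import struct

buf = io.BytesIO()

# Fixed-size data (a U32 and a DBL): sizes are known from the type,
# so no header is written or needed.
buf.write(struct.pack(">Id", 42, 2.5))

# Variable-size data (a string): a 4-byte length header tells the
# reader how many bytes to consume.
msg = b"hello"
buf.write(struct.pack(">I", len(msg)) + msg)

# Reading back: fixed-size fields by type, string by its header
buf.seek(0)
value, dbl = struct.unpack(">Id", buf.read(4 + 8))
(n,) = struct.unpack(">I", buf.read(4))
text = buf.read(n)
assert (value, dbl, text) == (42, 2.5, b"hello")
```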

--John
Message 6 of 10
John,

Thanks for the information. For some reason I had tried to use headers before and wasn't able to read back the correct amount of data. I have rewritten my routines that write out dynamic structures to use headers, and it all seems to work now. It's nice to be able to simply supply a template of your structure to the byte stream type and let the header do the rest. Thanks again.
Message 7 of 10


Johnner wrote:

No, including a header in one Write File call does not force all subsequent calls to include a header.

But keep in mind that a header is only possible when you are writing a dynamic data type (array, string, cluster that contains arrays or strings, etc.). Otherwise, there's no ambiguity about how many bytes Read File will subsequently need to read back out, and no header is needed or used.

--John

Actually, when writing a cluster, headers are always added for variable size data in the cluster. Therefore another solution is to bundle your data before writing it to file.
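A Python sketch of that idea (a hypothetical layout, assuming a cluster of a DBL timestamp and a U32 array): the cluster's fixed-size field is written raw, while the variable-size array inside the cluster carries its own 4-byte length header.

```python
import io
import struct

def write_cluster(f, timestamp, samples):
    """A cluster of {timestamp: DBL, samples: U32 array}. The fixed-size
    field is written raw; the variable-size array field gets its own
    4-byte big-endian length header, as LabVIEW does inside clusters."""
    f.write(struct.pack(">d", timestamp))
    f.write(struct.pack(">I", len(samples)))
    f.write(struct.pack(f">{len(samples)}I", *samples))

def read_cluster(f):
    """Read the fixed field by type, then the array via its header."""
    (timestamp,) = struct.unpack(">d", f.read(8))
    (count,) = struct.unpack(">I", f.read(4))
    samples = list(struct.unpack(f">{count}I", f.read(4 * count)))
    return timestamp, samples

buf = io.BytesIO()
write_cluster(buf, 1234.5, [10, 20, 30])
buf.seek(0)
assert read_cluster(buf) == (1234.5, [10, 20, 30])
```

Bundling the data into a cluster before writing, as suggested above, effectively gets you this header behavior for free.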



Message 8 of 10

Try HDF5. It is designed for mixed-mode binary data storage. Upsides: fast, efficient, hierarchical, and free. Downsides: it is a 64-bit DLL, so a LabVIEW wrapper is necessary (but you get 64-bit file access); it is not thread-safe, so some caution is necessary; and it is very low-level, so it is not user-friendly. If I haven't scared you off, you can try it out by downloading the LabVIEW API described in "Can I Edit and Create Hierarchical Data Format (HDF5) files in LabVIEW?" I consider it one of the best tools in my software toolbox, but be prepared for a learning curve. Note that you can ignore all the higher-level stuff if you want; it was designed to make saving waveforms easier, and it morphed into NI-HWS.

Message 9 of 10
I discovered that when trying to write and read my variable-size data structures. By inserting an Array to Cluster, I was able to get the read-back to work correctly, but I didn't know why. Now I do. Thanks for your response and help.
Message 10 of 10