LabVIEW


LabVIEW created UDL file causes error

Solved!

I am using LabVIEW 2019 on a Windows 11 laptop.

I was given a UDL file to use to connect to an Access .mdb file.

I wrote a LabVIEW 2019 VI to create a duplicate file with the same content but a different name.

The duplicate UDL file causes the original VI to throw an error.

Supposedly the UDL file is just a text file with a different file suffix, but something is different about it.

The original UDL file is 404 bytes; the duplicate is 199 bytes.

Since this forum does not allow .udl files, I have renamed them to .txt, but the file sizes are unchanged.

 

Message 1 of 8

Your mistake was to assume that the UDL file was a "Text" file, that is, a file of bytes consisting of ASCII characters that spell out something like "Everything after this line is an OLE DB initstring".

 

Instead, the file is a binary file, consisting of a 2-byte header (hex FF FE) followed by "16-bit ASCII": the low byte is the ASCII character (including CR and LF, hex 0D and 0A) and the high byte is NUL (hex 00).  The proper way to read/write this file is to treat it as a binary file, not a text file.  So "Everything" in the "real" file looks like "E<nul>v<nul>e<nul>r<nul>y<nul>t<nul>h<nul>i<nul>n<nul>g<nul>".
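To see that byte layout concretely, here is a minimal sketch in Python (not LabVIEW; the string is illustrative). This "low byte ASCII, high byte NUL" pattern is exactly what the UTF-16-LE encoding produces:

```python
# A UDL file is a 2-byte BOM (FF FE) followed by UTF-16-LE text:
# each ASCII character is stored as its byte followed by a 00 byte.
text = "Everything"
data = b"\xff\xfe" + text.encode("utf-16-le")

print(data.hex(" "))  # ff fe 45 00 76 00 65 00 ...
```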

 

Bob Schor

Message 2 of 8

http://www.labview.help/topic/114998 says:

  "1. Create a text file with a UDL extension.  This is relatively easy to do in LabVIEW."

 

What do I know?  This is new to me.

 

 

Anyway, I am not quite sure what you mean.

Open up the original as a binary file, write what I want to it, and LabVIEW's Write Binary File will take care of everything?

 

 

Message 3 of 8
Solution
Accepted by topic author psuedonym

@psuedonym wrote:

http://www.labview.help/topic/114998 says:

  "1. Create a text file with a UDL extension.  This is relatively easy to do in LabVIEW."

 

What do I know?  This is new to me.

 

 

Anyway, I am not quite sure what you mean.

Open up the original as a binary file, write what I want to it, and LabVIEW's Write Binary File will take care of everything?


That is not what I meant.  Let me try to be clearer.

 

As I understand it, you are trying to read/write a UDL file, which appears to be a byte-oriented file with a particular format, including possibly text data saved in Unicode (which uses 16-bit characters, with ASCII being represented by having the high byte be 0 and the low byte be the ASCII character, just what you see).  The "Text file" format that LabVIEW (and most languages) use works with 1-byte character data (ASCII, which is usually 7 bits, is called "Basic Latin" in Unicode -- adding the eighth bit gives you characters such as ½ and ° and µ).
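The file-size difference you saw follows directly from this (a Python sketch; the string is illustrative): an 8-bit encoding stores one byte per character, while UTF-16 stores two.

```python
text = "Everything"
print(len(text.encode("latin-1")))    # 10 -- one byte per character
print(len(text.encode("utf-16-le")))  # 20 -- two bytes per character
```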

 

As I said, your UDL file is a file of U16, unsigned 16-bit integers.  Ignoring the first FFFE word (the byte-order mark), the rest of the file is straight ASCII, but expressed as a U16 with the top byte 0.  Here's how you can go about converting Text to that format (being sure to include the <CR> and <LF> characters that appear in the text) --

  1. Create a sub-VI called "Ascii to Unicode" that takes a String input and outputs an Array of U16.
  2. The body of this sub-VI wants to work on the individual bytes of the String and change each to a U16.
    1. Wire the String (input) Control to a String to Byte Array function.
    2. Pass this Array into a For Loop using the (default) Indexing Tunnel.
    3. Inside the For Loop, the tunnel will give you each byte of the String, which you want to turn into a U16.  You will use the Type Cast function (found on the Numeric Palette, Data Manipulation sub-Palette) to do this.  Go ahead and connect a wire from the Byte In tunnel to the input of the Type Cast.
    4. Now you need a U16 constant to tell the Type Cast what you want.  From the Numeric Palette, create a Numeric Constant (it will be an I32 by default).  Right-click the Constant, choose "Representation", and make it a U16.  Put this above the Type Cast function and wire the Constant to the upper Type Cast input.  Wire the Type Cast output to the right edge of the For Loop (using the default Indexing Tunnel).
    5. You have just changed your String to an array of U16.  Wire this to the Array of U16 Output indicator of this sub-VI.
  3. Now write a little Test routine.  Create a String ("Testing 1, 2, 3"), pass it into this function, and wire the output to a Write to Binary File.  Click on the Write to Binary File function and press Ctrl-H to bring up the Context Help -- you want to turn off the default option to prepend the array size to the file.  Run this code, providing a file name of the form "Testing.txt" (you want the extension ".txt" to fool Microsoft into thinking it is a text file).  Now open the file -- you should see "T e s t i n g   1 ,   2 ,   3 ".  For the real UDL file, remember to prepend a U16 with value xFFFE (the byte-order mark) to the array before writing.
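For reference, the net effect of the steps above can be sketched in Python (the string and file name are illustrative). The sketch assumes LabVIEW's documented Type Cast behavior for a U8 cast to U16 (the byte lands in the high byte) and Write to Binary File's default big-endian byte order, which together emit the ASCII byte first, then 00 -- the UDL layout:

```python
import struct

def ascii_to_udl_bytes(text: str) -> bytes:
    """Mirror the recipe: a byte-order-mark word, then each ASCII byte
    as a U16 with the character in the high byte, written big-endian."""
    words = [0xFFFE] + [b << 8 for b in text.encode("ascii")]  # Type Cast U8 -> U16
    return struct.pack(">%dH" % len(words), *words)            # big-endian write

data = ascii_to_udl_bytes("Testing 1, 2, 3")
# Same bytes as a BOM followed by UTF-16-LE text:
assert data == b"\xff\xfe" + "Testing 1, 2, 3".encode("utf-16-le")

with open("Testing.txt", "wb") as f:  # .txt so Windows treats it as text
    f.write(data)
```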

Bob Schor  

 

Message 4 of 8

Great.

Thanks.

NI has a couple of places mentioning that the UDL file is a simple text file. That is where I got the notion that it was text.

 


Message 5 of 8
Solution
Accepted by topic author psuedonym

It is text but not ASCII text as you may assume. And it is very easy to be mistaken about this, since when you open it in Notepad (or Notepad++) it will show exactly the same information.

 

But it is not!

 

One is real ASCII (DS_temp_udl.txt) and the other is UTF-16LE text (DS_udl.txt), the Windows variant of Unicode, which is NOT ASCII! This is easily seen when you look at the binary data (Notepad++ in hex view, or some other hex editor). I can't open your VI right now, so I'm not quite sure what you are doing there, but unlike the two Notepad applications I mentioned, most software is not prepared to transparently treat text files with the BOM (the 0xFF 0xFE byte sequence) as Unicode. LabVIEW itself does not like Unicode at all and is still a fully ANSI text-string application, although this is clearly not just a LabVIEW problem. The UDL file is not loaded by LabVIEW directly when you specify the UDL file path as a parameter to the DB Connect VI; that is done by the OLE DB Manager.
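What "transparently treat the BOM as Unicode" means in practice can be sketched in Python (the function name and sample connection string are illustrative) -- this is the check a BOM-aware editor performs and most other software does not:

```python
def read_text_any(raw: bytes) -> str:
    """Decode file contents the way a BOM-aware editor does:
    a leading FF FE means UTF-16-LE; otherwise assume 8-bit text."""
    if raw.startswith(b"\xff\xfe"):
        return raw[2:].decode("utf-16-le")
    return raw.decode("latin-1")

# Both files decode to the same text, but only one carries the BOM:
ansi = b"Provider=Microsoft.Jet.OLEDB.4.0"
utf16 = b"\xff\xfe" + "Provider=Microsoft.Jet.OLEDB.4.0".encode("utf-16-le")
assert read_text_any(ansi) == read_text_any(utf16)
```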

Rolf Kalbermatter
My Blog
Message 6 of 8

Rolf explained the "Why" much better than I (who only knew the "What").

 

BS

Message 7 of 8

@rolfk wrote:

It is text but not ASCII text as you may assume. And it is very easy to be mistaken about this, since when you open it in Notepad (or Notepad++) it will show exactly the same information.

 

But it is not!

 

One is real ASCII (DS_temp_udl.txt) and the other is UTF-16LE text (DS_udl.txt), the Windows variant of Unicode, which is NOT ASCII! This is easily seen when you look at the binary data (Notepad++ in hex view, or some other hex editor). I can't open your VI right now, so I'm not quite sure what you are doing there, but unlike the two Notepad applications I mentioned, most software is not prepared to transparently treat text files with the BOM (the 0xFF 0xFE byte sequence) as Unicode. LabVIEW itself does not like Unicode at all and is still a fully ANSI text-string application, although this is clearly not just a LabVIEW problem. The UDL file is not loaded by LabVIEW directly when you specify the UDL file path as a parameter to the DB Connect VI; that is done by the OLE DB Manager.


Don't pay attention to my VI.

It is wrong.

I need to modify it per Bob_Schor's post.

Message 8 of 8