In <378C9C5F.646A0E63@eikon.tum.de> Tobias Hermann writes:
>My problem is the following: Is it possible to handle numeric values
>longer than 32bit in LV? Specifically, one of my instrument driver VIs
>needs to supply an initialization value to the Wideband CDMA "long code"
>configuration menu of a versatile vector signal generator. This value
>can range from 0x0 to 0x1FF,FFFF,FFFF (which accounts for 41 bits, if
>I'm not mistaken). The instrument expects this value in hexadecimal
>form.
FORMING THE 41-bit STORAGE
You can create any arbitrary numeric data length by using a binary
array. This is the best and most direct solution to your needs.
Just remember, LabView allocates binary arrays in multiples of bytes,
so a 41-bit array will be rounded up to a length of 6 bytes, or
48 bits; however, LabView knows to return only the first 41 bits to
you upon request.
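Since a block diagram won't paste into a news post, here is the same
bit-packing idea sketched in Python; the function names are mine,
purely for illustration:

    # Pack a 41-bit value into 6 bytes (48 bits); the top 7 bits stay zero.
    def pack41(value):
        assert 0 <= value <= 0x1FFFFFFFFFF          # enforce the 41-bit range
        return [(value >> (8 * i)) & 0xFF for i in range(6)]

    # Recover the value, keeping only the first 41 bits.
    def unpack41(six_bytes):
        value = 0
        for i, b in enumerate(six_bytes):
            value |= b << (8 * i)
        return value & 0x1FFFFFFFFFF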
GENERATING AN ARRAY OF 41-bit VALUES
The next issue is that you need an array of 41-bit numbers. You simply
define this as a 2-dimensional array of binary values. When
initializing this array, be certain you order the dimension indices
such that the fastest-moving index has a size of 41. The
slowest-moving index will then be the actual number of 41-bit values
in your array. If you transpose the index definition, nothing will
work right! Be careful!
When your array (above) is defined, it should be shown in LabView as a
brown wire, which depicts a binary 2-dimensional array. If it is
instead shown in a magenta color, then you instead defined an array
(cluster) of 41-bit binary numbers. That will work okay too (and will
even be easier to use), but it will take more storage and will take
LabView longer to handle. However, if you're more comfortable with the
latter choice, I can understand that. It does have the feature of
being able to address each 41-bit number directly. With the 2-dim
approach I initially outlined, you'll need to slice out each 41-bit
number from the 2-dim array each time you access (read/write) it.
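In Python terms (again just a sketch, not LabView; on the diagram the
slicing would be done with functions like Index Array and Replace
Array Subset), the 2-dim layout looks like this:

    # N numbers, each stored as 41 bits: fast index = bit, slow index = number.
    N = 100
    codes = [[0] * 41 for _ in range(N)]

    def write_code(i, value):          # store one 41-bit number in row i
        codes[i] = [(value >> b) & 1 for b in range(41)]

    def read_code(i):                  # slice row i back out as a number
        return sum(bit << b for b, bit in enumerate(codes[i]))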
FORMATTING BINARY INTO HEX
Formatting your stored numbers is an entirely _separate_ issue, and
your 41-bit numbers only need to worry about formatting when you
send them to an output device. For a LabView control panel, simply
open a binary array control, then select the appropriate options so
they are displayed in base-2, octal, or hex. LabView supports all
three display formats with any binary indicator.
If you need to format your numbers to a hex string (for an output
device), try the "To Hexadecimal" or "Format & Append" function.
Again, if you're using the 2-dim binary array approach, you'll need to
slice out each 41-bit number individually. If you're using the cluster
(array) of 1-dim 41-bit arrays approach, you can skip that step.
Frankly, I'm not sure how well LabView's formatting functions work with
1-dim binary arrays, but in principle they should. If they don't work
directly, no big deal. Just cast the 41-bit arrays into U8 numbers
(bytes) and have the formatting function format those numbers into hex
strings; it's simply one more step.
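For what it's worth, the formatting itself is trivial once the number
is in hand; here is a Python sketch of both routes (the whole value at
once, or byte-by-byte after the U8 cast):

    # Route 1: format the whole 41-bit value; 11 hex digits cover 41 bits.
    def to_hex41(value):
        return "%011X" % (value & 0x1FFFFFFFFFF)

    # Route 2: cast to bytes first, then format each U8 into two hex
    # digits (this prints all 48 stored bits, padding included).
    def bytes_to_hex(six_bytes):
        return "".join("%02X" % b for b in reversed(six_bytes))

    print(to_hex41(0x1FFFFFFFFFF))     # -> 1FFFFFFFFFF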
One question: Why on earth do you want to format the data going
into your instrument in the first place? Almost all instruments will
take binary input _directly_ without formatting it into ASCII (octal,
hex, etc.) first. This is much faster, because ASCII-encoding all this
hex stuff requires many more bytes to transfer to your instrument.
Moreover, the instrument then needs to convert the ASCII (hex) back
into binary before it can use any of it. You should investigate
skipping the formatting step altogether. The exception might be
if you're using an RS-232 connection to your instrument where raw
binary isn't allowed. Newer interfaces like GPIB, 1394, etc. will
allow binary transfer directly.
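The byte counts make the point; a quick Python check (purely
illustrative):

    value = 0x1FFFFFFFFFF
    raw_binary = value.to_bytes(6, "big")   # 6 bytes on the wire
    hex_ascii  = "%011X" % value            # 11 ASCII bytes, nearly double,
                                            # which the instrument must then
                                            # parse back into binary anyway
    print(len(raw_binary), len(hex_ascii))  # -> 6 11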
COMMENTS ON STRINGS
Don't use strings in LabView. By default, each string gets a minimal
memory allocation of 2K. If you define a 100-element array of strings,
that's 200K of memory just to hold those strings. Not a big deal, but
when the memory manager starts doing garbage collection (which is
required with string manipulations), things will really slow down.
Strings are important, but data should always be stored in a binary
form within any computer, not in an ASCII-formatted form (hex included).
--
/\ Mark M Mehl, alias Superticker (Supertickler to some)
<><> Internet: mehl@IAstate.edu
\/ Preferred UUCP: uunet!iastate.edu!mehl
Disclaimer: You got to be kidding; who would want to claim anything I said?