Control values converted to bit codes

Solved!

Dear LabVIEW Forum,

 

I have a programming challenge that I am struggling with. I currently have a VI that generates 1403568 flavors of waveform based on five numeric, string, and boolean input controls. I must represent each of these combinations as a 30-bit word and strobe it to an interfacing system at the same time as the waveform is generated. To operate this VI, the user selects the control values commensurate with the signal to generate and hits go. I am aware that each signal combination could be represented by a single integer, and that integer values can easily be converted into bit codes. The challenge is getting from the user input values to a single integer. I could generate a 1403568x5 array containing every unique waveform combination, with each column equivalent to a control input. I could then take the user inputs and query against the array until I have a unique row. That row position (an integer) would represent a unique waveform and could easily be converted to a bit code.
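[For readers outside LabVIEW, the lookup-table approach described in this post can be sketched in Python. The control names and domains below are hypothetical stand-ins for the five real controls; the point is that the table must be searched linearly for every waveform, which is where the performance concern comes from.]

```python
from itertools import product

# Hypothetical control domains (the real VI's five controls yield
# 1403568 combinations; this scaled-down example yields 4*10*2 = 80).
shapes = ["sin", "triangle", "toggle", "random"]
levels = range(10)
enabled = [False, True]

# Build every combination once -- the "1403568x5 array" idea.
table = list(product(shapes, levels, enabled))

# Each user selection then requires a search over the table:
# O(rows) work per waveform just to find the row index.
selection = ("toggle", 7, True)
code = table.index(selection)   # row position = the unique integer
```

Note that `table.index` scans from the front, so the cost grows with the table size; the replies below avoid the table entirely.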

 

This, however, seems way too complicated and may have performance issues. Does anyone out there have a better suggestion? Surely I'm not the first person to translate complex conditions into bit codes!

 

Thanks,

Zach
Message 1 of 11
Can you give a better picture of what the control values are that create your 1403568 combinations?  Something like a quasi-truth table.  Specifically what are your controls and their possible values?
Message 2 of 11
Solution
Accepted by topic author super-neuron

You are correct: a 1403568x5 array of waveforms is going to have devastating effects on your performance, especially when you search every element each time you generate a waveform.  Here's a simpler way:

 

Take your inputs and convert each of them to an integer.  No matter what type of input it is, you can create a unique number for each of them.  If you have 100 different valid string inputs, it is a bit more difficult to convert than a boolean or a numeric.  However, since you are generating a waveform based on this input, you must already have some way of using it to get a unique output.  A simple example: if your string selections can be sin, triangle, toggle, or random, assign them values of 0, 1, 2, and 3, respectively.  Since you only have a 30-bit word to represent them, use the smallest number of bits that covers each numeric.  In this case, you only need 2 bits to represent your 4 selections.  So convert the numeric to an array of booleans and keep only the bits that represent your data.  Repeat this for each input, concatenate the arrays, and convert the combined boolean array to form your word.  I've attached a picture of a simple implementation.  There may be better ways to do this, but this is the first that came to my mind.
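[The bit-packing idea above can be written out in Python as a minimal sketch. The control names, bit widths, and field layout here are hypothetical; the real solution is built from LabVIEW primitives as shown in the attached picture.]

```python
# Sketch of "convert each control to an integer, then pack it into
# the fewest bits that cover its range". Field layout is invented
# for illustration: shape in bits 0-1, gain in bits 2-9, flag in bit 10.
WAVE_SHAPES = {"sin": 0, "triangle": 1, "toggle": 2, "random": 3}

def pack_word(shape: str, gain: int, enabled: bool) -> int:
    """Pack three controls into one integer word."""
    word = WAVE_SHAPES[shape]       # 4 values  -> 2 bits (bits 0-1)
    word |= (gain & 0xFF) << 2      # 0..255    -> 8 bits (bits 2-9)
    word |= int(enabled) << 10      # boolean   -> 1 bit  (bit 10)
    return word                     # fits easily inside a 30-bit word
```

Unpacking is the mirror image: mask and shift each field back out. Because every field occupies its own bit range, each combination of control values maps to a distinct integer with no table lookup at all.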

Message Edited by Coal Man on 12-10-2008 06:07 PM
Brian Coalson

Software Engineer
National Instruments
Message 3 of 11

Brian is going where I thought I would end up going.  The only adjustment I would make: since your strings have a fixed set of allowable values, change them to rings or enumerated data types.  That way they already have a value associated with them, and you don't have to build a complicated case structure.

 

Basically, you create your own number as Brian has done and each bit has some significance.
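[In text-language terms, the ring/enum suggestion amounts to assigning each allowable selection a fixed integer up front, so the selection itself already carries its bit code. A hypothetical Python equivalent, with an invented field layout:]

```python
from enum import IntEnum

class Shape(IntEnum):
    """Plays the role of a LabVIEW ring/enum control:
    the selected item's value *is* the number, no case structure needed."""
    SIN = 0
    TRIANGLE = 1
    TOGGLE = 2
    RANDOM = 3

# The selection drops straight into its bit field, e.g. shape in
# bits 3-4 and three boolean flags in bits 0-2 (layout is illustrative).
word = (Shape.TOGGLE << 3) | 0b101
```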

Message 4 of 11

Absolutely brilliant guys!   This is conceptually easy to explain to my users, and so far is proving easy to implement.   My largest numeric value only requires 28 of the 30 bits available.

 

I cannot thank you enough!

 

Sincerely,

Zach

 

 

Message 5 of 11

Dear LabVIEW forum,

 

Since this thread is similar to mine, I thought I'd post here. I am fairly new to LabVIEW programming, coming mostly from assembly/C/C++ driver-related work. Now I find myself assigned a completely different task. I must admit I find the switch quite challenging, as little of what I've learned over the many years applies to or works with LabVIEW. It is a totally different way of thinking. I can't say it is better or worse, just that it is quite different.

 

 

I have a bunch of check buttons and combo boxes that correspond to many options that translate into a 32-bit bitfield value. The attached .vi is simply supposed to build the actual command parameter sent to some method.

 

I found my solution to be large, complicated, and highly error-prone given the simplicity of the problem.

 

Could anyone come up with something simpler?

 

Also, in my TTLAPI_OPENCONNECTIONS_CMD_BITS.ctl file is an enum that corresponds to ONE of the many enums I need to translate to LabVIEW. I've got a bunch of them, 15-20 or so. Do I need up to 20 files for all of them?

Message 6 of 11

You probably should have started a new thread, but I have attached a modification of your VI.  I had to break it from your lvlib since you didn't include it.

 

Instead of using individual booleans, I replaced them with a cluster of booleans.  On the diagram, this becomes one variable instead of x variables.  I then used the Cluster To Array primitive: as long as all the controls in your cluster are of the same type, you can convert it to an array.  Then you can use Boolean Array To Number to convert the array into a U32 corresponding to the binary pattern in the array.
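[The cluster-to-array-to-U32 chain can be sketched in Python. One assumption worth flagging: in LabVIEW's Boolean Array To Number, element 0 of the array becomes the least-significant bit, which is what the loop below mimics.]

```python
# Sketch of "cluster of booleans -> boolean array -> U32".
# The array index (i.e. the cluster order) decides each bit position,
# with element 0 as the least-significant bit.
def bool_array_to_u32(bits: list) -> int:
    word = 0
    for position, bit in enumerate(bits):   # position = cluster order
        if bit:
            word |= 1 << position
    return word
```

This is why the cluster order matters so much: reordering the controls in the cluster reorders the array, which moves the bits in the resulting word.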

 

For the cluster, you need to set the proper order for the array to be formed correctly. If you right-click on the frame of the cluster, you can select Set order of controls in cluster.  You will see I ordered the booleans in the order of your bit pattern for your ring values.

Message 7 of 11

Thanks, Matthew K, for such a quick response!

 

I was aware of clusters, BUT how do you assign bit positions? What if this is recompiled on a PowerPC machine (older Macs)?

 

How do you set an option to be the 5th or 6th bit? What governs which control on the panel represents which bit of the 'bitfield'? What if I'd like the nicest presentation possible, grouping things together, but the bits are not consecutive?

 

 

 

 

Message 8 of 11
Pay attention to the last paragraph of my previous response.  My image capture program doesn't seem to want to capture pop-up menus, so I can only describe it.  On the frame of the cluster, right-click and choose "Reorder controls in cluster".  This will allow you to set the numerical order of the booleans in your cluster.  You will actually see that your booleans are out of display order already.  It doesn't matter where they are, you basically click on each control and can change the order of the bits.
Message 9 of 11

How could you add another 'member' of 4 bits for the 'forced' protocols?

 

These would represent the topmost nibble of the 32-bit value (thus my combo_value * (1<<28)).
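[Placing a 4-bit field in the top nibble is a mask-and-shift, sketched here in Python. The function name is hypothetical; it just shows the arithmetic the post describes, with the protocol value landing in bits 28-31.]

```python
# Put a 4-bit 'forced protocol' code into the top nibble (bits 28-31)
# of a 32-bit word, leaving the lower 28 bits untouched.
def with_protocol(word: int, protocol: int) -> int:
    assert 0 <= protocol < 16                 # must fit in 4 bits
    return (word & 0x0FFFFFFF) | (protocol << 28)
```

In cluster terms, the same effect falls out of the boolean-array approach: append four more booleans at cluster positions 28-31 and Boolean Array To Number places them in the top nibble automatically.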

Message 10 of 11