# LabVIEW


## problem facing on conversion of decimal number to binary number

I tried the circuit shown in the first figure below, using a For Loop. It works well until the input is 511, as shown in the second figure, but when the number is increased above that (or the iteration count is increased) the result is incorrect, as shown in the third figure.

1. [Figure: block diagram of the For Loop conversion]

2. [Figure: front panel, correct output for inputs up to 511]

3. [Figure: front panel, incorrect output for larger inputs]
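The screenshots can't be run, but the digit-accumulation idea they show can be sketched in Python (the names here are my own, not taken from the VI): each iteration takes the lowest bit with a modulo, adds it in at the next decimal place, and halves the input.

```python
def decimal_lookalike(n):
    """Build a decimal integer whose digits match the binary
    representation of n (e.g. 5 -> 101)."""
    result = 0
    place = 1  # current decimal place value: 1, 10, 100, ...
    while n > 0:
        result += (n % 2) * place  # append the lowest bit
        place *= 10
        n //= 2
    return result

print(decimal_lookalike(5))    # 101
print(decimal_lookalike(511))  # 111111111
```

Python integers never overflow, so this runs for any input; a fixed-size I32 result in LabVIEW does not have that luxury, which is where the large inputs go wrong.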

Message 1 of 11

## Re: problem facing on conversion of decimal number to binary number

• It is not a "circuit", but a picture of a block diagram. We cannot run pictures, so attach the VI instead.
• There should not be any orange (floating-point) wires anywhere.
• You need a While Loop, because you don't know the number of iterations needed.
• You are not converting to binary. You still have a decimal number, but are only using the digits 0 and 1. That's not the same value as the input.
• You should clean up your code instead of having wires going in all directions.
• Whatever you are trying to do will fit on a quarter of a postage stamp. It can be done in one step: if you set the integer indicator to binary display format, all you need is a wire.
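The "all you need is a wire" point has a direct text analogue: binary is only a display radix, not a different value. In Python the same idea is a format specifier (a hypothetical snippet of mine, not from the thread):

```python
n = 511
print(format(n, 'b'))  # '111111111' -- same value, shown in base 2
print(f"{n:b}")        # equivalent f-string form
```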

Message 2 of 11

## Re: problem facing on conversion of decimal number to binary number

@altenbach wrote:
• You are not converting to binary. You still have a decimal number, but are only using the digits 0 and 1. That's not the same value as the input.

As a simple programming exercise, here's something similar that takes an integer and converts it to a decimal number that looks the same as the binary representation of the input. Seems silly!

There are many other ways to do that, though...
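One of those other ways, sketched in Python rather than G: format the integer in base 2, then read the digit string back as a decimal number.

```python
def binary_as_decimal(n):
    # '101' (the binary text of 5) read back as the decimal number 101
    return int(format(n, 'b'))

print(binary_as_decimal(5))   # 101
print(binary_as_decimal(10))  # 1010
```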

Message 3 of 11

## Re: problem facing on conversion of decimal number to binary number

I attached my VI below. Why should there not be any orange anywhere?

Message 4 of 11

## Re: problem facing on conversion of decimal number to binary number

Tried the solution in LabVIEW 19.0, but the output is not as expected.

Message 5 of 11

## Re: problem facing on conversion of decimal number to binary number

Just another solution:
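The posted VI is an attachment and not visible here, so in the same spirit here is another text-mode variant (my own Python, not the posted solution): build the bit string with divmod instead of a format specifier.

```python
def bits_via_divmod(n):
    """Return the binary digits of n as a string of 0s and 1s."""
    if n == 0:
        return '0'
    digits = []
    while n:
        n, bit = divmod(n, 2)   # peel off the lowest bit
        digits.append(str(bit))
    return ''.join(reversed(digits))  # bits were collected LSB-first

print(bits_via_divmod(511))  # '111111111'
```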

Message 6 of 11

## Re: problem facing on conversion of decimal number to binary number

@sasimitha wrote:

... Why should there not be any orange anywhere?

Because you are dealing exclusively with integers; there are no fractional bits. Your lower shift register is I32, and after multiplying by 10 N times you are hitting the ceiling: the I32 overflows.
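That ceiling can be made concrete. An I32 tops out at 2^31 - 1 = 2147483647. Python integers don't wrap, but 32-bit wraparound can be simulated with a mask (`wrap_i32` is my own helper, not LabVIEW code):

```python
def wrap_i32(x):
    """Interpret x modulo 2^32 as a signed 32-bit integer,
    mimicking an I32 shift register."""
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

place = 1
for i in range(1, 12):
    place = wrap_i32(place * 10)
    print(i, place)  # the place value goes wrong once 10**i exceeds 2**31 - 1
```

10^9 still fits, but 10^10 wraps to 1410065408, and every digit position after that is garbage.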

Message 7 of 11

## Re: problem facing on conversion of decimal number to binary number

@sasimitha wrote:

Tried the solution in LabVIEW 19.0, but the output is not as expected.

(Also note that I was using U64, not I64, but that should not make a difference. What is the datatype of the diagram constants?)

Message 8 of 11

## Re: problem facing on conversion of decimal number to binary number

Note that even my solution will not work if the input is large, because a U64 can only hold about 20 decimal digits, while the input can have up to 64 bits.

As long as you stay within these limits, even this will work:
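A quick boundary check of that limit (a Python sketch; `lookalike` and `U64_MAX` are my own names): the decimal lookalike of an n-bit input needs n decimal digits, and a U64 holds at most 20, so inputs stop fitting at 2^20.

```python
U64_MAX = 2**64 - 1  # 18446744073709551615, a 20-digit number

def lookalike(n):
    return int(format(n, 'b'))  # binary digits read as a decimal number

# 2**20 - 1 has 20 bits -> a 20-digit lookalike, which still fits a U64
assert lookalike(2**20 - 1) <= U64_MAX
# 2**20 has 21 bits -> a 21-digit lookalike, which overflows a U64
assert lookalike(2**20) > U64_MAX
```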

Message 9 of 11

## Re: problem facing on conversion of decimal number to binary number

Do you want it as a number or as a string of 0s and 1s? If a string, you can do it like this:
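The attached VI isn't visible here; as a text analogue of the string route, Python's format spec can zero-pad to a fixed width (the width of 16 and the helper name are my choices):

```python
def to_bit_string(n, width=16):
    """Format n as a string of 0s and 1s, zero-padded to width."""
    return format(n, f'0{width}b')

print(to_bit_string(511))   # '0000000111111111'
print(to_bit_string(5, 8))  # '00000101'
```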

Message 10 of 11