
LabVIEW


Intel 8-bit Checksum

Solved!

 

Hi all, 

Can someone please explain what I'm doing wrong? I can't get the correct 8-bit checksum below.

 

I'm expecting the answer to be the same as this calculator's:

http://www.planetimming.com/checksum8.html
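For reference, the checksum that calculator appears to compute is the common 8-bit two's-complement checksum: sum all bytes modulo 256, then negate. A minimal Python sketch of that logic (an assumption based on the thread; the function name is illustrative):

```python
def checksum8(data: bytes) -> int:
    """8-bit two's-complement checksum: the byte that makes the total sum 0 mod 256."""
    total = sum(data) % 256      # 8-bit sum, discarding carries
    return (-total) % 256        # two's complement of the sum

# Example: bytes 0x01 and 0x02 sum to 0x03, so the checksum is 0xFD.
print(hex(checksum8(bytes([0x01, 0x02]))))  # -> 0xfd
```

Appending the checksum byte to the data makes the whole message sum to zero modulo 256, which is how a receiver verifies it.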

 

Thanks in advance

 

 CS.jpg

0 Kudos
Message 1 of 10
(6,032 Views)

Try this KB for calculating the two's complement. I just ran it and it got the same values as the website you posted when I entered two bytes.

dK
Message 2 of 10
(5,993 Views)

Thanks for the reply, Daniel. I made a few changes to meet my requirement, as per the definition of the checksum, but the answers are not consistent. Please see below.

 

2.jpg

0 Kudos
Message 3 of 10
(5,982 Views)

I think I know the problem although I don't know the solution yet. 

 

LabVIEW takes my input as ASCII (of course), whereas I should be feeding it in hex.

 

 

 

0 Kudos
Message 4 of 10
(5,980 Views)
Solution
Accepted by topic author id911

id911 wrote:

LabVIEW takes my input as ASCII (of course), whereas I should be feeding it in hex.


So you have a display issue.  Right-click on the input string and you can choose "Hex Display".  If you do this, I also recommend turning on Visible Items->Display Style.  You will probably want to do the same with the indicator.
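The distinction behind this answer is what bytes actually enter the string, not how they are shown. Typing "1A" into a normal-display string control gives the two ASCII characters '1' and 'A', while entering 1A in hex display gives the single byte 0x1A. A Python sketch of the difference (variable names are illustrative):

```python
# "1A" typed as text: two ASCII characters, byte values 0x31 and 0x41.
ascii_entry = "1A".encode("ascii")
print(list(ascii_entry))   # -> [49, 65]

# 1A entered as hex: one raw byte with value 0x1A (decimal 26).
hex_entry = bytes.fromhex("1A")
print(list(hex_entry))     # -> [26]
```

Summing the first gives 49 + 65 = 114, while summing the second gives 26, so the two entries produce entirely different checksums.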


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 5 of 10
(5,959 Views)

Problem solved. Thank you very much !

0 Kudos
Message 6 of 10
(5,947 Views)

Hi again guys, 

 

If I still want ASCII input, rather than changing to hex display mode on the front panel, what am I doing wrong? The VI is attached too.

Thanks in advance.

 

Capture.JPG

 

 

 

0 Kudos
Message 7 of 10
(5,908 Views)

id911 wrote:

Hi again guys, 

 

If I still want ASCII input, rather than changing to hex display mode on the front panel, what am I doing wrong? The VI is attached too.


First of all, your 255 (or FF) diagram constant (first post) should probably have U8 representation. Notice the red coercion dot?

 

Second, if you cast your string to an I32, the string must be exactly 4 bytes long (4 characters). Is it always? That does not make a lot of sense.

 

The display format of a string is just cosmetic. It does not change the bits.
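The point about fixed-width casts can be sketched as follows: a cast to an I32 only works for strings of exactly 4 bytes, whereas converting the string to a byte array (LabVIEW's String To Byte Array) handles any length. A Python illustration under that assumption:

```python
import struct

data = b"\x01\x02\x03"            # 3 bytes: cannot be unpacked as a 4-byte I32
# struct.unpack(">i", data)       # would raise struct.error: wrong length

# A byte-array view works for any length:
byte_values = list(data)
print(byte_values)                # -> [1, 2, 3]
print(hex((-sum(byte_values)) % 256))  # 8-bit two's-complement checksum -> 0xfa
```

Summing the individual bytes sidesteps the length restriction entirely, which is why the byte-array approach is the usual one for checksums.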

 

 

Message 8 of 10
(5,906 Views)

Sorry, so what is your suggestion?

0 Kudos
Message 9 of 10
(5,901 Views)

I found the answer in one of your 2005 posts, @altenbach.

Thanks champ !

Here is the link in case anyone else is looking for it.

 

https://forums.ni.com/t5/LabVIEW/Convert-hex-string-to-ascii/td-p/190846
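A sketch of what that linked thread describes: converting a human-readable hex string (e.g. "0102FF") into the raw bytes it represents before summing, instead of summing the ASCII codes of the characters. In Python this is a one-liner (the LabVIEW version uses a scan/conversion loop instead):

```python
hex_text = "0102FF"
raw = bytes.fromhex(hex_text)     # b'\x01\x02\xff' -- the actual data bytes
print(hex((-sum(raw)) % 256))     # checksum over the raw bytes -> 0xfe
```

Summing the ASCII codes of "0102FF" instead would give a completely different (and wrong) result, which is exactly the mix-up discussed earlier in this thread.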

0 Kudos
Message 10 of 10
(5,893 Views)