02-07-2019 04:34 AM
Hello,
I need to send a 32-bit data packet to a slave device that requires an 8-bit CRC based on the polynomial X^8+X^4+X^3+X^2+1 with init value 0xFF. I am trying to use the VI made by Matthew Kelton at https://forums.ni.com/t5/LabVIEW/Computing-CRC/m-p/825331/highlight/true#M375052, but I don't get the expected result. For data 0xBF7F1234 with init value 0xFF I should get an 8-bit CRC of 0x01, but I am getting 0x9B.
In binary, the polynomial is 100011101 (0x1D).
Any idea how I can implement this using Matthew Kelton's VI?
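For reference, the textbook form of this CRC can be sketched in a few lines of Python: MSB-first, bit-by-bit, polynomial 0x1D, no input/output reflection and no final XOR. Real devices differ on exactly those conventions (reflection, final XOR, byte order, extra framing bytes), which is the usual cause of mismatches like 0x9B vs. 0x01 — so this is a sketch of one common convention, not necessarily the device's:

```python
def crc8(data, poly=0x1D, init=0xFF):
    """Bitwise CRC-8: MSB-first, no reflection, no final XOR."""
    crc = init
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 0x80:
                crc = ((crc << 1) ^ poly) & 0xFF
            else:
                crc = (crc << 1) & 0xFF
    return crc

# 0xBF7F1234 split big-endian into bytes (byte order is an assumption)
print(hex(crc8([0xBF, 0x7F, 0x12, 0x34])))  # → 0x48
```

Note that this plain variant yields 0x48 over the four data bytes — neither 0x01 nor 0x9B — which suggests the device applies some additional step on top of the bare polynomial division.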
Thanks in advance
02-07-2019 06:22 AM
There are MANY variations of CRCs. Do you have example code from the device manual, or anything else showing how the CRC is generated?
02-08-2019 02:40 AM
Hi,
After checking the DUT datasheet, I realized that they use an initial byte (0xFF or 0x00) that indicates whether it is the first or second frame from the µC. This byte is prepended to the data bytes sent to the DUT, so Matthew Kelton's CRC VI works when setting "Initial CRC Value" = 0, "CRC Polynomial" = 0x1D, "CRC Order" = 8.
My program works now.
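For anyone landing here later, the fix can be checked in text form: an MSB-first CRC-8 with polynomial 0x1D and init 0, run over the frame byte 0xFF followed by the four data bytes, does reproduce the expected 0x01. A minimal Python sketch (splitting 0xBF7F1234 big-endian is an assumption from the post):

```python
def crc8(data, poly=0x1D, init=0x00):
    """Bitwise CRC-8: MSB-first, no reflection, no final XOR."""
    crc = init
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 0x80:
                crc = ((crc << 1) ^ poly) & 0xFF
            else:
                crc = (crc << 1) & 0xFF
    return crc

# Frame byte 0xFF prepended to the data bytes of 0xBF7F1234
print(hex(crc8([0xFF, 0xBF, 0x7F, 0x12, 0x34])))  # → 0x1
```

Prepending the frame byte with init 0 is not the same as setting init 0xFF over the data alone: the init value is only XORed into the first byte, while a prepended byte goes through eight full shift/XOR rounds of its own, which is why the original init-0xFF attempt gave a different result.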
Thanks,