11-22-2013 12:02 AM
I am trying to implement a Manchester decoder in LabVIEW based on the algorithm described here:
http://www.cypress.com/?rID=55345
I am having difficulty implementing the 3/4-bit-time delay.
I am fairly new to LabVIEW programming, so I would greatly appreciate any suggestions.
11-22-2013 04:42 AM - edited 11-22-2013 04:42 AM
Hi kulsman,
What have you programmed so far?
How are you reading the digital values in? Do you use external triggering or do you read the DI with a fixed sample rate?
Are you trying to detect edges in the signal?
Btw. you can also decode the bits by detecting any edge (rising/falling) in the signal and using the negated signal value just after the detected edge. No need to wait for a "3/4 bit delay"...
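In text form (LabVIEW is graphical, so a Python sketch has to stand in for the block diagram), that edge-based idea might look like the following. Everything here is illustrative: the names samples and samples_per_bit are assumptions, the stream is assumed to be a clean 0/1 list, and the first edge is assumed to be a mid-bit edge (e.g. guaranteed by a preamble):

    def manchester_decode(samples, samples_per_bit):
        """Decode Manchester data from an oversampled 0/1 sample list.

        G.E. Thomas convention: a '1' is a high-to-low mid-bit
        transition, so each bit is the negated level just after its
        mid-bit edge. An edge arriving less than 3/4 of a bit period
        after the previous mid-bit edge is a bit-boundary edge and
        carries no data.
        """
        # Index of the first sample after each level change.
        edges = [i for i in range(1, len(samples)) if samples[i] != samples[i - 1]]
        bits = []
        last_mid = None  # index of the previous mid-bit edge
        for e in edges:
            if last_mid is not None and (e - last_mid) < 0.75 * samples_per_bit:
                continue  # boundary edge between two equal bits -> skip
            bits.append(1 - samples[e])  # negated value just after the edge
            last_mid = e
        return bits

For example, the bits 1,0,1 encode (Thomas convention, 4 samples per bit) as 1,1,0,0, 0,0,1,1, 1,1,0,0 and decode back to [1, 0, 1].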
11-22-2013 10:15 AM
Hello GerdW,
This is my basic setup: [photodiode -> amplifier -> cRIO]
I was thinking of connecting an analog input module to the cRIO.
As for the code, I just started last night. I created a while loop so that it runs continuously, and I have started to look into DAQ.
How do you detect edges (rising/falling) in the signal? Is this part of a special LabVIEW module?
Thanks,
Rahul Madhu
11-23-2013 03:09 AM - edited 11-23-2013 03:09 AM
Hi Rahul,
- cRIOs don't use DAQ(mx); they use the Scan Engine or the FPGA...
- Edge detection is as easy as it gets: compare each sample with the previous one (a shift register or feedback node holds the last value), as sketched below...
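As a rough text-form illustration of that compare-with-previous pattern (Python standing in for the LabVIEW shift register / feedback node; the names are made up):

    def edge_detector(samples):
        """Yield (index, 'rising' or 'falling') for every level change.

        This is the same compare-with-the-previous-sample pattern a
        LabVIEW shift register or feedback node implements.
        """
        prev = samples[0]
        for i, s in enumerate(samples[1:], start=1):
            if s != prev:
                yield i, "rising" if s > prev else "falling"
            prev = s

So list(edge_detector([0, 0, 1, 1, 0])) gives [(2, 'rising'), (4, 'falling')].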
11-23-2013 10:47 AM
What I have done in a similar situation is oversample the waveform. So if the data is supposed to come in at 1 MHz, I would sample at something more like 4 MHz or 8 MHz. Since you are looking for a 3/4 bit time, the sample rate should be a multiple of 4x the bit rate, so that 3/4 of a bit period is a whole number of samples. Then you can easily figure out where the transitions were.
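As a hedged Python sketch of that oversampling approach (assumed names; an ideal 0/1 sample stream; the first detected edge taken to be a mid-bit edge, e.g. guaranteed by a preamble):

    def decode_oversampled(samples, samples_per_bit):
        """3/4-bit-time Manchester decoder over an oversampled stream.

        samples_per_bit should be a multiple of 4 so that 3/4 of a bit
        period is a whole number of samples (hence sampling at 4x or
        8x the bit rate). Each bit comes from the direction of its
        mid-bit edge (falling = 1, rising = 0 in the G.E. Thomas
        convention); after decoding a bit we skip 3/4 of a bit period,
        which jumps over any boundary edge between two equal bits.
        """
        hold_off = (3 * samples_per_bit) // 4
        bits = []
        i = 1
        while i < len(samples):
            if samples[i] != samples[i - 1]:       # mid-bit edge found
                bits.append(1 if samples[i] < samples[i - 1] else 0)
                i += hold_off                      # 3/4-bit hold-off
            else:
                i += 1
        return bits

The hold-off is the "3/4 bit-time delay" from the original question: a boundary edge can only arrive half a bit period after a mid-bit edge, so waiting 3/4 of a bit period guarantees the next edge you see is again a mid-bit edge.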