I've created two FM demodulation algorithms in LabVIEW (I have LabVIEW 7.1), and now I want to test both methods.
I scaled down the parameters of a real FM signal proportionally: carrier frequency, message frequency, and frequency deviation.
I generate an FM signal with these parameters:
Sampling frequency: 5 MHz
Carrier frequency: 1 MHz
Message frequency: 2 kHz
Frequency deviation: 7.5 kHz
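For reference, a signal with these parameters can be sketched in a few lines of NumPy (the variable names and the 10 ms duration are my own assumptions, not taken from the original VI):

```python
import numpy as np

# Parameters from the post (names are illustrative)
fs = 5e6         # sampling frequency: 5 MHz
fc = 1e6         # carrier frequency: 1 MHz
fm = 2e3         # message frequency: 2 kHz
df = 7.5e3       # frequency deviation: 7.5 kHz
duration = 0.01  # 10 ms of signal -> 50,000 samples (assumed)

t = np.arange(int(fs * duration)) / fs
beta = df / fm   # modulation index for a single-tone sine message
# Single-tone FM: cos(2*pi*fc*t + beta*sin(2*pi*fm*t))
signal = np.cos(2 * np.pi * fc * t + beta * np.sin(2 * np.pi * fm * t))
```

Even 10 ms at this rate is 50,000 points per block, which gives a feel for why the VI struggles on older hardware.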
The problem is that my PC can't handle this amount of data: the VI runs very slowly and sometimes appears to be stuck.
My PC is:
1.6 GHz AMD (single core)
512 MB DDR2 RAM, 333 MHz bus
The general aim is to process something close to a real broadcast FM radio signal (carrier around 100 MHz, message up to 20 kHz, deviation 75 kHz).
I understand that a signal needs roughly 5 samples per carrier period, but sampling a 100 MHz signal at that rate would require unrealistic performance.
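To put numbers on "unreal performance": with the 5-samples-per-period rule of thumb, directly sampling a 100 MHz carrier needs a 500 MS/s stream, and at double precision (8 bytes per sample, LabVIEW's default DBL) that is about 4 GB of raw data per second. A quick sanity check:

```python
carrier = 100e6             # broadcast FM carrier: 100 MHz
samples_per_period = 5      # rule of thumb from the post
fs_needed = carrier * samples_per_period   # required sample rate: 500 MS/s
bytes_per_sample = 8        # LabVIEW DBL is 8 bytes
data_rate_mb_s = fs_needed * bytes_per_sample / 1e6  # megabytes per second
```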
How should I deal with this problem? Should I downconvert my signal, or use some other technique?
All help and suggestions are highly appreciated.
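Downconversion is the standard approach for exactly this problem: mix the signal to baseband with a complex exponential, low-pass filter, and decimate, so the demodulator only has to run at a rate matched to the message bandwidth rather than the carrier. A minimal sketch of the idea, applied to the test parameters above (the 50-tap moving-average filter and all names are my own simplifications; a real design would use a proper FIR low-pass before decimating):

```python
import numpy as np

# Regenerate the test signal from the post's parameters
fs, fc, fm, df = 5e6, 1e6, 2e3, 7.5e3
n = 50000
t = np.arange(n) / fs
x = np.cos(2 * np.pi * fc * t + (df / fm) * np.sin(2 * np.pi * fm * t))

# 1) Mix to baseband: shifts the 1 MHz carrier down to 0 Hz
bb = x * np.exp(-2j * np.pi * fc * t)

# 2) Crude low-pass (50-tap moving average), then decimate by 50
#    -> output rate 100 kS/s, plenty for a 2 kHz message with 7.5 kHz deviation
D = 50
lp = np.convolve(bb, np.ones(D) / D, mode="same")
dec = lp[::D]

# 3) FM demodulate at the LOW rate: instantaneous frequency from the
#    phase difference between successive complex samples
dphi = np.angle(dec[1:] * np.conj(dec[:-1]))
inst_freq = dphi * (fs / D) / (2 * np.pi)   # Hz; swings roughly +/- 7.5 kHz
```

The demodulator now touches 1,000 samples per block instead of 50,000. For a real 100 MHz broadcast signal, the first downconversion stage would normally be done in analog hardware (a tuner front end), with LabVIEW only processing the much slower baseband stream.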
Can you post your code? You may have multiple loops competing for CPU resources, etc.
Seeing your code will give us a better idea of what could be contributing to this behavior.