
PC lacks resources to process 1 MHz signal. What to do?

Solved!

 

Hello!

 

I've created two FM demodulation algorithms in LabVIEW (I have LabVIEW 7.1).

 

Now I want to test both methods. 

 

I scaled the parameters in proportion to a real FM signal's parameters: carrier frequency, message frequency, and frequency deviation.

 

 

I run an FM signal with the following parameters:

Sampling frequency: 5 MHz
Carrier frequency: 1 MHz
Message frequency: 2 kHz
Frequency deviation: 7.5 kHz
Amplitude: 1

 

The problem is that my PC can't handle this amount of data.

I'm not sure exactly what's happening, but the VI runs very, very slowly and seems to be stuck.
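
In text form, the signal I'm generating is roughly equivalent to this single-tone FM sketch (not my actual LabVIEW code; 1 s of data assumed here just to show the size):

```python
import numpy as np

# Sketch of the FM test signal (single-tone message); not the LabVIEW VI itself.
fs = 5e6        # sampling frequency, 5 MHz
fc = 1e6        # carrier frequency, 1 MHz
fm = 2e3        # message frequency, 2 kHz
dev = 7.5e3     # frequency deviation, 7.5 kHz
duration = 1.0  # seconds (assumed)

t = np.arange(0, duration, 1 / fs)
# Single-tone FM: phase = 2*pi*fc*t + (dev/fm) * sin(2*pi*fm*t)
x = np.cos(2 * np.pi * fc * t + (dev / fm) * np.sin(2 * np.pi * fm * t))

# 1 s at 5 MS/s is 5 million samples; stored as 64-bit doubles that is ~40 MB
# for this one array, before any copies made downstream by the processing chain.
print(x.nbytes / 1e6, "MB")
```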

 

 

My PC is:

1.6 GHz AMD (single core)
512 MB DDR2 RAM, 333 MHz bus

 

The general aim is to be able to process something close to a real FM broadcast signal (carrier around 100 MHz, message up to 20 kHz, deviation 75 kHz).

 

 

 

I understand that there should be about 5 samples per carrier period, but sampling a 100 MHz signal at that rate requires unrealistic performance.
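
To put rough numbers on it (5 samples per carrier period and 64-bit doubles assumed):

```python
# Back-of-the-envelope cost of sampling a 100 MHz carrier directly.
carrier = 100e6                               # Hz
samples_per_period = 5
fs_needed = carrier * samples_per_period      # 500 MS/s
bytes_per_sample = 8                          # 64-bit double
one_second = fs_needed * bytes_per_sample     # bytes for 1 s of data

print(f"required sample rate: {fs_needed / 1e6:.0f} MS/s")   # 500 MS/s
print(f"memory for 1 s:       {one_second / 1e9:.1f} GB")    # 4.0 GB
```

That is far beyond what a 1.6 GHz single-core PC with 512 MB of RAM can stream or hold in memory.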

 

 

 

How do I deal with this problem? Should I downconvert my signal, or use some other technique?

 

All help and suggestions are highly appreciated.

 

Message 1 of 5

Can you post your code?  You may have multiple loops taking CPU resources, etc.

Seeing your code will give us a better idea of what could contribute to this behavior.

 

R

Message 2 of 5
Okay, the problem was that my signal duration was set to 1 s, which means the signal contains 1,000,000 (1 MHz) samples × 32 bits (single-precision data); when I set the duration to 0.001 s, everything works fine. But the question remains: should I perform downconversion of my input FM signal (i.e., shift the signal to baseband by multiplying by cos(wt)) before processing?
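
What I mean by downconversion is roughly this (a NumPy/SciPy sketch of complex mixing plus low-pass filtering and decimation; the cutoff and decimation factor are just assumed values, not taken from my VI):

```python
import numpy as np
from scipy import signal

fs = 5e6     # original sampling rate
fc = 1e6     # carrier to remove
fm = 2e3     # message frequency
dev = 7.5e3  # frequency deviation

t = np.arange(0, 0.001, 1 / fs)                                  # 1 ms of data
x = np.cos(2 * np.pi * fc * t + (dev / fm) * np.sin(2 * np.pi * fm * t))

# Mix to baseband. Using a complex exponential (cos - j*sin) instead of a bare
# cos(wt) keeps both I and Q, so the phase survives for FM demodulation.
bb = x * np.exp(-2j * np.pi * fc * t)

# Low-pass to roughly the signal bandwidth, then decimate: at baseband a
# sample rate of ~100 kS/s is plenty instead of the original 5 MS/s.
b, a = signal.butter(5, 50e3 / (fs / 2))       # 50 kHz cutoff (assumed)
bb = signal.lfilter(b, a, bb)
bb = bb[::50]                                   # decimate by 50 -> 100 kS/s
fs_bb = fs / 50
```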
Message Edited by ACiDuser on 06-04-2009 01:49 AM
Message 3 of 5
Are you still finding it running too slow?
Message 4 of 5
Solution
Accepted by topic author ACiDuser
Absolutely! Downconvert immediately when you acquire your waveform to reduce memory needs. Or better yet, downconvert in hardware prior to signal acquisition, and you should be able to handle the data in near real time.
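
Once the data is at complex baseband and decimated, the demodulation itself is cheap. A minimal phase-difference discriminator sketch (assuming a complex baseband array bb and its reduced sample rate fs_bb as in the earlier sketch; these names are placeholders, not from anyone's VI):

```python
import numpy as np

def fm_discriminator(bb, fs_bb):
    # Instantaneous frequency is the derivative of the phase over 2*pi.
    phase = np.unwrap(np.angle(bb))
    inst_freq = np.diff(phase) * fs_bb / (2.0 * np.pi)
    return inst_freq   # proportional to the message, scaled by the frequency deviation
```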

"Should be" isn't "Is" -Jay
Message 5 of 5