Dear NI users,
I'm somewhat lost over a weird result I get when changing the sampling rate of one of the acquisition boards we have at our lab.
Electrical configuration: there are two setups, one with a Multifunction I/O Device PCI-6361 and the other with an Oscilloscope Device PCI-5922. With both, I read the output of a waveform generator configured to output a white noise signal (1 V offset, 10 mW amplitude, 100 kHz bandwidth). The reading is done over BNC cables, sequentially: first one board, then the other. A 50 Ohm load is in parallel with the input of each board.
Acquisition: at different sampling rates, recording for 1 s, in the 2 V range.
Observables: analysis of the signal is quite simple. I take the Power spectrum, the Mean value and the Standard Deviation of the whole trace.
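For reference, the analysis described above (power spectrum, mean value, and standard deviation of the whole trace) can be sketched in Python with NumPy. The function name and the FFT-based spectrum estimate are my own illustration, not the actual analysis code:

```python
import numpy as np

def analyze(trace, fs):
    """Mean, std dev, and a one-sided power spectrum of one recorded trace."""
    mean = trace.mean()
    std = trace.std()
    spec = np.abs(np.fft.rfft(trace)) ** 2 / len(trace)  # power per FFT bin
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)      # bin frequencies in Hz
    return mean, std, freqs, spec

# Example: 1 s of a 1 kHz sine with 1 V offset and 0.1 V amplitude;
# mean ~ 1 V, std ~ amplitude / sqrt(2)
fs = 100_000
t = np.arange(fs) / fs
mean, std, freqs, spec = analyze(1.0 + 0.1 * np.sin(2 * np.pi * 1e3 * t), fs)
```

Because the std dev is taken over the whole trace, it measures the total in-band noise power, which is why it is a good probe of how much noise bandwidth each board lets through.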
Issue: at different sampling rates...
- Multifunction PCI-6361: Power spectrum looks reasonable, Mean value reasonable, Std dev does not change with sampling rate (as expected, right?)
- Oscilloscope PCI-5922: Power spectrum looks reasonable, Mean value reasonable, Std dev does change with sampling rate (not expected, right?)
1) Why does the Std dev change with the sampling frequency?
2) Is the oscilloscope board sampling at higher rates and averaging the points?
3) The Oscilloscope board has an Alias-Free Bandwidth of 0.4 × sampling rate. Maybe the noisy components of the white noise signal come in at high frequencies, and above the Alias-Free Bandwidth those components are attenuated and "killed", so I get a reduced Std Dev? Could that be it?
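Hypothesis 3 can be checked numerically. Below is a minimal Python/SciPy sketch (the master rate and filter length are my own choices, not the board's actual internals): white noise band-limited to 100 kHz, like the generator output, is low-pass filtered at 0.4 × fs before decimation. The std dev should drop once 0.4 × fs falls below the 100 kHz source bandwidth:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fm = 2_000_000                   # master simulation rate, Hz (assumption)
noise = rng.standard_normal(fm)  # 1 s of broadband white noise

# Band-limit the "generator" noise to 100 kHz, as in the setup
b_src = signal.firwin(801, 100e3, fs=fm)
src = signal.lfilter(b_src, 1.0, noise)

stds = {}
for fs in (50_000, 200_000, 500_000):
    # Alias-free digitizer model: lowpass at 0.4*fs, then decimate to fs
    b = signal.firwin(801, 0.4 * fs, fs=fm)
    stds[fs] = np.std(signal.lfilter(b, 1.0, src)[:: fm // fs])
    print(f"fs = {fs:>7d} Hz  ->  std = {stds[fs]:.4f}")
```

At fs = 500 kHz the 200 kHz passband covers the whole 100 kHz source band, so the std dev saturates; at lower rates it shrinks roughly as the square root of the captured bandwidth.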
Any help is really appreciated. Thanks a lot!
The 6361 is a multiplexed ADC card with no bandwidth limiter (up to your generator's bandwidth, or 3.5 MHz, see the spec), so ALL noise in that range is folded into your data 🙂
The 5922 is a (tricky) delta-sigma converter; its bandwidth is limited by the output decimation filter, which is defined by the sample rate.
What does it look like if you create band-limited noise within Nyquist and compare the results?
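The contrast described here can also be simulated. In this Python sketch (rates and filter taps are illustrative assumptions, not the boards' real signal chains), the same broadband noise is decimated two ways: plain subsampling with no anti-alias filter, where all the folded noise stays in the trace, versus a delta-sigma-style decimation lowpass at 0.4 × fs, where the noise bandwidth (and hence the std dev) tracks the sample rate:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fm = 2_000_000               # master simulation rate, Hz (assumption)
x = rng.standard_normal(fm)  # 1 s of broadband white noise, std = 1

results = {}
for fs in (100_000, 500_000):
    dec = fm // fs
    # "Multiplexed ADC" model: subsample with no bandwidth limiter ->
    # all noise power folds back in, std dev independent of fs
    std_mux = np.std(x[::dec])
    # "Delta-sigma" model: decimation lowpass at 0.4*fs before subsampling ->
    # noise bandwidth, and hence std dev, scales with the sample rate
    b = signal.firwin(801, 0.4 * fs, fs=fm)
    std_ds = np.std(signal.lfilter(b, 1.0, x)[::dec])
    results[fs] = (std_mux, std_ds)
    print(f"fs = {fs:>7d} Hz  mux std = {std_mux:.3f}  ds std = {std_ds:.3f}")
```

The subsampled ("mux") std dev stays near 1 at both rates, while the filtered ("ds") std dev grows with fs, roughly as sqrt(0.8 × fs / fm), matching the behavior reported for the two boards.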