04-06-2018 01:47 PM
I am trying to measure the jitter of the 10 MHz clock output (Ext. Ref., 10 MHz Ref Out). The specification states that the RMS phase jitter is approximately less than 5 ps. However, I measured the RMS jitter using two techniques:
1. Infiniium oscilloscope (EZJIT): 66 ps (scope's own RMS jitter: 1.8 ps)
2. Spectrum analyzer (direct measurement): 66 ps over an integration bandwidth of 10 kHz to 1 MHz
May I know what the specified RMS phase jitter refers to? (Is it periodic jitter, or total RMS jitter?) Why is there such a discrepancy?
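For the spectrum-analyzer technique, the RMS phase jitter comes from integrating the single-sideband phase noise L(f) over the chosen offset band and dividing the resulting RMS phase by the carrier's angular frequency. A minimal sketch, using a hypothetical (made-up, not measured) phase-noise table for a 10 MHz carrier:

```python
import numpy as np

# Hypothetical SSB phase noise L(f) in dBc/Hz for a 10 MHz carrier.
# These values are illustrative only, not measured data.
f = np.array([1e4, 1e5, 1e6])            # offset frequencies, Hz (10 kHz - 1 MHz)
L_dbc = np.array([-140.0, -150.0, -155.0])

f0 = 10e6                                # carrier frequency, Hz

# Convert dBc/Hz to linear power ratio and integrate over the band
# (trapezoidal rule; factor of 2 accounts for both sidebands).
L_lin = 10.0 ** (L_dbc / 10.0)
phase_var = 2.0 * float(np.sum((L_lin[:-1] + L_lin[1:]) / 2.0 * np.diff(f)))

rms_phase_rad = np.sqrt(phase_var)       # RMS phase deviation, rad
rms_jitter_s = rms_phase_rad / (2.0 * np.pi * f0)  # RMS jitter, seconds

print(f"RMS phase jitter: {rms_jitter_s * 1e12:.3f} ps")
```

With real measured L(f) points in place of the illustrative table, the same integral reproduces what the analyzer reports for the 10 kHz to 1 MHz band.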
04-09-2018 04:29 PM
What is the sampling rate of the oscilloscope you are using? Detecting jitter as small as 5 ps would require a very high sampling rate. I believe the spec refers to total RMS jitter.
04-09-2018 05:15 PM
I am using a 20 GSa/s sampling rate to capture the waveforms on the oscilloscope. I suppose this is sufficient to resolve such a small jitter level.
04-10-2018 03:50 PM
From my calculations, 20 GSa/s corresponds to a sample interval of 50 ps, which cannot resolve a jitter of 5 ps. I imagine the sampling rate would need to be above 400 GSa/s.
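The arithmetic above can be checked directly: the sample interval is just the reciprocal of the sampling rate.

```python
# Sample interval for a given real-time sampling rate (simple arithmetic check).
def sample_interval_ps(rate_sa_per_s: float) -> float:
    """Return the time between consecutive samples in picoseconds."""
    return 1.0 / rate_sa_per_s * 1e12

print(sample_interval_ps(20e9))   # 20 GSa/s  -> 50.0 ps per sample
print(sample_interval_ps(400e9))  # 400 GSa/s -> 2.5 ps per sample
```

So at 20 GSa/s each sample is 50 ps apart, an order of magnitude coarser than the 5 ps jitter being hunted, while 400 GSa/s would bring the interval down to 2.5 ps.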
04-12-2018 03:05 PM
1./(20*10^9) // EngineeringForm  (* 50. x 10^-12 s *)
Your expected jitter is < 5 ps, which is ten times smaller than the 50 ps sample interval you get with a 20 GS/s digitizer.