I notice that when I use 100 samples to read at a 1 kHz rate, the system behaves differently than with 10 samples to read at a 1 kHz rate. I think this setting influences the output frequency, right? Then why doesn't the output rate match the 1 kHz input rate?
What are the input signal and the output signal? Are they the signal being sampled and the signal being displayed, respectively?
What differences are you seeing between the input and output frequencies?
The number of samples you take should not affect the displayed frequency. This value only affects how much of the waveform you're able to capture.
However, changing the sampling frequency could affect the frequency displayed.
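One way the sampling frequency can change what is displayed is aliasing: any tone above the Nyquist frequency (half the sample rate) folds back and appears at a lower frequency. A minimal Python sketch of that folding (the example tones are hypothetical, not from this thread):

```python
def alias_frequency(signal_hz, sample_rate_hz):
    """Frequency at which a sampled tone appears after folding (aliasing)."""
    folded = signal_hz % sample_rate_hz
    return min(folded, sample_rate_hz - folded)

# A 900 Hz tone sampled at 1 kHz folds back to 100 Hz.
print(alias_frequency(900, 1000))  # 100
# A 400 Hz tone is below Nyquist (500 Hz), so it appears correctly.
print(alias_frequency(400, 1000))  # 400
```

So if the displayed frequency changes when only the sample rate changes, checking whether the signal sits above Nyquist is a good first step.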
Samples to Read is exactly what it says: how many samples you want to read/write. So if you sample at 1 kHz and read 100 samples, you have read only 0.1 seconds' worth of data. Because of that, your waveforms could look totally different depending on when you start sampling.
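The duration relationship described above can be written out directly (a minimal illustration; the numbers match the examples in this thread):

```python
def capture_duration(samples_to_read, sample_rate_hz):
    """Length of the acquired window in seconds: samples / rate."""
    return samples_to_read / sample_rate_hz

print(capture_duration(100, 1000))  # 0.1 s of data per read
print(capture_duration(10, 1000))   # 0.01 s of data per read
```

Note that neither value changes the frequency content of the data itself; it only changes how much of the waveform each read returns.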
How are the DAQ Assistants set up? Are you using Finite Sampling or Continuous Sampling? Posting the actual VI will help us a lot in debugging.
I now have figured out the problem.
Yes, the input signal is the signal being sampled and the output signal is the signal being displayed.
When the sampling frequency is 1 kHz, the output frequency is 10 times higher than when the sampling frequency is 100 Hz.
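A factor-of-10 discrepancy like this commonly appears when the display assumes one time step per sample while the data was actually acquired at a different rate; the plotted frequency then scales by the ratio of the two rates. A hedged sketch of that scaling (the 50 Hz tone and the rate values are hypothetical):

```python
def apparent_frequency(true_hz, actual_rate_hz, assumed_rate_hz):
    # If the display assumes a time step of 1/assumed_rate_hz while the
    # data was really sampled at actual_rate_hz, each cycle spans
    # actual_rate_hz/true_hz samples, so the plotted frequency becomes
    # true_hz * assumed_rate_hz / actual_rate_hz.
    return true_hz * assumed_rate_hz / actual_rate_hz

# A 50 Hz signal sampled at 100 Hz but displayed with a 1 kHz time base
# appears at 500 Hz, a factor of 10 off.
print(apparent_frequency(50, 100, 1000))  # 500.0
```

So whenever the two rates differ by 10x, the displayed frequency is off by exactly 10x, which matches the symptom described here; checking that the graph's time base uses the actual sample rate is the usual fix.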