Differential Signal Distorted at high speeds

Hi everyone,
Thanks for your help on previous questions!!
I have written a VI that samples two analogue channels using the usual AI Config, AI Start, AI Read, and AI Clear subVIs. I am not using continuous data acquisition; rather, I fill the buffer once and then read the data.
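(For anyone reading this later with the modern driver: the sketch below is roughly the same finite, buffered two-channel acquisition written against the nidaqmx Python API rather than the Traditional NI-DAQ VIs. The device name Dev1, the channel pair, and the rates are placeholders, not values from my actual VI.)

```python
# Sketch of a finite, buffered two-channel acquisition using the modern
# nidaqmx Python API (not the Traditional NI-DAQ VIs discussed in this
# thread). "Dev1" and the channel/rate values are placeholders.
import nidaqmx
from nidaqmx.constants import AcquisitionType, TerminalConfiguration

SCAN_RATE = 1000        # scans/s per channel
SCANS_TO_READ = 1000    # finite buffer size

with nidaqmx.Task() as task:
    # Two differential analogue input channels, as in the original VI.
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0:1",
        terminal_config=TerminalConfiguration.DIFF,
        min_val=-10.0, max_val=10.0)

    # Finite acquisition: fill the buffer once, then read it back.
    task.timing.cfg_samp_clk_timing(
        SCAN_RATE,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=SCANS_TO_READ)

    data = task.read(number_of_samples_per_channel=SCANS_TO_READ)
    # data[0] and data[1] hold the two channels' samples.
```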
This system works well when I sample my two differential signals at scan rates of about 1000 scans/s. However, when I increase the scan rate, the first channel that I log starts to behave strangely, giving wrong readings and rising to a steady level (saturation?). This problem gets worse as I increase the scan rate.
I have tried swapping the order of the signals and that doesn't change anything; however, both channels log fine if I acquire them on their own!
So that's my problem: my signal distorts as the scan rate increases, but only when logged together with another channel.
Both signals are differential. I am using a PCI-MIO-16XE-10 card, which should be good up to 100 kscans/s.
Any help you can offer would be greatly appreciated.
A good weekend to all.
Regards,
Alan
Message 1 of 2
It sounds as though your instrumentation amplifier may not have enough settling time when you increase your sampling rate. Under normal conditions, the amplifier settles fast enough to sample at the board's maximum rate. However, if other factors in your system require additional settling time, you may not be able to run that fast. A common factor that lengthens settling time is high source (sensor) impedance: E Series data acquisition boards can handle up to 1 kΩ of source impedance, and above that value you should add a voltage follower circuit to buffer the signal and decrease the source impedance the board sees. This is described in the following KnowledgeBase:

How Do I Create a Buffer to Decrease the Source Impedance of My Analog Input Signal?

High source impedance is a very common factor that requires longer settling time, although there are others as well. The following Developer Zone article discusses several different factors that affect settling time:

Is Your Data Inaccurate Because of Instrumentation Amplifier Settling Time?
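To put rough numbers on the settling-time argument: the source impedance and the amplifier's input capacitance form an RC network that must settle to within a fraction of an LSB between conversions. The sketch below assumes single-pole settling and a nominal 100 pF effective input capacitance; both are illustrative assumptions, not specs from the articles above.

```python
# Back-of-the-envelope settling-time estimate. Illustrative assumptions:
# single-pole RC settling and ~100 pF effective input capacitance; check
# your board's specifications for real numbers.
import math

R_SOURCE = 1_000        # ohms: the E Series guideline mentioned above
C_INPUT = 100e-12       # farads: assumed effective input capacitance
BITS = 16               # PCI-MIO-16XE-10 resolution

tau = R_SOURCE * C_INPUT
# Settling to within 1/2 LSB of full scale takes ln(2**(BITS + 1)) time
# constants for a single-pole response.
t_settle = tau * math.log(2 ** (BITS + 1))

print(f"tau = {tau * 1e9:.1f} ns, settle time = {t_settle * 1e6:.2f} us")
# At 100 kscans/s the channel clock leaves only 10 us per conversion, so
# a high source impedance quickly eats into that budget.
```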

Another technique to try, if you are using Traditional NI-DAQ, is to manually specify your channel clock rate. If you only specify your scan rate, Traditional NI-DAQ selects the fastest possible channel clock rate (which controls the rate at which each channel is sampled during a scan) for you. However, if your amplifier needs more time between samples, I recommend slowing the channel clock to its minimum: the scan rate multiplied by the number of channels to be sampled, which spreads the conversions evenly across each scan interval and gives the amplifier the maximum settling time. You can set the channel clock manually with the interchannel delay input of the AI Config VI, which calls the Advanced AI Clock Config VI to actually configure the channel clock.
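(In the modern DAQmx driver the channel clock is called the AI Convert Clock, and the same trick looks roughly like the sketch below using the nidaqmx Python API. Dev1 and the rates are placeholders, and the ai_conv_rate property corresponds to what Traditional NI-DAQ exposes as the interchannel delay.)

```python
# Sketch of the equivalent trick with the modern nidaqmx Python API:
# slow the AI Convert Clock (DAQmx's name for the channel clock) to its
# minimum so each conversion gets the maximum settling time.
# "Dev1" and the rate values are placeholders.
import nidaqmx
from nidaqmx.constants import AcquisitionType, TerminalConfiguration

SCAN_RATE = 10_000      # scans/s per channel
N_CHANNELS = 2

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0:1", terminal_config=TerminalConfiguration.DIFF)
    task.timing.cfg_samp_clk_timing(
        SCAN_RATE, sample_mode=AcquisitionType.FINITE,
        samps_per_chan=SCAN_RATE)

    # By default the driver picks the fastest convert clock it can. The
    # minimum legal rate is scan rate x channel count, which spreads the
    # conversions evenly across each scan.
    task.timing.ai_conv_rate = SCAN_RATE * N_CHANNELS

    data = task.read(number_of_samples_per_channel=SCAN_RATE)
```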

I hope this helps!

Sonya
Message 2 of 2