03-19-2006 07:40 AM
03-20-2006 02:09 PM
Hello Alron,
The ADC conversion time is equal to the interchannel delay. The minimum interchannel delay applies only when you are acquiring at the fastest rate your device supports. If you are acquiring at a slower rate, the ADC conversion rate will be slower as well, so you cannot assume that the minimum interchannel delay is the length of time your signal needs to be connected.
You can check and set this rate. In LabVIEW, use a DAQmx Timing property node with the property More >> AI Convert >> Rate. If you are using the C API, use DAQmxGetAIConvRate and DAQmxSetAIConvRate.
If you need the conversion to take 4 microseconds, I would recommend setting the conversion rate accordingly.
Please let me know if you have questions regarding this.
Laura
03-21-2006 09:11 AM
03-22-2006 05:03 PM
Hi Alron,
The PCI-6221 does not have a sample and hold capacitor. How did you determine that the voltage fell to zero during the conversion time? How did you check what the conversion time was? Did you set or read it programmatically? Please provide more details about this.
Thanks,
Laura
03-23-2006 02:10 AM
03-24-2006 10:36 AM
Hi Alron,
Do you have multiple channels in your task? If not, the conversion time specification doesn't really apply: it is the time required for each conversion when multiplexing between multiple channels. If you are not multiplexing, the conversion can happen more quickly because less settling time is required. Is this the situation you are in?
Thanks,
Laura
03-29-2006 04:21 AM
03-29-2006 03:38 PM
Hi Alron,
I double-checked my thinking on this, and it turns out the voltage level is sampled on the rising edge of the sample clock pulse. That value is held on a sample-and-hold capacitor in the ADC chip until it can be converted. I believe this explains the behavior you are seeing, and I apologize for not realizing it earlier.
Hope this helps,
Laura
03-30-2006 01:10 AM