I am using an M-series PCI-6220 DAQ with one channel connected to an analog device (voltage) that is attached to a motor. The motor's encoder is connected to a chip that converts its signal into a quadrature count and direction. I am trying to use the quadrature count on PFI0 to trigger sampling on the analog channel so I can get a graph of position vs. value. This works most of the time (some noise issues notwithstanding), but when the motor is just starting, or changes direction in such a way that a quadrature count pulse is jittered (quick movement back and forth across a tick pulse, which can be seen on a scope), I get the following error: "ADC conversion attempted before the prior conversion was completed". In the DAQ Assistant, this is error -200019.
I can, of course, ignore the error. However, the read VI stops taking data as soon as the error occurs regardless. Since I am only concerned with the portion of the data where the motor is moving smoothly in one direction (where this error does not seem to occur), is there a way to suppress the error so that I can continue taking data even when it occurs? The help file mentions some ability to suppress the error, but does not describe exactly how.
Any help is greatly appreciated.
PS: I have already read the "What Causes Error -200019 When Increasing the Sampling Rate of an Analog Input Task with an External Clock?" knowledgebase page, as well as every voltage DAQ example VI that ships with LabVIEW. Both were about as helpful as a kick in the jimmy. I am using the latest version of LabVIEW.
If the error causes your app to stop even when you ignore it, then you'll likely need to find a feedback-loop "trick" around this issue. Are you able to read the current position of the motor directly? If so, AND you know when it will change direction, just stop taking data for those few steps. As for the start-up glitches, just delay data sampling by a few steps.
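If you log the quadrature count alongside each analog reading, the same "skip the glitchy stretches" idea can also be applied in post-processing. Here is a minimal Python sketch of that filtering step (the function name and the `guard` parameter are my own invention, and it assumes you already have matched position/value arrays):

```python
# Hypothetical post-processing sketch: drop samples taken around a
# direction reversal, keeping only the smooth unidirectional stretches.

def smooth_direction_samples(positions, values, guard=3):
    """Keep (position, value) pairs away from any direction change.

    positions : quadrature counts logged with each AI sample
    values    : analog readings taken at those counts
    guard     : number of samples to discard on each side of a reversal
    """
    # Sign of each step: +1 forward, -1 backward, 0 no movement
    steps = [0] * len(positions)
    for i in range(1, len(positions)):
        d = positions[i] - positions[i - 1]
        steps[i] = (d > 0) - (d < 0)

    # Mark indices near any direction reversal for removal
    bad = set()
    for i in range(2, len(steps)):
        if steps[i] != 0 and steps[i - 1] != 0 and steps[i] != steps[i - 1]:
            for j in range(i - guard, i + guard + 1):
                bad.add(j)

    return [(p, v) for i, (p, v) in enumerate(zip(positions, values))
            if i not in bad]
```

For example, a back-and-forth wiggle in the middle of a run gets cut out along with `guard` samples on either side, leaving only the clean sweeps.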
Message Edited by Broken Arrow on 02-08-2007 07:42 AM
Your M-series board supports digital filtering. I haven't needed to use it myself yet, so I'm not certain exactly how to go about it.
1. You may be able to perform a kind of digital debouncing directly on the PFI input signal before it's passed into the rest of the board's circuitry, such as the AI sampling clock. If you can do it this way, it'll be easiest.
2. Another method is to configure one of the board's counters to generate a retriggerable single pulse that is used as the AI sample clock. The total defined period of the pulse (delay time + pulse time) also acts like a digital filter, because any additional "trigger" edges arriving while the pulse is still being generated are ignored. For example, a pulse with delay time + pulse time = 100 microsec would reject any quad clock jitter above 10 kHz.
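On method 1: the PFI digital filter essentially passes a transition only if the new level persists for at least a programmable minimum pulse width. A rough software model of that behavior may make it concrete (this is a conceptual sketch in Python, not the DAQmx API, and the timing numbers are illustrative):

```python
# Conceptual model of a minimum-pulse-width digital filter like the one
# on M-series PFI lines: a transition only propagates to the output if
# the input holds the new level for at least `min_width` seconds.

def digital_filter(edges, min_width):
    """Filter a sorted list of (timestamp, new_level) transitions.

    Returns the transitions that survive filtering; short glitches
    (levels held for less than min_width) are swallowed.
    """
    out = []
    current = None  # filtered output level
    for i, (t, level) in enumerate(edges):
        # How long does this level persist before the next transition?
        hold = (edges[i + 1][0] - t) if i + 1 < len(edges) else float("inf")
        if level != current and hold >= min_width:
            out.append((t, level))
            current = level
    return out
```

With a 100 us minimum width, a 0.5 us low glitch in the middle of a high level simply disappears from the filtered output, which is exactly the kind of back-and-forth tick jitter the original poster is seeing on the scope.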
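On method 2: the retriggerable single-pulse trick can be modeled the same way. Edges that arrive while a pulse is still in progress never fire a conversion, so the ADC is never asked to convert before the prior conversion completes. A sketch (hypothetical function, times in seconds):

```python
# Sketch of the counter trick: a retriggerable single pulse of total
# period `pulse_period` serves as the AI sample clock.  Trigger edges
# that arrive while a pulse is still in progress are ignored, so
# quadrature jitter faster than 1/pulse_period never produces an
# extra ADC conversion.

def one_shot_sample_clock(edge_times, pulse_period):
    """Return the times at which the counter actually fires a sample pulse."""
    fired = []
    busy_until = float("-inf")
    for t in sorted(edge_times):
        if t >= busy_until:          # counter idle: this edge starts a pulse
            fired.append(t)
            busy_until = t + pulse_period
        # else: edge falls inside the pulse window -> ignored
    return fired
```

With `pulse_period = 100e-6`, an edge arriving 50 us after the previous one is dropped, matching the ">10 kHz jitter rejected" figure above.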
CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW?