09-17-2007 08:26 AM
09-17-2007 12:33 PM
Hi sparckis-
M Series cards are in fact capable of performing interval scanning. I suspect that you might have run across some old info about the way early versions of NI-DAQmx defaulted to round-robin scanning. The most up-to-date information is available here. In short, NI-DAQmx 7.4 and later by default chooses a convert interval equal to the minimum available on the board (typically, the fastest single-channel sampling period) plus 10uSec of padding for settling between channels. The only exception is when the 10uSec of padding won't fit within the scan period, in which case the convert rate falls back to round-robin scanning.
Of course you also have the option to override the default NI-DAQmx convert clock rate. Since you mention 63 channels at 250kHz, I'll assume you're using a PCI-6225. If you want to scan as quickly as possible you can adjust the convert rate to 250kHz, and the behavior of the device will be as you described (one sample clock pulse every 1/2000 seconds, triggering a train of 63 convert clock pulses spaced 1/250k seconds apart). However, to meet the device's accuracy specifications you need to allow (at least) the 7uSec settling time the specs prescribe for settling to within 1LSB on a worst-case full-scale step.
Hopefully this helps-
09-17-2007 01:58 PM
09-17-2007 02:42 PM
Hi sparckis-
After reading through the link you posted, I am even more confused about how the old card worked. The 6061e had a sampleRate of 250,000.0 and a scan rate of 2000 over 63 channels. As far as I can tell, this defaulted to round-robin, is that correct?
Yes. In order to fit 10uSec of padding on top of the 1MHz maximum update rate of the 6071E you would have to operate the convert clock rate at ~91kHz. For 63 channels, that rate would correspond to a scan period of 63 / 91k = 0.7mS. A scan rate of 2000Hz requires a scan period of 0.5mS, so the convert clock would need to run faster than what would be required for 10uSec. In the case of DAQmx, this means that round-robin sampling would be used, and that it would be faster than 10uSec padding but slower than the maximum possible rate.
To [hopefully] sum up the rest of your questions, in order to get the same behavior (as before with Traditional NI-DAQ) you need to set your AI.Conv Rate to 250kHz. According to the 6225 specs, a settling time of 4uS (corresponding to a convert rate of 250kHz) would result in a full-scale step accuracy of +-6LSB (worst case). The 6071E specs say that for a settling time of 3-5uS you can expect an accuracy of +-0.5LSB. Keeping in mind that the 6225 is a 16-bit board and the 6071E is a 12-bit board (where the width of a 16-bit LSB is 16x "smaller" than a 12-bit LSB) I would expect better accuracy on the 6225 than you saw on the 6071E.
The M Series device is physically able and allowed by DAQmx to sample at the 250kHz rate, though the overall accuracy might suffer slightly. This ability is the motivation for spec'ing the minimum convert interval to achieve 1LSB (i.e. 16-bit) settling in addition to the spec for accuracy at the maximum scanning rate.
09-18-2007 03:09 PM - edited 09-18-2007 03:09 PM
After reading through the link you posted, I am even more confused about how the old card worked. The 6061e had a sampleRate of 250,000.0 and a scan rate of 2000 over 63 channels. As far as I can tell, this defaulted to round-robin, is that correct?
Yes. In order to fit 10uSec of padding on top of the 1MHz maximum update rate of the 6071E (I used the 6061e) you would have to operate the convert clock rate at ~91kHz (or 83.33kHz for the 500kS/s on the 6061e). For 63 channels, that rate would correspond to a scan period of 63 / 91k = 0.7mS (or 63 / 83.3k = 0.76mS). A scan rate of 2000Hz requires a scan period of 0.5mS, so the convert clock would need to run faster than what would be required for 10uSec. In the case of DAQmx, this means that round-robin sampling would be used (What about in the case of the Traditional NI-DAQ driver, was the card using round-robin or interval scanning?), and that it would be faster than 10uSec padding but slower than the maximum possible rate.
To [hopefully] sum up the rest of your questions, in order to get the same behavior (as before with Traditional NI-DAQ) you need to set your AI.Conv Rate to 250kHz. According to the 6225 specs, a settling time of 4uS (corresponding to a convert rate of 250kHz) would result in a full-scale step accuracy of +-6LSB (worst case). The 6071E specs say that for a settling time of 3-5uS you can expect an accuracy of +-0.5LSB. Keeping in mind that the 6225 is a 16-bit board and the 6071E is a 12-bit board (where the width of a 16-bit LSB is 16x "smaller" than a 12-bit LSB) I would expect better accuracy on the 6225 than you saw on the 6071E. (Would this still be true based on the correct old card? Could you compare the Scan_to_disk function and verify that those settings are 1-1 with the NI-DAQmx settings?)
The M Series device is physically able and allowed by DAQmx to sample at the 250kHz rate, though the overall accuracy might suffer slightly. This ability is the motivation for spec'ing the minimum convert interval to achieve 1LSB (i.e. 16-bit) settling in addition to the spec for accuracy at the maximum scanning rate.
Message Edited by sparckis on 09-18-2007 03:09 PM
#include <stdlib.h>
#include <NIDAQmx.h>

int main(){
    TaskHandle taskHandle=0;
    int32 read;
    int Gain = 10;
    int SampleRate = 2000;               // Changed in database and setup screen
    float prescan = 0.2f, period = 1.0f; // Defaults per original comment
    float Period = prescan + period;     // Default 1.2 = 0.2 + 1
    int SamplesPerChannel = (int)(Period * SampleRate); // Default = 1.2 * 2000 = 24000
    int Sample_array_size = SamplesPerChannel * 63;     // 63 is number of channels
    // ~12 MB of samples: allocate on the heap rather than the stack
    float64 *data = malloc(Sample_array_size * sizeof(float64));
    /*********************************************/
    // DAQmx Configure Code
    /*********************************************/
    DAQmxCreateTask("",&taskHandle);
    // ai0:ai62 is 63 channels (ai0:ai63 would be 64)
    DAQmxCreateAIVoltageChan(taskHandle,"Dev1/ai0:ai62","",DAQmx_Val_RSE,
                             -1*Gain,1*Gain,DAQmx_Val_Volts,NULL);
    DAQmxCfgSampClkTiming(taskHandle,"",SampleRate,DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps,SamplesPerChannel);
    // Match the Traditional NI-DAQ behavior discussed above:
    // force the convert clock to 250kHz
    DAQmxSetAIConvRate(taskHandle,250000.0);
    DAQmxStartTask(taskHandle);
    DAQmxReadAnalogF64(taskHandle,SamplesPerChannel,
                       10.0,DAQmx_Val_GroupByChannel,
                       data,Sample_array_size,&read,NULL);
    DAQmxStopTask(taskHandle);
    DAQmxClearTask(taskHandle);
    free(data);
    return 0;
}
Message Edited by sparckis on 09-18-2007 03:26 PM
09-19-2007 07:04 PM
Hi sparckis-
Sorry for the confusion there. I looked back at the AT E Series specs and the accuracy numbers look the same, so my earlier comments should still hold true.
The default behavior in Traditional NI-DAQ, if you only specified the scan rate, would have been 10uSec padding if it fit (which it wouldn't in your case) or round-robin scanning if not (which would have been the default in your case). Of course, since you were setting both the scan/sample clock rate and the channel/convert clock rate, the default behavior does not apply.
In short, your settings for 2k sample clock and 250k convert clock in DAQmx will give the same behavior you had in NI-DAQ Traditional with the Scan_to_Disk function and your parameters.
09-20-2007 09:26 AM
12-20-2007 05:25 PM
12-20-2007 05:39 PM
Hi sparckis-
Yes, your calculations seem correct. Since you mention 63 AI channels and M Series, I assume you're using either an NI 6225 or NI 6255 device. You should be aware that a 4uSec convert rate interval may sacrifice some accuracy due to the required settling time(s) for multi-channel scanning.
For instance, the NI 6225 device specifications call for a 7uSec convert interval for +-1LSB accuracy in response to a full-scale input step. A 4uSec convert interval would yield an approximate full-scale step response accuracy of +-6LSB. Alternatively, the NI 6255 specifications prescribe settling times of 1.6uSec to 2.5uSec for +-1LSB full-scale step accuracy on most ranges.
So, to achieve the accuracy you are looking for you might need to tweak your convert rate slightly, depending on the device you're using.