I have a question regarding the relationship between sample rate and settling time when using multichannel (aggregate) sampling. For most devices, a single value is specified as the "Multichannel maximum (aggregate) sample rate", e.g. 1 MS/s. As I understand it, this means that regardless of the number of channels used (N), I can achieve a per-channel sample rate of S = (1 MS/s) / N.
But then there is also settling time. For example, for the PCIe-6363, using a range of ±100 mV, a settling time of 2 μs is specified when accepting an error of 60 ppm (or even 8 μs for 15 ppm). Assuming a settling time of 2 μs, I should not be able to achieve an aggregate sample rate higher than 500 kS/s (the inverse of 2 μs). The X Series user manual even states: "With NI-DAQmx, the driver chooses the fastest conversion rate possible based on the speed of the A/D converter and adds 10 μs of padding between each channel to allow for adequate settling time." With that padding, the aggregate rate would be limited to 100 kS/s.
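To make the arithmetic behind my question explicit, here is a small sketch (the function names are mine, not part of any NI API) relating the spec numbers to the rates I computed:

```python
def per_channel_rate(aggregate_rate_hz, n_channels):
    """Per-channel rate when one multiplexed ADC is shared across N channels."""
    return aggregate_rate_hz / n_channels

def max_aggregate_rate(interchannel_time_s):
    """Aggregate rate cannot exceed one conversion per inter-channel interval
    (settling time, or the padding the driver inserts)."""
    return 1.0 / interchannel_time_s

# Numbers from the question (PCIe-6363-like):
print(per_channel_rate(1e6, 4))     # 1 MS/s aggregate over 4 channels -> 250 kS/s each
print(max_aggregate_rate(2e-6))     # 2 us settling  -> ~500 kS/s aggregate
print(max_aggregate_rate(10e-6))    # 10 us padding  -> ~100 kS/s aggregate
```

This is just the back-of-the-envelope reasoning in the paragraphs above; it does not model the ADC conversion time itself, which would reduce the achievable rate further if it is not hidden inside the padding.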
Does this mean the value specified as the "aggregate sample rate" assumes an optimal settling time of 1 μs? And that for real measurements in the ±100 mV range I will not be able to achieve this rate, but only a lower rate determined by the settling time for my specific source impedance and range?