You are correct in how the E Series and S Series boards operate. The S Series board is one of National Instruments' high-end multifunction DAQ boards and has a dedicated ADC for each analog input. This allows all of these channels to be sampled simultaneously and at a faster rate.
In contrast, E Series devices have only one ADC onboard. To accommodate multichannel sampling, the board uses a multiplexer to scan through the different channels. There is therefore a small delay between when, for example, channel 0 and channel 2 are sampled. This delay is referred to as the interchannel delay.
All E Series devices have two clocks that are used for any analog input. The sample (or scan) clock controls when a scan is initiated, and the convert (channel) clock controls when each individual channel is sampled (the names of the clocks depend on whether you are using Traditional NI-DAQ or DAQmx). So to determine the time between the sampling of channels in Traditional NI-DAQ, you will need to know the convert (channel) clock rate used.
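The relationship between the two clocks can be sketched with some simple arithmetic (the rates below are hypothetical placeholders; the actual values come from your board and driver configuration):

```python
# Sketch of the two-clock relationship on a multiplexed E Series board.
# Hypothetical example rates, not values read from any hardware.
scan_rate_hz = 1_000.0       # sample (scan) clock: starts each scan
convert_rate_hz = 100_000.0  # convert (channel) clock: samples each channel
n_channels = 4

# The interchannel delay is simply the convert clock period.
interchannel_delay_s = 1.0 / convert_rate_hz
scan_duration_s = n_channels * interchannel_delay_s

# All channels must be converted before the next scan is initiated.
assert scan_duration_s <= 1.0 / scan_rate_hz

print(f"interchannel delay: {interchannel_delay_s * 1e6:.1f} us")      # 10.0 us
print(f"time to scan all channels: {scan_duration_s * 1e6:.1f} us")    # 40.0 us
```

With these example numbers, channel 2 is sampled 20 µs after channel 0 within the same scan.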
Traditional NI-DAQ selects the fastest channel clock rate possible. However, to allow adequate settling time for the amplifier and any unaccounted-for factors, Traditional NI-DAQ adds an extra 10 µs to the interchannel delay (channel clock period). If the scan rate is too fast for Traditional NI-DAQ to apply the 10 µs delay and still sample every channel before the next scan clock pulse, the delay will not be added. Likewise, if the user manually specifies a rate for the channel clock, the delay will not be added.
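That selection rule can be approximated as follows (a sketch of the behavior described above, not actual driver code; the board's maximum ADC rate is a hypothetical input):

```python
def trad_nidaq_convert_period(max_adc_rate_hz, scan_rate_hz, n_channels,
                              user_period_s=None):
    """Approximate Traditional NI-DAQ channel clock selection (sketch only)."""
    if user_period_s is not None:
        return user_period_s                    # user-specified: no padding added
    fastest_period_s = 1.0 / max_adc_rate_hz    # fastest the ADC can convert
    padded_period_s = fastest_period_s + 10e-6  # extra 10 us for settling time
    scan_period_s = 1.0 / scan_rate_hz
    if n_channels * padded_period_s <= scan_period_s:
        return padded_period_s                  # padding fits within one scan
    return fastest_period_s                     # scan rate too fast: no padding

# Example: a hypothetical 200 kS/s board scanning 4 channels at 1 kS/s.
# Fastest period is 5 us; with the 10 us padding the interchannel delay
# becomes 15 us, and 4 x 15 us = 60 us still fits in the 1000 us scan period.
print(trad_nidaq_convert_period(200_000, 1_000, 4))   # 1.5e-05
```

At a much faster scan rate the padding no longer fits and the function falls back to the 5 µs period, matching the behavior described above.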
Using Traditional NI-DAQ, you can manually set your channel clock rate with the interchannel delay input of the AI Config VI, which calls the Advanced AI Clock Config VI to actually configure the channel clock. This information can also be found in the following documents:
How Is the Convert (Channel) Clock Rate Determined in NI-DAQmx and Traditional NI-DAQ?
What Is the Difference Between Interval Scanning and Round Robin Scanning?
How Is the Channel Clock Rate Determined in My Data Acquisition VI Using Traditional NI-DAQ?

NI-DAQmx behaves slightly differently: it selects the slowest convert clock rate possible in order to allow more settling time, unless the user manually specifies a convert clock rate using the DAQmx Timing property node.
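The DAQmx default can be sketched the same way: the slowest convert clock that still finishes every channel spreads the conversions evenly across the scan period. (Hypothetical numbers; the driver may also round the rate, so treat this as an approximation of the rule, not the driver itself.)

```python
# Sketch of the DAQmx default: the slowest convert clock rate that still
# converts every channel once per scan, maximizing settling time.
scan_rate_hz = 1_000.0
n_channels = 8

convert_rate_hz = scan_rate_hz * n_channels   # slowest rate that keeps up
interchannel_delay_s = 1.0 / convert_rate_hz

print(f"{interchannel_delay_s * 1e6:.1f} us between channels")  # 125.0 us
```

Compare this with the Traditional NI-DAQ behavior above, where the delay would instead be close to the board's fastest conversion period plus 10 µs.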
The following link takes you to a webpage that discusses the minimum and maximum values for interchannel delay and includes some links and an example that may be useful:
http://digital.ni.com/public.nsf/websearch/9AE87416C8792FC286256D190058C7D3?OpenDocument
With this information you should be able to understand and calculate the interchannel delay for your specific application and the clocks being used.
Regards,
Michael
Applications Engineer
National Instruments