LabVIEW


How does LabVIEW determine sampling intervals between channels?

I would greatly appreciate any answers.

I'm using a PCI-MIO-16E-1 with LabVIEW and the Analog Input Intermediate VIs for
my DAQ applications. Say I'm sampling two channels, 1 and 2, and set the
sampling rate to 1000 scans/second (the default). How does LabVIEW sample
those channels? Like this,
1 2 1 2 1 2 1 ...,
with equal intervals between every sample, or like this,
1 2  1 2  1 2  1 ...,
with a short interval from channel 1 to 2 and a longer one from 2 back to 1?
(I get the feeling it's the latter.)

If the latter is the case, how does LabVIEW calculate the time interval
between channels?
Message 1 of 6
By default, when LabVIEW is sampling multiple channels it uses the simultaneous
sampling method, where the channels are sampled 1 2 3 4 ... on each scan
clock tick; the delay between channels within a scan is only however long it
takes the ADC to settle.
You can, however, program LabVIEW to use "Round Robin" sampling, where the
channels are sampled at even intervals. This is done by using two Clock Configs:
the first sets the scan clock rate, and the second sets the channel clock
rate to the scan clock rate multiplied by the number of channels. By default
the channel clock runs at its maximum to accomplish simultaneous sampling. If
you set the channel clock to twice the scan clock for two channels, you will
get even intervals between ch0 and ch1, giving you "Round Robin" sampling.
I hope this was clear and helpful.
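To make the two timing schemes concrete, here is a small sketch in plain Python
(not the Analog Input Intermediate VIs or the NI-DAQ API; the channel-clock
figures are illustrative assumptions) that computes when each channel is sampled
under the default burst-style scanning versus round-robin scanning:

# Sketch: at what instant is each channel sampled?
# Assumptions: 2 channels at 1000 scans/s; in the default case the channel
# (convert) clock runs at the board's nominal maximum (1.25 MS/s for a
# PCI-MIO-16E-1), in the round-robin case at scan_rate * n_channels.

n_channels = 2
scan_rate = 1000.0              # scans per second
scan_period = 1.0 / scan_rate   # 1 ms between scan clock ticks
max_channel_clock = 1.25e6      # samples/s, assumed board maximum

def sample_times(channel_clock, n_scans=3):
    """Return (scan, channel, time) triples for the first few scans."""
    times = []
    for scan in range(n_scans):
        for ch in range(n_channels):
            t = scan * scan_period + ch / channel_clock
            times.append((scan, ch, t))
    return times

# Default: channel clock at maximum -> samples bunched at the start of each scan.
for scan, ch, t in sample_times(max_channel_clock):
    print(f"default     scan {scan} ch {ch}: t = {t * 1e6:9.2f} us")

# Round robin: channel clock = scan_rate * n_channels -> evenly spaced samples.
for scan, ch, t in sample_times(scan_rate * n_channels):
    print(f"round robin scan {scan} ch {ch}: t = {t * 1e6:9.2f} us")

In the default case ch0 and ch1 are converted less than a microsecond apart and
then nothing happens for the rest of the millisecond; in the round-robin case the
two conversions split the 1 ms scan period evenly.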

"Ekatet Intakan" wrote:
>>I would greatly appreciate any answers.>>I'm using PCI-MIO-16E-1 with LabVIEW
and analog input intermediate vi's for>my DAQ applications. Say, I'm sampling
two channels, 1 and 2, and set the>sampling rate to be 1000 scans/second
(the default). How does LabVIEW sample>those channels?>Like this,>1 2
1 2 1 2 1...,>having equal intervals,>or>1 2 1 2 1 2
1...,>having different intervals between 1 to 2 and 2 to 1. (I got the feelings>that
it's the latter case.)>>In the case that is correct, how does LabVIEW calculate
the time intervals>between channels?
Message 2 of 6
Thank you very much for your answers. However, there's something I'm not quite
sure about. If LabVIEW samples all channels simultaneously by default, how
come I get slightly different voltage values on different channels when
I use the same input for all of them? For example, there are two channels, ch0
and ch1, and the input to both is the same triangular waveform. If ch0 reads
6.250, ch1 might read 6.300. Is this due to other error factors?

Ekatet

"jason" wrote:
>>By default when LabVIEW is sampling multiple channels it uses the simultaneous>sampling
method where the channels are sampled 1 2 3 4 .... on each scan>clock, the
delay between scans is however long it takes the ADC to settle.>You can,
however, program LabVIEW to use "Round Rob
in" sampling where the>channels
are sampled at even intervals. this is done by using two Clock Configs.>The
first is to set the scan clock rate and the second sets the channel clock>rate
to the scan clock rate multiplied by the number of channels. By default>the
channel clock goes to maximum to accomplish simultaneous sampling. If>you
set the Channel Clock to twice the Scan Clock for two channels, you will>get
even intervals between ch0 and ch1 giving you "Round Robin" sampling.>I hope
this was clear and helpful.>>"Ekatet Intakan" wrote:>>>I
would greatly appreciate any answers.>>I'm using PCI-MIO-16E-1 with LabVIEW>and
analog input intermediate vi's for>my DAQ applications. Say, I'm sampling>two
channels, 1 and 2, and set the>sampling rate to be 1000 scans/second>(the
default). How does LabVIEW sample>those channels?>Like this,>1 2 > 1
2 1 2 1...,>having equal intervals,>or>1 2 1 2 1 2 > 1...,>having
different intervals between 1 to 2 and 2 t
o 1. (I got the feelings>that>it's
the latter case.)>>In the case that is correct, how does LabVIEW calculate>the
time intervals>between channels?
Message 3 of 6
Ekatet,
I don't know that much about this board, but I do know about DAQ in general.
The difference here is only 50 mV, or 0.05 volts. If the signal is, say, 6.25 V,
the error is only about 0.8% (0.05 / 6.25), which is pretty dang close!

Another consideration (if this is an A-to-D converter) is that most of them are
accurate to +/- half the LSB. If the A/D is 8 bits (0 to 255 counts) and the range
is 0 to 10 V, then each count is about 39 mV, so it is accurate to only about
0.02 V (half of 39 mV). If the range is -10 to +10 V, then the magnitude counts
run 0 to 127 (the MSB is the sign bit), so each count is about 80 mV and the
accuracy is about 0.04 V (which is about what you are seeing).

Overall I think this board is working to the best of its ability.

Kevin Kent
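To put numbers on the half-LSB argument, here is a quick sketch in plain Python
(the 8-bit case mirrors the example above; a 12-bit line is added for comparison,
since the PCI-MIO-16E-1 is a 12-bit board):

# Quantization step (LSB) and +/- LSB/2 uncertainty for a few converter widths.
def lsb(full_scale_volts, bits):
    """Size of one code step for a converter spanning full_scale_volts."""
    return full_scale_volts / (2 ** bits)

for bits in (8, 12):
    for span, label in ((10.0, "0 to +10 V"), (20.0, "-10 to +10 V")):
        step = lsb(span, bits)
        print(f"{bits:2d}-bit, {label:12s}: LSB = {step * 1e3:6.2f} mV, "
              f"+/- LSB/2 = {step / 2 * 1e3:6.2f} mV")

The half-LSB of roughly 39 mV on the +/-10 V range matches the 0.04 V figure above
for an 8-bit converter; with a 12-bit converter the step is only a few millivolts,
so quantization alone would not explain a 50 mV difference (the next replies point
at the multiplexer instead).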

Message 4 of 6
First of all, the PCI-MIO-16E-1 does not do simultaneous sampling. It uses
an analog multiplexer to switch channels into one instrumentation amplifier.
Secondly, all LabVIEW does is interface to NI-DAQ. If you want to know more,
National Instruments has the NI-DAQ software and hardware manuals online.

"Ekatet Intakan" wrote:
>>Thank you very much for your answers. However, there's something I'm not
quite>sure about. By default, if LabVIEW samples all channels simultaneously,
how>come I get slightly different voltage values for different channels when>I
use the same input for all? For example, there are two channels, ch0 and>ch1.
And the inputs to both are the same triangular waveform. If ch0 reads>6.250,
ch1 might read 6.300. Is this due to other error factors?>>Ekatet>>"jason"
wrote:>>>By default when LabVIEW is sampling multiple
channels it uses the simultaneous>sampling>method where the channels are
sampled 1 2 3 4 .... on each scan>clock, the>delay between scans is however
long it takes the ADC to settle.>You can,>however, program LabVIEW to use
"Round Robin" sampling where the>channels>are sampled at even intervals.
this is done by using two Clock Configs.>The>first is to set the scan clock
rate and the second sets the channel clock>rate>to the scan clock rate multiplied
by the number of channels. By default>the>channel clock goes to maximum to
accomplish simultaneous sampling. If>you>set the Channel Clock to twice the
Scan Clock for two channels, you will>get>even intervals between ch0 and
ch1 giving you "Round Robin" sampling.>I hope>this was clear and helpful.>>"Ekatet
Intakan" wrote:>>>I>would greatly appreciate any answers.>>I'm
using PCI-MIO-16E-1 with LabVIEW>and>analog input intermediate vi's for>my
DAQ applications. Say, I'm sampling>two>channels, 1 and 2, and set the>sampling
rate to be 1000 scans/second>(the>default). How does LabVIEW sample>those
channels?>Like this,>1 2 > 1 >2 1 2 1...,>having equal intervals,>or>1
2 1 2 1 2 > 1...,>having>different intervals between 1 to 2 and
2 to 1. (I got the feelings>that>it's>the latter case.)>>In the case that
is correct, how does LabVIEW calculate>the>time intervals>between channels?
Message 5 of 6
With an E-series board (excluding the PCI-6110E and PCI-6111E), simultaneous sampling
is really pseudo-simultaneous: there is a slight delay while the amplifier
and the ADC settle, because the channels are multiplexed through the same amplifier
and ADC. This may account for slightly different readings on a triangle wave
measured on two different channels. There may also be some margin of error
involved as well. The PCI-6110E and PCI-6111E can do true simultaneous
sampling, as each channel has a dedicated ADC and no multiplexing is done.
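As a rough check that this interchannel delay can explain readings like 6.250 V on
ch0 versus 6.300 V on ch1, here is a sketch in plain Python (the waveform amplitude,
frequency, and interchannel delay are illustrative assumptions, not values from
this thread):

# How far does a triangle wave move during the interchannel delay?
# A triangle wave of peak amplitude A and frequency f slews at 4*A*f
# volts per second on its ramps.

amplitude = 10.0             # V peak (assumed)
frequency = 100.0            # Hz (assumed)
interchannel_delay = 10e-6   # s between the ch0 and ch1 conversions (assumed)

slew_rate = 4 * amplitude * frequency        # V/s on the ramps
skew_error = slew_rate * interchannel_delay  # V difference between the two readings

print(f"slew rate        : {slew_rate:.0f} V/s")
print(f"ch0-to-ch1 error : {skew_error * 1e3:.1f} mV")

With these assumed numbers the two channels differ by about 40 mV purely because
they are converted a few microseconds apart, which is the same order of magnitude
as the 50 mV difference reported earlier.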

"Ekatet Intakan" wrote:
>>Thank you very much for your answers. However, there's something I'm not
quite>sure about. By default, if LabVIEW samples all channels simultaneously,
how>come I get slightly different voltage values for different channels when>I
use the same input for all? For example, there are two channels, ch0 and>ch1.
And the inputs to both are the same triangular waveform. If ch0 reads>6.250,
ch1 might read 6.300. Is this due to other error factors?>>Ekatet>>"jason"
wrote:>>>By default when LabVIEW is sampling multiple
channels it uses the simultaneous>sampling>method where the channels are
sampled 1 2 3 4 .... on each scan>clock, the>delay between scans is however
long it takes the ADC to settle.>You can,>however, program LabVIEW to use
"Round Robin" sampling where the>channels>are sampled at even intervals.
this is done by using two Clock Configs.>The>first is to set the scan clock
rate and the second sets the channel clock>rate>to the scan clock rate multiplied
by the number of channels. By default>the>channel clock goes to maximum to
accomplish simultaneous sampling. If>you>set the Channel Clock to twice the
Scan Clock for two channels, you will>get>even intervals between ch0 and
ch1 giving you "Round Robin" sampling.>I hope>this was clear and helpful.>>"Ekatet
Intakan" wrote:>>>I>would greatly appreciate any answers.>>I'm
using PCI-MIO-16E-1 with LabVIEW>and>analog input intermediate vi's for>my
DAQ applications. Say, I'm sampling>two>channels, 1 and 2, and set the>sampling
rate to be 1000 scans/second>(the>default). How does LabVIEW sample>those
channels?>Like this,>1 2 > 1 >2 1 2 1...,>having equal intervals,>or>1
2 1 2 1 2 > 1...,>having>different intervals between 1 to 2 and
2 to 1. (I got the feelings>that>it's>the latter case.)>>In the case that
is correct, how does LabVIEW calculate>the>time intervals>between channels?
Message 6 of 6