LabVIEW

data acquisition

Hello,

I am confused by one of the settings in the DAQ Assistant.
I understand the Nyquist sampling theorem, but I notice the DAQ Assistant also asks for the number of samples in addition to the sample rate. Does the number of samples affect the sampling frequency? For example, with a sample rate of 1000 Hz and 1000 samples, it takes 0.001 s to acquire each sample. If the sample rate is 10 kHz and the number of samples is still 1000, then it takes 0.0001 s per sample because of the new rate, and the higher the sample rate, the better the time resolution. What confuses me is that if I request 10000 samples at a 10 kHz rate, it still takes 0.0001 s per sample (please don't hesitate to correct me if I am wrong). So does increasing the number of samples (from 1000 to 10000) have no effect on the timing at all?
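The arithmetic in question can be sketched in a few lines of plain Python (the rates and sample counts are just the example values from above): only the rate sets the spacing between samples, while the number of samples sets how long one block of data spans.

```python
def timing(sample_rate_hz, num_samples):
    """Return (interval between samples, total acquisition time) in seconds."""
    interval = 1.0 / sample_rate_hz           # time between consecutive samples
    duration = num_samples / sample_rate_hz   # time to acquire the whole block
    return interval, duration

# 1000 Hz, 1000 samples  -> 0.001 s between samples, 1.0 s per block
# 10 kHz,  1000 samples  -> 0.0001 s between samples, 0.1 s per block
# 10 kHz, 10000 samples  -> 0.0001 s between samples, 1.0 s per block
for rate, n in [(1000, 1000), (10_000, 1000), (10_000, 10_000)]:
    dt, total = timing(rate, n)
    print(f"rate={rate} Hz, samples={n}: dt={dt} s, block duration={total} s")
```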

I am using an E Series board (PCI-6036E) and a USB-6008 to compare their performance when acquiring an analog voltage signal. Here is my second question. I acquire voltages on ai0 and ai1 at a sample rate of 1000 Hz with 1000 samples per read. Under different conditions I need to be able to change the number of samples, is that possible? It may sound strange, but within those 1000 samples there are sometimes 4 curves to analyze, sometimes 3, sometimes 2. If I set the number of samples too high, my code cannot tell 4 curves apart from 2 consecutive sets of 2 curves, and the data becomes unreliable. Is this related to the algorithm I developed? Can anyone please help?
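For reference, here is a minimal sketch of what a task like this could look like with the lower-level DAQmx API. LabVIEW block diagrams cannot be shown as text, so the sketch uses the nidaqmx Python wrapper around the same driver; the device name "Dev1" and the per-read sample counts are assumptions, not values from the original post.

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

# Hardware-timed acquisition on two channels; the number of samples
# pulled from the buffer can change from one read to the next.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # "Dev1" is a placeholder device name
    task.ai_channels.add_ai_voltage_chan("Dev1/ai1")
    task.timing.cfg_samp_clk_timing(rate=1000,
                                    sample_mode=AcquisitionType.CONTINUOUS)

    task.start()
    for samples_this_read in (1000, 500, 2000):        # example: vary samples per read
        data = task.read(number_of_samples_per_channel=samples_this_read)
        # data[0] -> ai0, data[1] -> ai1
        print(len(data[0]), "samples per channel at 1000 Hz")
```

The LabVIEW equivalent uses the DAQmx Create Virtual Channel, DAQmx Timing, and DAQmx Read VIs; DAQmx Read's "number of samples per channel" input can be wired to a value that changes on each loop iteration, which is typically easier to control with the lower-level VIs than from inside the DAQ Assistant's configuration dialog.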


Is it advisable to use DAQmx or the DAQ Assistant in the case I described above?
Thanks for the help.
Regards





Message 1 of 2
It's really pretty simple. The sample rate determines how much time the DAQ board waits between samples, and the number of samples is how many of those samples you want to get from the DAQ board. Changing one does not affect the other. You have to take at least two samples, though: if you request only one sample, you are using on-demand sampling, which is software-timed, often paced by your program's loop. If you ask for two samples at 1 kHz, there will be 1 ms between the first and second sample. If you ask for two samples at 10 kHz, there will be 0.1 ms between them. The sample rate has nothing to do with how long it takes to acquire the first sample. As long as you request more than one sample and are using continuous sampling, the time between samples will be very precise.
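To make that distinction concrete, here is a minimal sketch of on-demand versus hardware-timed reads, again using the nidaqmx Python API as a text stand-in for the LabVIEW DAQmx VIs ("Dev1" is a placeholder device name):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

# On-demand (software-timed): no sample clock is configured, so each read
# returns a single point whenever the program asks for it.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    one_point = task.read()          # timing set by the software loop, not the board

# Hardware-timed: the board's sample clock spaces the samples precisely.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(rate=10_000,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=2)
    two_points = task.read(number_of_samples_per_channel=2)  # 0.1 ms apart at 10 kHz
```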
Message 2 of 2