I am using the USB-6009 to measure the voltage output of a series of pressure transducers. It's my understanding that the USB-6009 has a 12-bit ADC, which should afford 16384 quantization steps. My transducer reads about 8 mV/psi, which means that over my 0.5 V to 4.5 V output range (500 psi transducer) I should expect about 0.2 mV per quantization step. This means I should have plenty of resolution to measure my signal and plot it smoothly.
However, my data shows discrete bands separated by 2.54 mV, which is almost 13x the jump between values that I would expect (0.2 mV). The results are shown below in dataBangs.png. Is there something I'm missing in my analysis? If not, what are some possible causes for this phenomenon, and how can it be remedied?
Edit: I realized after posting that I did my resolution calculations with N=14 bits. For N=12 bits that's 4096 steps, or 0.977 mV/step over my 4 V range. I am still measuring 2.54 mV gaps between my data points, which is still significant: almost 3x the anticipated jump.
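As a sanity check, the step-size arithmetic from both versions of the calculation can be reproduced in a few lines:

```python
def lsb_mv(bits, vmin, vmax):
    """Ideal quantization step (1 LSB) in millivolts for an ADC
    spanning vmin..vmax volts with the given bit depth."""
    return (vmax - vmin) / (2 ** bits) * 1000.0

# 12-bit ADC over the 0.5-4.5 V transducer span
print(lsb_mv(12, 0.5, 4.5))  # 0.9765625 mV/step (~0.977)

# 14-bit over the same span (the original calculation)
print(lsb_mv(14, 0.5, 4.5))  # 0.244140625 mV/step (~0.2)
```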
A couple of things here: looking at the USB-6009 data sheet, it appears that differential signals get 14-bit resolution, so your original numbers may be correct.
As for the quantization step, those calculations should be done using the range set for the DAQ task, so posting your code would be helpful here. The closest match would be a differential task on the ±20 V scale, which would mean 16384 steps over 40 V, or 2.44 mV/step. While that isn't the 2.54 mV you are seeing, it is close enough to be the likely explanation. You could improve your measurement by moving to the ±5 V scale. More info here.
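If you are using the nidaqmx Python API, the input range is requested per channel via `min_val`/`max_val` when the channel is created. A minimal sketch, assuming the device shows up as "Dev1" and the transducer is wired differentially on ai0 (both names are assumptions, and recent nidaqmx versions spell the differential mode `TerminalConfiguration.DIFF`):

```python
import nidaqmx
from nidaqmx.constants import TerminalConfiguration

# "Dev1/ai0" is a placeholder -- use your actual device/channel name.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",
        terminal_config=TerminalConfiguration.DIFF,  # differential input
        min_val=-5.0,  # request the +/-5 V range instead of +/-20 V
        max_val=5.0,
    )
    reading = task.read()  # single on-demand voltage sample
    print(reading)
```

The driver picks the smallest hardware range that covers `min_val`..`max_val`, so asking for ±5 V here should shrink the step size by a factor of four versus ±20 V.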
Now with all that said, the quantization step size isn't a good measure of the accuracy you can expect, as it doesn't account for the various other error sources in the measurement. Take a look at the Absolute Accuracy section of the spec documents to figure out what you can expect in terms of PSI reading resolution.
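To translate voltage resolution into pressure units, divide the step size by the transducer sensitivity (8 mV/psi from the question). Using the ranges discussed above:

```python
SENSITIVITY_MV_PER_PSI = 8.0  # from the transducer spec in the question

def psi_per_step(step_mv):
    """Pressure resolution corresponding to one ADC step."""
    return step_mv / SENSITIVITY_MV_PER_PSI

# +/-20 V range, 14-bit: 40 V over 16384 steps
print(psi_per_step(40.0 / 16384 * 1000))  # ~0.305 psi/step

# +/-5 V range, 14-bit: 10 V over 16384 steps
print(psi_per_step(10.0 / 16384 * 1000))  # ~0.076 psi/step
```

These figures are quantization only; the absolute-accuracy terms in the spec (offset, gain error, noise) will add on top of them.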