The USB-6215 DAQ board uses 16-bit analog-to-digital conversion. Is it possible to set a lower conversion resolution, e.g. so that the board performs only 12-bit A/D conversion on particular analog input channels? Thanks in advance for your help.
Could you please let us know why you would like 14-bit resolution, and why 16-bit would represent an issue?
I want to change the conversion resolution because I want to use a three-parameter sine-fitting algorithm to check the A/D. Actually, I want to show how the A/D resolution may affect the measured signal in the frequency domain. I know sine-fitting methods are based in the time domain, but I also want to represent the same acquired signal in the frequency domain at different A/D resolutions. It seems to me that this cannot be set in an easy way, or can it? Thank you for your replies.
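For reference, with the signal frequency assumed known, the three-parameter sine fit (amplitude, phase, offset, as in IEEE Std 1057) reduces to a linear least-squares problem. A minimal Python/NumPy sketch (the function name `sine_fit_3param` and its parameterization are my own, not taken from any NI library):

```python
import numpy as np

def sine_fit_3param(t, y, freq):
    """Three-parameter sine fit: solve y ~ A*cos(wt) + B*sin(wt) + C
    by linear least squares, with the frequency assumed known."""
    w = 2.0 * np.pi * freq
    design = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones_like(t)])
    (A, B, C), *_ = np.linalg.lstsq(design, y, rcond=None)
    amplitude = np.hypot(A, B)
    phase = np.arctan2(A, B)   # so that y ~ amplitude*sin(wt + phase) + C
    return amplitude, phase, C

# Usage: fit a synthetic 7 Hz tone sampled at 1 kHz for 1 s.
t = np.arange(1000) / 1000.0
y = 1.5 * np.sin(2 * np.pi * 7 * t + 0.4) + 0.2
amp, ph, off = sine_fit_3param(t, y, 7.0)
```

Comparing the fitted sine against the quantized record then gives the residual you can inspect in both the time and frequency domains.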
Have a nice day,
The resolution of the ADC cannot be physically changed, as it is fixed in hardware.
On the other hand, the SW side is more flexible, so you could transform your 16-bit signal into a 14-bit one.
The easiest way would be to set the two LSBs to 0.
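Equivalently, if you work with the scaled (DBL) waveform rather than raw codes, you can emulate a coarser converter by re-quantizing the signal in software. A hedged Python sketch (the `quantize` helper and the symmetric ±full-scale assumption are mine, for illustration only):

```python
import numpy as np

def quantize(signal, full_scale, bits):
    """Round a signal spanning +/-full_scale to the step size of a
    mid-tread quantizer with the given bit depth."""
    lsb = 2.0 * full_scale / (2 ** bits)
    return np.round(signal / lsb) * lsb

# Usage: degrade a random +/-1 V signal to 14-bit resolution.
rng = np.random.default_rng(0)
s = rng.uniform(-1.0, 1.0, 1000)
q14 = quantize(s, 1.0, 14)
```

The quantization error is then bounded by half an LSB, which is what shows up as an elevated noise floor in the spectrum.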
If I may make a recommendation: if you use LabVIEW for sine fitting, keep in mind the VI Extract Single Tone Information.
Here is a link to a shipping example that shows how you can convert a DBL into a calibrated, unscaled binary number. You'll probably want to modify it to use 16-bit numbers instead of 32-bit.
You can then use LabVIEW's And function (it performs a bitwise AND on numeric inputs) to mask out the last two bits, as Mircea suggested. Assuming you modify the example to use 16-bit integers, you should use 0xFFFC (1111 1111 1111 1100) as your mask to set the last two bits to 0.
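Outside LabVIEW, the same bitwise AND looks like this; note that 0xFFFC interpreted as a signed 16-bit value is -4, so the mask also clears the two LSBs of negative two's-complement codes (a sketch with made-up sample codes, not NI shipping-example code):

```python
import numpy as np

# Raw 16-bit ADC codes (signed, two's complement).
raw = np.array([12345, -12346, 7, -1], dtype=np.int16)

# 0xFFFC as a signed 16-bit mask: ~0x0003 == -4.
mask = np.int16(~0x0003)

# Bitwise AND zeroes the two least significant bits,
# turning the 16-bit codes into 14-bit-equivalent ones.
masked = raw & mask
```

Scaling the masked codes back to volts then gives a record that behaves like a 14-bit acquisition of the same signal.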