LabVIEW


Finding the absolute (bits) ADC value from DAQ

I'm in a situation where I need to read the raw value from the DAQ's ADC in bits. Is there any way to do this? I'm using a PCIe-6230 DAQ. I had thought about working it back out from the voltage based on the ADC resolution, but I can't be certain what its reference voltage might be, and I'd much prefer to access the ADC's output directly to avoid potential error.

Message 1 of 11

Hi Craig,

 

there are a lot of DAQmx property nodes. You can use them to read configuration data for your channels, including the ADC's raw resolution, or to read raw measurement data…

 

The DAQmx Read function also lets you select raw data!
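For a feel of what "raw" means here, the relationship between a raw code and a scaled voltage for an ideal bipolar ADC can be sketched in a few lines. This is a conceptual illustration, not the DAQmx API, and the 16-bit resolution and ±10 V range are assumptions to check against the 6230 specifications; the real device applies calibrated polynomial scaling rather than this ideal straight line:

```python
def volts_to_code(v, bits=16, v_range=10.0):
    """Ideal raw code for voltage v on a bipolar +/-v_range input."""
    lsb = 2 * v_range / 2**bits          # one code step in volts
    return round(v / lsb)

def code_to_volts(code, bits=16, v_range=10.0):
    """Inverse mapping: ideal voltage represented by a raw code."""
    lsb = 2 * v_range / 2**bits
    return code * lsb
```

On these assumed settings one LSB is 20 V / 65536 ≈ 305 µV, which is why reading raw codes instead of volts does not by itself buy any extra information.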

Best regards,
GerdW

using LV2011SP1 + LV2017 (+LV2020 sometimes) on Win10+cRIO
Message 2 of 11

Hm, my experience tells me that you are heading down the wrong path.

 

Do you really think measurement errors only occur because of "erroneous scaling done by the driver"? In fact, scaling is normally not a source of error; it only scales errors that arise upstream:

- Thermal noise

- External noise (EMC!)

- Contact and lead (including switch devices) resistances

- Wrong settling times

- Incorrect measurement method (e.g. AC/DC)

- Ground loops

- Incorrect range setting

- Incorrect connection to measurement circuit

- ...

 

There are many more, and these can easily add up to several bits of your ADC being plainly wrong. So reading out the raw data will not improve accuracy or error behavior in any way.
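To put numbers on "several bits", here is a rough sketch that converts a noise amplitude into ideal ADC codes and into the number of low bits that noise buries. The 16-bit resolution and ±10 V range are assumptions, and this ignores the device's own calibrated transfer function:

```python
import math

def noise_in_codes(noise_v, bits=16, v_range=10.0):
    """How many ideal ADC code steps a given noise amplitude spans."""
    lsb = 2 * v_range / 2**bits
    return noise_v / lsb

def noise_in_bits(noise_v, bits=16, v_range=10.0):
    """Effective number of low bits buried in that noise (0 if sub-LSB)."""
    codes = noise_in_codes(noise_v, bits, v_range)
    return math.log2(codes) if codes > 1 else 0.0
```

On these assumptions, even a modest 1 mV of noise on a ±10 V range spans more than three codes, i.e. the bottom bit and a half or so of the raw reading is already meaningless.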

The only situation where scaling is indeed an issue is when you configured the wrong settings for your measurement, e.g. a thermal measurement using a K-type thermocouple but configured as a J-type.

 

So, all in all, I dare say that you are "putting the cart before the horse".

You should look into the specifications of your device to learn about its standard measurement errors. Add those to the errors introduced by your connection/signal source, and then you can start doing proper error correction (statistically).

 

just my 5 cents,

Norbert

----------------------------------------------------------------------------------------------------
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.
Message 3 of 11

Norbert, I didn't mention the application or the reason for needing this measurement method, so most of what you said is, frankly, irrelevant and unhelpful, and purely based on your assumptions about my requirements and application.

 

Thank you for your reply, but you entirely avoided actually answering my question.

Message 4 of 11

Uh, there's no reason to become unkind.

 

Gerd already explained how to read raw values from DAQmx (it's simply an option in the DAQmx Read function), so there was no need to repeat it.

But since you stated that you want to read raw values in order to prevent errors, I wanted to point out that this is not likely to happen. It is your choice whether to listen to me…

 

Norbert

Message 5 of 11
And don't forget that the raw data is uncalibrated as well as unscaled. Unless you get the array of scaling coefficients, your raw data is next to useless. Your rude comment to Norbert was uncalled for.
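As a sketch of what those coefficients do: DAQmx exposes per-channel polynomial scaling coefficients (the AI.DevScalingCoeff property in the C API; `ai_dev_scaling_coeff` on a channel in the Python nidaqmx package, if I remember right), and applying them to a raw code is plain polynomial evaluation. The coefficient values below are made up for illustration:

```python
def scale_raw(raw_codes, coeffs):
    """Apply DAQmx-style polynomial scaling: volts = sum(coeffs[i] * raw**i)."""
    return [sum(c * code**i for i, c in enumerate(coeffs))
            for code in raw_codes]

# Made-up coefficients: zero offset, ~305 uV per code, no higher-order terms.
example_coeffs = [0.0, 3.05e-4]
```

Real devices typically report a handful of coefficients (offset, gain, and small higher-order correction terms), which is exactly the calibration information a raw reading throws away.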
Message 6 of 11

Apologies, Norbert, for the rude reply, it's been a frustrating morning, although that's no excuse.

 

I don't think this data is completely useless to me, though. The whole point of getting it is to have an input comparable to a previous system, which used the raw ADC data from a homebrew DAQ and compared it against limits to test a product. It also made noise and error measurements, so I'm trying to recreate that as closely as I can to avoid deviating from the current process.

Message 7 of 11

Ah, so you are interested in getting comparable values. The point is that the raw data depends on device- and driver-specific settings. So even if you read raw data from the NI device, it may be way off compared to your "old" device.

 

The best option would be to run reference tests and use them to build a "scaling table" for mapping new measurement values. If the table contains only factors of '1', the new device is indeed 1:1 comparable to the old one. In that case you would naturally discard any scaling in your software, as the reading is already "the right value" (read: the value you would expect!).

But if, for any reason, the table contains values other than '1', you have to scale each reading from the new device in your software before you have "comparable" values.
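A minimal sketch of deriving such a mapping from reference tests, assuming a simple straight-line relationship between the two devices (every number below is made up; a real scaling table may need more points and a per-range fit):

```python
# Hypothetical reference-test data: the same stimuli measured by both systems.
old = [100.0, 500.0, 1000.0, 2000.0]   # old homebrew DAQ readings
new = [203.0, 1001.0, 2002.0, 3998.0]  # NI device readings for the same stimuli

# Least-squares straight-line fit new = gain * old + offset, done by hand
# so it needs nothing beyond the standard library.
n = len(old)
mean_old = sum(old) / n
mean_new = sum(new) / n
gain = (sum((o - mean_old) * (m - mean_new) for o, m in zip(old, new))
        / sum((o - mean_old) ** 2 for o in old))
offset = mean_new - gain * mean_old

def to_old_scale(new_reading):
    """Map an NI reading back onto the old system's scale."""
    return (new_reading - offset) / gain
```

With the made-up data above the fit recovers a gain near 2, so every NI reading would have to pass through `to_old_scale` before being compared against the old limits.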

 

I understand that this requirement (ending up, at the end of the day, with measurement values comparable to the old system) can be very hard and nasty. Note that your approach does not check whether the previous measurement values were correct; they are taken "as is".

But keep in mind: if there was a systematic measurement error in the old system, you are now trying to reproduce that error as exactly as possible.

 

Norbert

 

EDIT: Remember that devices exhibit thermal drift, so measurements taken within roughly the first 30 minutes after power-on are usually off. NI provides a feature called "self-calibrate" to counter this drift, so your scaling table might also need to take it into account. Also, I would expect the old system NOT to have taken this into consideration, so I dare say (an assumption!) that your old values may not even be comparable to one another already!

Message 8 of 11
But if the old DAQ used a simple scale, you would be comparing apples to oranges. The floating-point values you get from the DAQmx driver are accurate; the raw values less so.
Message 9 of 11

Yeah, my plan was to run a product through the existing system a few times and get some values out, then do the same with the NI DAQ and determine the scaling from that. It might not be completely identical, but it only has to be within a certain range. If the last device didn't account for thermal drift, the range specified for it might be larger than required anyway. It's not an exact science and there will always be uncertainties in the measurement differences, but I think as long as I make sensible compromises it should be okay.

Message 10 of 11