Quantify variance of pH measurement, collaboration

There are a lot of questions regarding pH on the forum, and a lot of answers that seem to fall into two camps:

1)  Just use a standard data acquisition VI and be happy
2)  You can't get the resolution you want, so buy a meter and just record the meter output via the RS-232 cable

1) is not practical if 2) is correct, but 2) assumes that the error associated with 1) makes it useless. I want to quantify the variance associated with using a pH probe connected directly to a DAQ system. I have an application where I need to monitor 20+ channels of pH.  For that many channels, it's worth my time to do a study of the variance associated with the acquisition of raw data from a pH probe via LabVIEW to see if I can get enough resolution in the measurements for what I need a) without amplification, and b) with amplification.  I've designed the physical experiment, but I need help with the VI implementation.  I am very new to LabVIEW, and I don't trust myself to write a VI from scratch that will do what I need it to do for this.

I will publish my results as a white paper, and would be happy to share them with the community so that next time this comes up, there's a justification for 1) above, or a basis for 2). 

I am looking for help either in writing a DAQmx VI that will get me what I need, or, alternatively, from someone who understands LabVIEW better than I do and could evaluate this VI:

http://zone.ni.com/devzone/cda/epd/p/id/1432

I am running DAQmx, and this VI is not supported--apparently there are a couple of sub-VIs that are no longer available, so that is one thing that will need to be changed in order to run this VI (assuming that it's any good in the first place, something that I am not qualified to evaluate).

My hardware is a cDAQ-9172 with an NI 9215 BNC module.  I am connecting directly to the pH probes via BNC connections.  My setup allows for sampling from 8 probes on the test stand, plus a pair of control probes connected to an Orion 720A+.  The Orion values could be logged in LabVIEW, but for what I am doing right now, this is not critical. For starters, I am going to look at pH standards of 4, 7, and 10 and quantify the variance: is it independent of pH or a function of pH, and what can I get out of the system in terms of reliable measurement?

I'd sure appreciate any help with this.  I am going to handle all the analysis, $$, writing, etc., but I'd sure appreciate some help in selecting/building an appropriate tool for the data collection.

Thanks,

Shaun
sdustin@cc.usu.edu


Message 1 of 15
Shaun,

Typical pH probes have very high source impedances. This means that connecting them to a device which has less than very, very high input impedance results in loading of the probe output and inaccurate results. The probe impedance is not well characterized so trying to compensate in software is not likely to be very effective. I think you will want a buffer amplifier per channel at minimum.

The voltage range of pH instruments is about +/- 735 mV (I am working from memory so the exact values may differ a bit). If your DAQ device does not have a range which matches fairly closely to this, you give up resolution. You can use the buffer amplifier mentioned above to provide some voltage gain to take full advantage of the DAQ input range.
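
As a rough illustration of what the range means for resolution (back-of-envelope arithmetic only, assuming a 16-bit converter such as the 9215 and the ideal ~59 mV/pH slope at 25 degC; this is Python for illustration, not LabVIEW code):

# Back-of-envelope resolution estimate: size of one ADC code (LSB) in volts
# and the equivalent pH step, assuming a 16-bit converter and the ideal
# ~59.16 mV/pH Nernst slope at 25 degC. Illustrative arithmetic only.
NERNST_SLOPE_V_PER_PH = 0.05916  # V per pH unit at 25 degC

def lsb_volts(full_scale_span_v, bits=16):
    """Voltage step represented by one ADC code over the given span."""
    return full_scale_span_v / 2 ** bits

for span in (20.0, 2.0):  # +/-10 V (as on the 9215) vs. a +/-1 V range
    lsb = lsb_volts(span)
    print(f"+/-{span / 2:.0f} V range: LSB = {lsb * 1e6:.0f} uV "
          f"= {lsb / NERNST_SLOPE_V_PER_PH:.4f} pH per code")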

pH probe outputs vary with temperature. Some instruments have temperature sensors for each probe, while others sense the ambient temperature or require the user to enter a temperature. If all of your probes will be at the same temperature or the temperatures are held constant, one sensor or user input may be sufficient. Otherwise a temperature sensor per channel may give better results. But that means twice as many analog input channels are required.
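
The temperature dependence comes mostly from the Nernst slope of the electrode. A minimal sketch of the ideal slope (real electrodes deviate from this, so treat the numbers only as a feel for the size of the effect):

# Ideal Nernst slope of a glass pH electrode vs. temperature:
# slope = ln(10) * R * T / F, in volts per pH unit. Real electrodes
# deviate from this, so use it only as a feel for the magnitude.
import math

R = 8.314462   # gas constant, J / (mol K)
F = 96485.332  # Faraday constant, C / mol

def nernst_slope_v_per_ph(temp_c):
    return math.log(10) * R * (temp_c + 273.15) / F

for t in (15, 25, 35):
    print(f"{t} degC: {nernst_slope_v_per_ph(t) * 1000:.2f} mV/pH")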

Lynn
Message 2 of 15
Right.  I understand the issues with impedance; I've acquired some amplifiers so that I can look at the differences between amplified and unamplified signals.  I also appreciate the comment on the range of the DAQ device.  It's +/- 10 V; I need to look and see if I can get a device with a lower range.  If I use the amplifier, I am less concerned with the quality of the cable, and I can clip off the BNC connector and use a device that has screw/clip terminals.

Preliminarily, I plugged a probe into the device and checked the voltage output with standards of 4 and 10.  The range was 0.4 V over this span of pH.  Would you suggest that, if I can find one, I would be better off with a +/- 1 V device given this output?
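
Just as a back-of-envelope check on that reading (rough arithmetic only; the 0.4 V figure is from my quick test, everything else is assumed):

# Back-of-envelope from the quick check: ~0.4 V between the pH 4 and pH 10
# standards. Rough estimate only, to see whether a +/-1 V range would cover
# the full pH 0-14 swing.
span_v, span_ph = 0.4, 10 - 4
slope = span_v / span_ph        # implied slope, ~0.067 V per pH unit
full_swing = slope * 7          # output at pH 0 or pH 14, relative to pH 7
print(f"implied slope: {slope * 1000:.0f} mV/pH")
print(f"expected output at the pH extremes: about +/-{full_swing:.2f} V")

If that holds, the full pH 0-14 swing should stay comfortably inside a +/- 1 V range.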

I think the critical path is really the software at this point.  I can get hardware and work through those issues, but a piece of software that allows me to log the data and calibrate to my standards is critical; that's where I'm stuck. I really appreciate the thoughts on impedance and range.

Shaun


Message 3 of 15
Shaun,

The +/- 1 V range would probably be good. You would be using 3/4 of the DAQ input range. That allows a bit of headroom but still gives good resolution.

I tried to do something similar to your project for a chemistry teaching lab back in the LV 3 days. That system had 16 or 32 stations with a pH probe and amplifier, a simple keypad, and a text display at each station. I think we had planned to allow a two point calibration or operation with default values for less accurate measurements. Funding issues prevented completion of that project.

Spend some time planning and designing your software. Do all channels get sampled sequentially, or is random access required? What is the desired sample rate per channel? Are all probes calibrated simultaneously or individually? Is the calibration automatic or manual? What is done with the data acquired? Is it saved to a file, displayed on numeric indicators or charts, or used in a process control system? Do you have one chart per channel, or one or a few multi-channel charts? What kind of file format? What kinds of user interaction are required? Are there actions (such as calibration) which may only be performed by certain operators?

After getting these specifications, you can select a program architecture, then work your way to the specifics of the program.

Lynn
Message 4 of 15
Lynn,

Thanks for the input. Your application does sound similar; in some ways more complicated than mine, and in some ways simpler, but it sounds like we both wanted $700 meter capability for $250 in parts plus the sunk cost of LabVIEW.

The end result of this is going to be instrumentation on a set of bioreactors for my dissertation, but strictly for monitoring at this point.  I'm not trying to control pH, just observe it so that I can tell if the reactors are stable.  That's why the resolution isn't critical if I can get a linear response with acceptable errors, and why it's so important to me to quantify those errors. 

For the purposes of this effort, however, what I'd like to do to start is develop a simple VI that will allow me to make observations on the readings and plot them.  If that effort shows a linear response with equivalent variance between the three points (4, 7, 10), then that tells me that a more refined VI similar to the one I referenced above is a possibility, and I can develop a single-channel pH meter VI with documented variance for the probes I want to use and allowance for calibration. This is the point where the work that will be helpful to others might diverge from what I am really after, but with this core VI, people should be able to develop their own applications and have some confidence in the capabilities of the tool.
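
For what it's worth, here is a minimal sketch of the calibration/variance analysis I have in mind (in Python/NumPy rather than LabVIEW, and with made-up numbers standing in for real readings), just to make the statistics concrete:

# Minimal sketch of the analysis: fit a straight line through the buffer
# readings (pH 4, 7, 10) and look at the spread of repeated readings at
# each standard. The voltage values below are made-up placeholders.
import numpy as np

buffer_ph = np.array([4.0, 7.0, 10.0])

readings = {                       # repeated readings (V) at each standard
    4.0:  np.array([0.165, 0.167, 0.166]),
    7.0:  np.array([-0.002, 0.001, 0.000]),
    10.0: np.array([-0.171, -0.169, -0.170]),
}

means = np.array([readings[ph].mean() for ph in buffer_ph])
stdevs = np.array([readings[ph].std(ddof=1) for ph in buffer_ph])

# two-parameter linear calibration: pH = a * volts + b
a, b = np.polyfit(means, buffer_ph, 1)

for ph, m, s in zip(buffer_ph, means, stdevs):
    print(f"pH {ph:4.1f}: mean {m * 1000:7.2f} mV, "
          f"std {s * 1e6:6.1f} uV (~{abs(a) * s:.4f} pH)")
print(f"calibration: pH = {a:.2f} * V + {b:.2f}")

The same spread-per-standard comparison would show whether the variance is independent of pH or a function of it.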

If I can't get that far, then I need to look for more money and buy some pH meters, and LabVIEW will not have been as valuable to the project as I had hoped. But at least there would be documentation on the website of the real-world challenges of measuring pH in LabVIEW without a dedicated meter for each channel.

I am also not planning on looking at temperature compensation; the variance associated with that is small relative to my allowable error, and my process runs at constant temperature.  Someone else might want to pick up that torch some other time.

Ultimately, I'd like to be able to record a moving average that flattens out the variance, sampled at 4 hr intervals (it's an anaerobic process and things happen very slowly; if I sample much more often than that, I'll have mountains of data over the course of a 3 month experiment).  Given that, I have a lot of flexibility in how I sample--sequential sampling would be just fine.  The probes would (I assume) be calibrated individually and manually. I envision the data being charted on a per-reactor basis (3 readings) and a per-replicate basis (9 readings).  I'd like user interaction (viz. me) to be minimal, but we'll see about that. Since it's a research-level application, I'm not too concerned about access control--nobody cares but me (-;.  But that's all beyond the scope of what I'm trying to do with the first push, and beyond anything that will help anyone else here.  The real key at this point is the simple VI to see if it's possible, and the documentation of the more complex VI that actually measures pH.
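
To make the sampling scheme concrete, here is a rough sketch of the logic I have in mind, written in Python with the nidaqmx driver package purely for illustration (the real thing would be a LabVIEW VI, and the channel names, burst size, and file path are all assumptions):

# Rough sketch of the sampling scheme described above: every 4 hours, take a
# short burst of readings from each channel, average the burst to knock down
# noise, and append one line per interval to a text log. Python and the
# nidaqmx driver package are used here only to illustrate the logic; channel
# names, burst size, and file path are assumptions.
import time
import numpy as np
import nidaqmx

CHANNELS = "cDAQ1Mod1/ai0:7"   # 8 probes on the NI 9215 (assumed names)
SAMPLES_PER_BURST = 200        # software-timed readings averaged per interval
INTERVAL_S = 4 * 3600          # log every 4 hours

with nidaqmx.Task() as task, open("ph_log.csv", "a") as log:
    task.ai_channels.add_ai_voltage_chan(CHANNELS, min_val=-10.0, max_val=10.0)
    while True:
        # one on-demand sample per channel, repeated to form a burst
        burst = np.array([task.read() for _ in range(SAMPLES_PER_BURST)])
        means = burst.mean(axis=0)        # averaged voltage for each probe
        log.write(time.strftime("%Y-%m-%d %H:%M:%S") + "," +
                  ",".join(f"{v:.6f}" for v in means) + "\n")
        log.flush()
        time.sleep(INTERVAL_S)

The structure (configure channels, then loop: read, average, log, wait) is the same thing a LabVIEW VI would do with a DAQmx Read inside a while loop.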

My big problem is that I have only had LabVIEW Basics 1 and I've never actually developed a VI.  I have a pretty good idea of where I want to go, but I'm not really sure how to get there.  I'd really like to see if I can use the VI I pointed to above, but frankly, I don't understand most of what is going on in there, and I'm hesitant to use something that's a black box--besides the minor point that it just doesn't work with DAQmx (two missing/no-longer-supported sub-VIs).

I suppose another option would be to build a terminal block, have my 20 probes come into it with a single output to my Orion 720A+ (which would then output to LabVIEW), and have LabVIEW control a switch that changes channels and sequentially records the data to a table.  But that would be lame, and not scalable. It would work and probably be simpler, but I don't want to go there unless this study doesn't work out.

Thanks again for your interest and your input.  I'd sure appreciate any additional comments; you've been very helpful.

Shaun


Message 5 of 15
Shaun,

The best way to learn LabVIEW is to dive in and try some things.

Since you are having trouble understanding the example and cannot run it, start by looking at the DAQ examples. In the Help menu, select Find Examples... Try the acquire and graph examples. One or more of those should allow you to take some data from a probe/signal conditioner. Then look at the diagram(s) of the DAQ examples and copy and paste parts into the pH meter example until you can acquire data with that.

The pH meter example was written in LV6.1 which used the traditional DAQ drivers. It is a little heavy on local and global variables for my taste, but it should serve as a good learning tool.

If you have specific questions, please ask those on the Forum. Many participants here are willing to help. What version of LV are you using?

Lynn
Message 6 of 15
Lynn,

Thanks for the input on the existing VI and the vote of confidence.  If the basis is good then this will be a good starting point for me to jump in with. I'll give it a shot and see if I can't get it figured out.  If I can, then that VI should be a good basis for the measurement system I want to end up with.

I am using LV 8.2

Shaun
Message 7 of 15

Hello Shaun,

Thanks for contacting National Instruments. 

It looks like Lynn has gotten you off to a great start in programming your application.  I just want to add that National Instruments has tons of support information regarding LabVIEW and DAQmx on its website.  The LabVIEW support page has links to many popular support documents, including Learn LabVIEW in Three Hours (a refresher tutorial on the basics of LabVIEW programming) and Rules to Wire By (a great resource on LabVIEW code formatting in case you ever need someone to help you debug).

The Getting Started with DAQmx page has many examples and tutorials on data acquisition.  Also, Learn 10 Functions in NI-DAQmx and Handle 80 Percent of Your Data Acquisition Applications provides some good information regarding the different VIs used when programming data acquisition.

The NI 9219 CompactDAQ module offers user-selectable input voltage ranges, which gives better resolution at smaller input ranges.  If you do have questions about hardware upgrades, you can contact NI directly and speak with a technical representative who can assist you in finding the right product for your needs.

Please post back if you have any questions. 

Regards,
Browning G
FlexRIO R&D
Message 8 of 15
A proper pH meter has an input impedance of over 10^12 ohms; that's 1000 gigaohms.  No NI card can boast this input impedance.  They can also measure to 0.003 pH (0.15 mV) over a range of +/-1 V, better than 12-bit resolution.

There are REASONS why a pH meter costs a few hundred dollars.

If you're measuring in a bio process, I'm presuming that your pH electrode has a thickened electrolyte.  If this is so, your resolution won't be the limiting factor of your measurements.

Look for a book "H. Galster : pH Measurement, VCH-Weinheim, 1991".  Read it from start to finish to understand where nearly everyone making pH measurements is making mistakes.

It seems like your approach will work, but at one stage or another, you're going to realise the shortcomings.

Oh, and I've used LabVIEW to interface with pH meters a lot in the past.  Not a waste of time at all.  It can be very useful.

Shane.


Using LV 6.1 and 8.2.1 on W2k (SP4) and WXP (SP2)
Message 9 of 15
Thanks for the advice.  The difficulties you describe speak directly to my point, however. There are a lot of applications that don't require a resolution of 0.003 pH units; in my case, if the variance allows me to say with confidence that I am within 0.3 units (wastewater bioprocess), I can define my process parameters and set high/low alarms.

In length measurement, you can use a micrometer, a set of calipers, a scale, or an odometer--you can choose a tool to suit your application. In the case of pH, you can use a meter ($$$$$) or a test strip ($/2).  The data quality is a direct reflection of the $. What if I only need $$$ data?  I'm going to give this a shot and see where I end up. I want to quantify the error that comes from using an NI card for this, and publish the results so that people have a sound basis for deciding whether or not to take this approach.  That's all. And I can't afford to buy 20 pH meters; if the statistical analysis says the data isn't useful for my application, that's fine, but at least I'll know what the limitations are.  And NI will too, so they can either stop saying it's possible, or add the necessary caveats to make the tools useful.  Better yet, they could build a module that would actually work for this.  They would probably sell a few of them too.
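
Just to put my 0.3 unit target next to the numbers discussed earlier (rough arithmetic only; it assumes the ideal ~59 mV/pH slope and a 16-bit converter on the +/-10 V range):

# How the 0.3 pH target compares with the raw resolution discussed earlier.
# Rough arithmetic only; the practical limit is far more likely to be noise
# and probe loading than converter resolution.
target_ph = 0.3
slope_v_per_ph = 0.05916        # ideal Nernst slope at 25 degC
lsb_v = 20.0 / 2 ** 16          # one 16-bit code over the +/-10 V range

print(f"0.3 pH is about {target_ph * slope_v_per_ph * 1000:.1f} mV")
print(f"one ADC code is about {lsb_v * 1000:.3f} mV "
      f"(~{lsb_v / slope_v_per_ph:.4f} pH)")

So on paper the converter resolution is not the bottleneck for my target; the question is how much noise and loading error end up on top of it, which is what the study is meant to quantify.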

I looked for that text, and the only copy I could find was in the UK for $370 US.  Maybe they have one in the university library.

Thanks again for the input.  As far as electrode selection goes, I was planning on trying several types.  I've built a setup that will allow me to check up to 9 simultaneously, so I could do 3 types in triplicate and compare the noise they produce.

Shaun
Message 10 of 15