
Simple VI for old gas analyser

Hi Guys,

I'm a complete newbie and I know this is a big ask, but I'm trying to resurrect an old gas analyser. I need to write a small LabVIEW 2014 program to log the outputs, and I need some help getting started.

In their original life the analysers had four chart recorders attached, and I can pick off the eight signals (suitable hardware will be bought if I can get this to work).

Each of the four analysers outputs two signals. Signal 1 is a voltage (0-10 V) that represents the range, as shown below:

Range           Low voltage (V)   High voltage (V)
1 (4 ppm)       0.6               1.4
2 (10 ppm)      1.6               2.4
3 (40 ppm)      2.6               3.4
4 (100 ppm)     3.6               4.4
5 (400 ppm)     4.6               5.4
6 (1000 ppm)    5.6               6.4
7 (4000 ppm)    6.6               7.4
8 (10000 ppm)   7.6               8.4

The second signal is 4-20 mA, which represents where you are within that range.

This should give four outputs between 0 ppm and 10,000 ppm. These signals then pass through a simple K-factor multiplier (i.e. divide by X) to give a corrected figure for the concentration of gas measured.

Ideally the front panel would have two digital displays per channel (range and ppm) and one box to enter the K factor.

The range + ppm + K factor would then be written to a suitable file on the hard drive.
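
To give a worked example (made-up readings, using the table above): a range voltage of 4.1 V falls in the 3.6-4.4 band, so the analyser is on range 4 (100 ppm full scale); a loop current of 8 mA is (8 - 4)/16 = 0.25 of full scale, i.e. 25 ppm; and with a K factor of 2 the logged concentration would be 12.5 ppm.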

Can anyone help with this?

Message 1 of 14

Hi bongo,

 

as a newbie you should take the beginner courses offered for free by NI on their website…

 

That being said: what have you tried so far? Can you attach your VI? (Nobody will do your work for you unless you hire them…)

 

- When you "round to nearest" your voltage you get the range. Use this to index an array of range factors.

- Convert your current reading to a value: current-factor = (current reading - 4) / 16 [all values in mA]

- Then multiply range factor and current factor to get the final result…
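
In text form (LabVIEW itself is graphical, so this shows only the arithmetic), the recipe might look like the following Python sketch; the full-scale values come from the table in your post, while the function name and the K-factor handling (divide by K, per your post) are just illustrative assumptions:

```python
# Round the range voltage to get the range number, look up the full-scale
# ppm, scale by the position within the 4-20 mA loop, then divide by K.

RANGE_FULL_SCALE_PPM = [4, 10, 40, 100, 400, 1000, 4000, 10000]

def concentration_ppm(range_voltage, current_ma, k_factor=1.0):
    range_number = round(range_voltage)            # 0.6-1.4 V -> 1, 1.6-2.4 V -> 2, ...
    full_scale = RANGE_FULL_SCALE_PPM[range_number - 1]
    current_factor = (current_ma - 4.0) / 16.0     # 4-20 mA mapped to 0..1
    return full_scale * current_factor / k_factor  # K factor divides, per the post

print(concentration_ppm(2.9, 12.0))  # range 3 (40 ppm), half scale -> 20.0 ppm
```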

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 14

Hello Bongo_ben,

 

If I understand correctly, you want someone to create the VI you need for you. I am sorry to inform you that, as GerdW implied, this is something we do not do.

 

I do however have some links to help you get started.

 

Example of free training regardless of license and service contract status: 

http://www.ni.com/webcast/2613/en/ (found at ni.com or by searching with google).

 

http://www.ni.com/getting-started/

http://www.ni.com/training/

http://www.ni.com/labview/buy/

 

I would also like to mention that with a LabVIEW license and service contract you can do courses online for free. When you buy LabVIEW it includes a 1-year service contract. With these courses I have no doubt you will be able to create a sufficient VI. And if you still struggle after making a real attempt, we are here to help.

 

Best regards,

Jonas.H

Message 3 of 14

Adding to what's been said, here are a few more suggestions.  I'm guessing that the original recorder had an 8-position Range switch and drove a pen motor with the current, with 4 mA being the "low" reading and 20 mA being "full scale".  You are probably going to want to convert these currents into a voltage, say by measuring the voltage across a 500 ohm resistor (4 mA = 2 V, 20 mA = 10 V).
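
In code terms, converting the measured shunt voltage back to loop current is a one-liner of Ohm's law; here is a minimal Python sketch, assuming the 500 ohm value suggested above (the function name is just for illustration):

```python
# Convert the voltage measured across a 500-ohm shunt back to loop current.
SHUNT_OHMS = 500.0

def shunt_volts_to_milliamps(volts):
    return volts / SHUNT_OHMS * 1000.0  # I = V / R, scaled to mA

print(shunt_volts_to_milliamps(2.0))   # 4.0  (low end of the 4-20 mA loop)
print(shunt_volts_to_milliamps(10.0))  # 20.0 (full scale)
```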

 

An inexpensive device from NI, the NI USB-6009, has several analog input channels, some (low-current) analog outputs, and digital I/O ports.  I believe it can sample at 1 kHz (maybe even faster, but that should be more than adequate for your task), and it easily "talks" with LabVIEW through a USB port on your PC.

 

If you have one and want to get started collecting data from it, there are a number of good tutorials on DAQmx out there.  However, the simplest way to start with any device in LabVIEW is to use MAX, the Measurement and Automation Explorer.  You plug your device in (you need to install the NI Device Drivers if you haven't done so already), MAX finds your device, and you can open Test Panels for it.  With the 6009, you can simply say "Start acquiring analog data on AI0 at 1 kHz" and it will happily do so, plotting the data so you can see it.

 

Bob Schor

Message 4 of 14

Hi Bob,

Thanks for the practical advice. I found a 6009 kicking around in the back of the lab, together with a cDAQ-9188 chassis and an NI 9211 thermocouple module.  I have had a play with these in the past and managed to import, display and record temperature data.

 

My original intent with the post was just to get numbers on a screen to keep management quiet for a few days while I got my head around a new language.

 

It didn't work out that way, so I had to fall back on DASYLab and an old IOtech Data Shuttle.  If it's of any interest, I've included a few screen dumps of the worksheets. As you can see, I used the time-slice function for the range-scale function and then bundled it in a black box; to this were added input displays and scaling modules.  This has proved successful, and the analysers are performing as expected despite the obvious damage done by HF.

 

The main issue I was having with LabVIEW was how to replicate the slice function for range vs. scale, and whether there is a multi-input calibration (and of course a time scale).

The next stage is to replace the shuttle with something modern and integrate the system with the emergency extraction alarms etc.

 

The problem here is I'm a chemist, not a programmer, and it's possible NI is not the right tool for the job.

 

Thanks for your practical input.

 

Regards

Ben

Message 5 of 14

@bongo_ben wrote:

The problem here is I'm a chemist, not a programmer, and it's possible NI is not the right tool for the job.

Regards
Ben

As a scientist, you should define the problem better! (Smack once on the left cheek!) The issue I heard from your post is: "I'm not sure the tools I have are correct to produce results that prove or disprove an hypothesis."

 

If all you have ever used as a tool is a hammer, you are not going to make the best use of a screw!

 

So, in "The junk yard" you found some stuff like a 6009, and a thermocouple?  Would it it be nice if you defined the question, then proposed an hypothesis, then (and only then) explored methods to disprove that null question?  After you know your experiments requirements, we might help you construct your schoolwork.... But, I think I just gave you the better advice.


"Should be" isn't "Is" -Jay
Message 6 of 14

Based on the black box, DASYLab is scaling each input based on the value of a channel (in the image it's channel 0) to determine how to scale the next channel.

The time slice function is a selector... based on the value of the data at the X input, it will use the value of one of the input channels and output that scaled channel.

 

Pretty creative, but it probably could have been done using Scaling with a table of reference values (piecewise linear scaling). I'd have to look at the logic in the actual worksheet to determine that.

 

In programming-language terms it's a case statement; that may help you LabVIEW guys figure out how to help him.
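
For the LabVIEW folks, here is a rough text rendering of that selector as a case statement (Python 3.10+; the channel layout and names are illustrative assumptions, not read from the actual worksheet):

```python
# Case-statement view of the DASYLab time-slice selector: the range channel
# picks which per-range scaled value is passed through to the output.
def select_range_output(range_number, per_range_values):
    match range_number:
        case 1: return per_range_values[0]   # 4 ppm full scale
        case 2: return per_range_values[1]   # 10 ppm
        case 3: return per_range_values[2]   # 40 ppm
        case 4: return per_range_values[3]   # 100 ppm
        case 5: return per_range_values[4]   # 400 ppm
        case 6: return per_range_values[5]   # 1000 ppm
        case 7: return per_range_values[6]   # 4000 ppm
        case 8: return per_range_values[7]   # 10000 ppm
        case _: raise ValueError(f"unexpected range {range_number}")
```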

 

Failing that, I'm more than happy to talk to you about how to update DASYLab to the latest release and use it with your National Instruments device. DASYLab continues to be a fully supported product. 

Measurement Computing (MCC) has free technical support. Visit www.mccdaq.com and click on the "Support" tab for all support options, including DASYLab.
Message 7 of 14

Great.  Here's how to get started:

  1. Get two resistors (for now, the value doesn't matter, say 10K -- we'll get to that later).  Place one across pins 2 and 3 of the 6009 (AI0 and AI4), the other across pins 5 and 6 (AI1 and AI5).  It helps if you have the screw terminal strips plugged in.
  2. Plug the 6009 into your LabVIEW PC.  Did your PC "recognize" it as a 6009?  If not, you'll need to install the NI Device Drivers.
  3. Start NI MAX (Measurement and Automation eXplorer).
  4. Open Devices and Interfaces, select the USB-6009, and choose Test Panels.
  5. On the Analog Input tab, choose AI0, On Demand, Differential (meaning it uses AI0 and AI4 as the inputs), and press Start.  You should see a trace that hovers right around 0.  If you touch one side of the resistor plugged into that port, the noise level should increase (you are acting as an antenna for 60-cycle mains noise).
  6. Try the same thing with AI1.
  7. Experiment.  See what Finite and Continuous modes do (you'll need to play with Rate and Samples to Read).

Everything you've just done in MAX you can do in LabVIEW quite easily, especially if you create a Task in MAX (see the Create Task button next to Test Panels?).  You'll need to use three or four DAQmx functions -- DAQmx Start Task, DAQmx Read, DAQmx Stop Task, and DAQmx Clear Task (I strongly urge you to avoid using the DAQ Assistant), wiring into Start Task the name of the Task you create in MAX for your two input channels.
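
LabVIEW itself is graphical, so it can't be pasted here as text, but the same task pattern can be sketched with NI's nidaqmx Python package; this is a minimal sketch, assuming the 6009 enumerates as "Dev1" (a task created in MAX could replace the explicit channel setup):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

# Start/Read/Stop pattern; the with-block clears the task on exit.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # range signal
    task.ai_channels.add_ai_voltage_chan("Dev1/ai1")  # 4-20 mA via shunt
    task.timing.cfg_samp_clk_timing(
        rate=1000.0,                       # 1 kHz sample clock
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=100,
    )
    task.start()                           # DAQmx Start Task
    data = task.read(number_of_samples_per_channel=100)  # DAQmx Read
    task.stop()                            # DAQmx Stop Task
# data[0] holds the ai0 samples, data[1] the ai1 samples
```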

 

Now you need to chat with someone who knows more about electronics than I as to what values of resistors you should use for the Range and PPM outputs.  I'd mentioned 500 ohms for the latter, but Wiser Minds might suggest building a small circuit.  The other thing to think about is whether or not the instrument outputs things on a linear or logarithmic scale (just so you scale your plots properly).

 

Now it is just a matter of designing what you want to do.  It has already been mentioned how you could change a voltage reading on the Range input into one of eight fixed scales.  Think about Front Panel design (you won't need a Range switch, since it is being provided for you, but may want a Range indicator -- how do you want it to look?  How often do you want the display to update?  Do you want it to scroll, like a pen recorder?  How fast?  How often should you be sampling the data?).

 

The Tutorials should get you familiar enough with LabVIEW to start making headway on this.  I think you'll find that LabVIEW is well-suited to this task, as it (a) has the ability to take "real-world data", (b) has the concept of "time" so you can take "samples" at defined intervals, (c) has an intuitive means of designing a GUI with meaningful controls and indicators, and (d) has an intuitive graphical way of designing "code".

 

Bob Schor

Message 8 of 14

My EE colleague says that you should be fine wiring Range directly into AI0/AI4 and using a 500 ohm resistor across AI1/AI5, wiring the two current leads to those same ports.  You'll also want to bring a Ground connection from your Analyzer to 6009 Gnd (I believe Pin 1 is a Gnd).

 

BS

Message 9 of 14

Hi Bob,

Sorry about the delay in replying, but things at work are moving very swiftly. As I alluded to in the original post, the simple VI was only to demonstrate the linearity, stability and response of the instrument. The DASYLab base program has achieved this and the customer is keen to move on to the next phase (oh, the joy of gate meetings).

With both NRC and consumable budgets now on partial release, I'm under pressure to raise purchase requisitions for hardware, so I'm seeking your advice. The gas analyser uses 8 channels, but additional instruments will be required. I'm working on an original estimate of 42 channels in total, but this might increase as customer requirements change. Initially I was looking at a pair of NI cDAQ-9188 chassis (because I have one already) populated with NI 9211, NI 9215 and NI 9203 modules as required, but I'm unsure whether these are compatible with DASYLab (yes, I do appreciate the irony of that on a LabVIEW forum, but I need a fallback plan). Alternatively, one of my colleagues swears by the Delphin Expert Key, which claims to be compatible with LabVIEW, DASYLab and Delphin's own software. This would allow the project to proceed in DASYLab with the option of using LabVIEW if required at a future date.

Before others start: yes, I do appreciate there are some superb sub-contract programmers out there. But just as someone in my position fails to understand the subtle nuances of LabVIEW as a language, and is therefore unable to fully specify what is required, the situation is mirrored by the programmers not fully understanding what we're trying to achieve and by the customer moving the target. I speak from experience; some projects are best left in house (unless you're really keen on being phoned at 4:30 am just to change a line of code). Anyway, I seem to have gone off on a tangent. Any experience, advice or recommendations you can offer on hardware would be appreciated. Now the immediate pressure is off, I can work through your instructions and recommended background reading at leisure.

Kind regards, Ben

Message 10 of 14