Technical Question!!!

This question concerns BridgeVIEW 2.0 and automation in electrochemistry.

I have developed a program that drives our media, using our electronics, to
perform coulometric titrations in a very short time. When I first started
the project, I had the program acquire data at 60 Hz (60 points per second,
the default set by NI) while applying a constant current to our media in
order to titrate. As we progressed we found that our calibration statistics
improve considerably if we acquire data at 600 Hz (600 points per second)
instead of 60 Hz, which requires a preconfigured "AI Config", "AI Start" and
"AI Read". Now, once the program is started, it presets a buffer of 3000
points (equivalent to 5 seconds of real time) and acquires data at 600
points per second.
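
For reference, the acquisition setup amounts to the following. This is only
a rough sketch in modern Python using NI's nidaqmx driver, not the actual
BridgeVIEW code, and the device/channel name "Dev1/ai0" is an assumption:

import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 600       # samples per second (the default was 60)
BUFFER = 3000    # 5 seconds of data at 600 S/s

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")           # "AI Config"
    task.timing.cfg_samp_clk_timing(rate=RATE,
                                    sample_mode=AcquisitionType.CONTINUOUS,
                                    samps_per_chan=BUFFER)
    task.start()                                                # "AI Start"
    data = task.read(number_of_samples_per_channel=BUFFER)      # "AI Read": one 5 s block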

In addition to the above, we also found that Strong and Weak solutions,
whether they are acids or bases, show better linearity if we titrate them by
applying what we call a "Current Ramp". In other words, when the program is
set to titrate a Strong Acid it does not apply the maximum current entered
by the user right away, as it would if constant current were applied;
instead it gradually increases the current with respect to time until the
maximum current is reached.
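
To make the idea concrete, here is a minimal Python sketch of such a ramp
(the 4-second ramp time, 10 updates per second and 2 mA maximum are example
numbers only, not the values in our VI):

import numpy as np

def current_ramp(i_max, ramp_seconds, total_seconds, update_hz):
    # Linear ramp from 0 to i_max over ramp_seconds, then hold at i_max.
    t = np.arange(0.0, total_seconds, 1.0 / update_hz)
    return np.minimum(i_max * t / ramp_seconds, i_max)

setpoints = current_ramp(i_max=2.0e-3, ramp_seconds=4.0,
                         total_seconds=5.0, update_hz=10)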

When we analyze the data we are interested in three parameters:
1. The output generated by the electrochemistry (the pH difference created
by the applied current)
2. The cell output (the output generated by the media)
3. The calculated first derivative of the pH signal output, which gives a
peak or valley (a short sketch follows below)
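
The third parameter is just a numerical first derivative of the acquired pH
trace; in Python terms it would be something like this (the VI does the same
thing graphically):

import numpy as np

def first_derivative(ph, sample_rate_hz=600):
    # dpH/dt of the acquired trace; the titration endpoint shows up
    # as a peak or valley in this signal.
    dt = 1.0 / sample_rate_hz
    return np.gradient(ph, dt)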

Attached is an Excel file containing two worksheets so you can visually see
the difference between Current Ramp and Constant Current. If you look at the
Excel file, the "Current Ramp" gives me negative "spikes" (which I marked
with arrows in the Excel file) in the first 2 seconds of the run, due to the
ramp. It seems that the ramp does not increase the current correctly with
respect to time: it looks like it applies the maximum current by the 1st
second, when it is supposed to increase the current gradually until the 4th
second, where the maximum current is reached. Can anyone help me understand
why this is happening and how I can fix it? If you need me to attach the
actual VI so you can look at the code and try to help me solve this, or if
you need my contact information, let me know.
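
As a quick sanity check on the ramp arithmetic (our AO update runs once
every 100 ms, as I explain in a later reply): reaching the maximum at the
4th second should take 40 updates, so each step should add i_max/40. If the
increment is derived from the wrong count, the ramp saturates much earlier,
which is what the data looks like. For illustration only:

update_period_s = 0.100                             # one AO update per 100 ms
t_max_intended = 4.0                                # maximum current intended here
n_updates = int(t_max_intended / update_period_s)   # 40 updates
step_intended = 1.0 / n_updates                     # each step should add i_max/40

step_too_big = 1.0 / 10                             # e.g. increment computed from only 10 updates
t_max_actual = 10 * update_period_s                 # ramp saturates at 1.0 s instead of 4.0 s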



[Attachment StrongWeak.xls, see below]


[See first answer for additional information]
Message 1 of 6
[Attachment(s) for question]
Message 2 of 6
Hi Nikos,

I will venture a guess.

If you "zoom-in" on the data from which you are deriving the first derivative, you will notice a "miniature" version of what you see in the "constant current" case (erf fuction maybe?).
Based on this observation, I am inclined to believe that your "cell" is responding to each of the voltage steps that are being applied, and establishing a new equilibrium for each.
Each of these shifts in equilibruim gives a spike.
Have you looked closely at the "Current" drive signal that is driving the cell? How often is it being updated?
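
As a toy illustration of this guess, in Python with made-up numbers: drive a
first-order "cell" with a staircase rather than a smooth ramp, and the
derivative of its response shows one spike per step.

import numpy as np

fs, tau = 600.0, 0.2                     # sample rate (Hz) and an assumed cell time constant (s)
t = np.arange(0.0, 5.0, 1.0 / fs)
drive = np.clip(np.floor(t) / 4.0, 0.0, 1.0)    # current stepping up once per second

response = np.zeros_like(t)              # first-order response of the "cell" to each step
for i in range(1, len(t)):
    response[i] = response[i - 1] + (drive[i] - response[i - 1]) * (1.0 / fs) / tau

spikes = np.gradient(response, 1.0 / fs)   # one spike per shift in equilibrium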

I hope this helps,

Ben
Message 3 of 6
Hi Ben,

I know, and I can see where those spikes are coming from. The spikes are
there because of the pH-generated output: when the cell voltage increases in
such large "steps", instead of in small gradual steps with respect to time,
the pH output also contains larger signal differences (in step form as
well), which the derivative calculation then interprets as spikes. My
problem, however, is how to get rid of this or fix it.

To answer your question about how often the "AO One Pt Channel" updates: I
would like to ask whether it would be a good idea to attach my VI to one of
these group responses. The reason is that the code is quite complicated, and
I think you will get a much better understanding if you simply look at it.
To give you a short answer, though, the AO One Pt Channel update is located
within a "case sequence". When the case is true, the VI applies a "Current
Ramp". Within this case sequence I have placed a "Wait Until Next ms
Multiple" set to 100 ms. I am assuming the channel (or case sequence) is
updated every 100 ms, correct? Let me know if you would rather I attach the
VI with my next response.
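
In plain-Python terms the ramp loop does roughly this (write_current stands
in for "AO One Pt"; the real code is a BridgeVIEW diagram, and the step
count is just an example):

import time

def write_current(amps):
    pass   # stand-in for the "AO One Pt" single-point update

period = 0.100                    # "Wait Until Next ms Multiple" = 100 ms
i_max, ramp_steps = 2.0e-3, 40    # example: reach i_max after 40 updates (4 s)

for step in range(1, ramp_steps + 1):
    write_current(i_max * step / ramp_steps)
    time.sleep(period - (time.monotonic() % period))   # park until the next 100 ms boundary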

Thanks for your help.
Nikos.



"Ben" wrote in message
news:101-506500000005000000FB190000-982303670000@quiq.com...
> Hi Nikos,
>
> I will venture a guess.
>
> If you "zoom-in" on the data from which you are deriving the first
> derivative, you will notice a "miniature" version of what you see in
> the "constant current" case (erf fuction maybe?).
> Based on this observation, I am inclined to believe that your "cell"
> is responding to each of the voltage steps that are being applied, and
> establishing a new equilibrium for each.
> Each of these shifts
in equilibruim gives a spike.
> Have you looked closely at the "Current" drive signal that is driving
> the cell? How often is it being updated?
>
> I hope this helps,
>
> Ben
Message 4 of 6
Hi Nikos,

The appropriate R/C network (with a cap across the cell) should smooth out the steps.
The "Wait Until Next..." can be funny. If your code is clean and quick, it will do what you want. But if things take too long, you may not get back to the "Wait Until..." before the system clock gets to the next 100 ms multiple. If that is the case, there may be 200 ms (or more) between updates.
To get around this issue, I suggest you do not do single-point AO.
Do a continuous output and take advantage of the hardware to control the timing.
Another suggestion would be to use another input channel to monitor the actual current, then use an XY chart with current on the X-axis. This will eliminate the complications of timing and let you see effect vs. cause.
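
For the continuous-output idea, a modern equivalent would look something
like the sketch below (Python with nidaqmx, shown only to illustrate; the
device name and the 10 S/s update rate are assumptions, not values from your
VI):

import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

update_rate = 10                                           # AO updates per second
ramp = np.linspace(0.0, 1.0, 4 * update_rate)              # 0 -> full scale over 4 s
ramp = np.concatenate([ramp, np.full(update_rate, 1.0)])   # hold the maximum for 1 more second

with nidaqmx.Task() as ao:
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(rate=update_rate,
                                  sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=len(ramp))
    ao.write(ramp, auto_start=True)       # the board's sample clock paces every point
    ao.wait_until_done()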

If you would like, you can e-mail me at bar@dsautomation.com.

I can't guarantee I will be able to look at your code, but I can give it a try.

Ben
Message 5 of 6
I sent you an e-mail with the VI we talked about. I hope we can work this
out further.

Thanks


"Ben" wrote in message
news:101-5065000000050000000B1A0000-982303670000@quiq.com...
> Hi Nikos,
>
> The appropriate R/C network (with cap across the cell) should smooth
> out the steps.
> The "wait until next..." can be funny. If your code is clean and
> quick, it will do what you want. But if things are taking to long, you
> may not get back to the "Wait until..." before the system clock get to
> the next 100ms multiple. If this is the case, it may be 200ms (or
> more)between updates.
> To get around this issue, I suggest you do not do AO single points.
> Do a continuous output and take advantage of the hardware to control
>
the timing.
> Another suggestion would be to use another input channel to monitor
> the actual current. Then use an XY chart with current on the X-axis.
> This will eliminate the complications of timing and let you see effect
> vs cause.
>
> If you would like you can e-mail me at bar@dsautomation.com
>
> I can't garentee I will be able to look at your code, but I can give
> it a try.
>
> Ben
Message 6 of 6