
Problem with...

Hello everyone,
I am using the 'Real Time PID Control VI' for a control application with a
PCI NI 6035E card.
I have made some modifications to the VI in order to customize it for the
needs of my application: I added another analog input channel and a 'save to
file' option, and I made some small modifications to the 'simple PID'
subVI (added initialization for the error and the error sum). While making
these modifications, I noticed that when the time difference (dt) is
calculated, there is a condition that dt should be greater than 0.04 sec; if
it is not, dt is set to 0.04 sec. I don't really understand why there is
such a condition, since I can use much higher sampling rates than
1 / 0.04 = 25 samples per sec. I changed that condition to a much smaller
number and everything seemed to work fine. However, having some problems
with the response of the closed-loop system, I started wondering whether
that condition is some kind of lower limit on the differential time step
the VI can work with.
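
For reference, here is a minimal text sketch (in Python, since the original is a LabVIEW block diagram) of the kind of dt handling described above; the names pid_step and MIN_DT are made up for illustration, not taken from the NI VI. Clamping dt to a floor keeps a very small or zero measured interval from blowing up the derivative term.

MIN_DT = 0.04  # lower bound on dt, as in the shipped VI

def pid_step(error, prev_error, error_sum, dt, kp, ki, kd):
    # Clamp dt so a tiny (or zero) measured interval cannot
    # produce a huge derivative term.
    dt = max(dt, MIN_DT)
    error_sum = error_sum + error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * error_sum + kd * derivative
    return output, error_sum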

Can anybody help me?

Thank you in advance,
M.
Hi Manos,

I suspect the limit is based on the operating system.
Win98 and the like can only give you system clock resolutions on the order of 40 msec. The resolution is finer on NT machines.

I suggest you add code to your app to track what the loop iteration time really is. When you get down to those kinds of small time intervals, you start to run into indeterminism problems. That's when LabVIEW Real-Time (LVRT) should be used.
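
As a rough check of the clock-resolution point, here is a small sketch (Python rather than LabVIEW; the function name timer_resolution is made up) that polls the clock until it ticks over and keeps the smallest jump it sees, which approximates the software timer granularity on a given OS:

import time

def timer_resolution():
    # Poll the clock until it moves, several times, and keep the
    # smallest non-zero jump seen: a rough estimate of the tick size.
    smallest = None
    for _ in range(10):
        start = time.monotonic()
        now = start
        while now == start:          # spin until the clock ticks over
            now = time.monotonic()
        step = now - start
        if smallest is None or step < smallest:
            smallest = step
    return smallest

print("approximate clock resolution: %.6f s" % timer_resolution())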

Just trying to help. Let me know what you think.

Ben
Hi Ben,

thank you for your response.

I do use LV on Win98 (unfortunately LVRT is not available), but I cannot
understand why the 'Data Remaining' indicator of the 'AI Single Scan' VI
shows that the FIFO is empty (no data is overwritten, so the VI is supposed
to be keeping up in real time) at an acquisition rate of 2000 S/s (0.5 msec!).
Is this possible?

Thank you again,

Manos

"Ben" wrote in message
news:506500000005000000475D0000-1011517314000@exchange.ni.com...
> Hi Manos,
>
> I suspect the limit is based on the operating system.
> Win98 and the like can only give you system clock resolutions on the
> order of 40msec. This is smaller on NT machines.
>
> I sugest you add code to your app to track what the loop iteration
> time really is. When yo
u get down to those kind of small time
> intervals, you start to run to indeterminsim problems. That when LVRT
> should be used.
>
> Just trying to help. Let me know what you think.
>
> Ben
Manos,

I suspect that you are referring to two different things. Your AI Single Scan is reading from the buffer if you are doing a buffered acquisition. This means that your DAQ board is using its own internal clock to time the acquisition and streaming the readings it takes to memory. This has nothing to do with your loop time.

For control, you also have to be able to process all of these points, and output a control signal based on this input. That's the true control loop time.

Your AI Single Scan is reporting that the FIFO is empty because it is probably reading the entire buffer, in which case there wouldn't be anything left in the buffer after the read.
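
A toy numeric illustration of this point (Python, with an assumed 50 ms software loop period purely for the example): the board buffers samples at its own hardware rate, and if each slower software iteration reads the whole backlog, the reported 'data remaining' after the read is always zero.

sample_rate = 2000.0   # hardware sample clock, S/s (the rate Manos mentions)
loop_period = 0.05     # assumed software loop period, seconds (much slower)

backlog = 0
for iteration in range(5):
    backlog += int(sample_rate * loop_period)   # samples the board buffered since the last read
    read = backlog                              # the read drains everything available
    backlog -= read
    print("iteration %d: read %d samples, %d remaining" % (iteration, read, backlog))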

Mark
Hi Manos,

I think you can answer your own question by writing a simple VI that measures how long each iteration of a loop takes.

Create a while loop that keeps the current time stamp in a shift register, subtracts the previous time stamp (from the shift register) from the current one, and displays the difference. Put a "Wait (ms)" inside the loop with a control wired to its input.

Run the VI and observe how the iteration time varies with the value passed to the wait function.

What I believe you will see is that when you get down to the loop times you are looking for, the actual iteration time will vary wildly.
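
A rough text analogue of that VI (Python instead of a block diagram; wait_ms stands in for the value wired to the "Wait (ms)" control): keep the previous time stamp the way the shift register would, subtract it from the current one each pass, and watch how the measured period scatters at small wait values.

import time

wait_ms = 10  # stands in for the value wired to the wait control
prev = time.monotonic()
for _ in range(20):
    time.sleep(wait_ms / 1000.0)
    now = time.monotonic()
    print("iteration time: %.2f ms" % ((now - prev) * 1000.0))
    prev = now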

Try it,

Ben