LabVIEW

How can I determine how long it takes my LabVIEW loop to execute once?

I'm working on a LabVIEW temperature-control simulation that works with an external temperature device. The simulation lets me turn an external light (the heat source, mounted inside a chamber) on or off; the temperature inside the chamber is measured and displayed versus time on an X-Y graph. When I toggle the light on or off, I can see the waveform change.

The information displayed on the graph is also being saved to a data file, and I noticed that as soon as I run the simulation, the first input into the while loop comes from that data file. The first iteration of the loop starts as soon as the data file sends an input into the while loop, and the second iteration happens the next time the data file inputs something.

Using logic inside the while loop, how can I record the actual time of the initial input into the loop, along with the time of the second input, so that I can determine how long one execution of the loop takes?

Message 1 of 7
Your description of your program is hard to follow.  To measure the execution time of a loop, you can use the "Tick Count (ms)" function from the "Time & Dialog" palette.  The only catch is that its resolution is one millisecond.  Normally, to measure loop time, you would execute the loop many times and take the average, which overcomes the resolution limit.
 
If you just want to record the time on every iteration, you can use that function, or others from the same palette, to get the current time.
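Since LabVIEW block diagrams are graphical, here is the averaging idea as a rough C sketch. tick_count_ms() and do_one_iteration() are made-up stand-ins for Tick Count (ms) and the body of your while loop; a POSIX clock backs the tick counter here purely for illustration:

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

/* Hypothetical stand-in for LabVIEW's Tick Count (ms): a free-running
   unsigned 32-bit millisecond counter. */
static uint32_t tick_count_ms(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint32_t)(ts.tv_sec * 1000 + ts.tv_nsec / 1000000);
}

/* Placeholder for the work done in one iteration of the while loop. */
static void do_one_iteration(void) {
    volatile double x = 0.0;
    for (int i = 0; i < 100000; i++) x += i * 0.5;
}

int main(void) {
    const int n = 1000;                       /* run many iterations... */
    uint32_t start = tick_count_ms();
    for (int i = 0; i < n; i++)
        do_one_iteration();
    uint32_t elapsed = tick_count_ms() - start;

    /* ...and average, so per-iteration times well below the 1 ms
       resolution of the tick counter can still be resolved. */
    printf("average iteration time: %.3f ms\n", (double)elapsed / n);
    return 0;
}
```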
Message 2 of 7

Hi There,

I had a similar problem, so I used Tick Count to measure the execution of the while loop. Put one Tick Count in the first frame of the sequence and another in the last frame, then subtract the first tick-count value from the second; this gives your loop duration in milliseconds.
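In text form, that first-frame/last-frame subtraction looks like the sketch below. tick_count_ms() is the same hypothetical stand-in for Tick Count (ms) as above, and the nanosleep() call merely simulates the loop's work:

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

/* Same hypothetical stand-in for Tick Count (ms) as in the earlier sketch. */
static uint32_t tick_count_ms(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint32_t)(ts.tv_sec * 1000 + ts.tv_nsec / 1000000);
}

int main(void) {
    for (int i = 0; i < 5; i++) {
        uint32_t t_first = tick_count_ms();       /* "first frame" */

        struct timespec nap = {0, 20 * 1000000};  /* ~20 ms of "work" */
        nanosleep(&nap, NULL);

        uint32_t t_last = tick_count_ms();        /* "last frame" */
        printf("iteration %d took %u ms\n", i, t_last - t_first);
    }
    return 0;
}
```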

-Mohan

Message 3 of 7
One thing to keep in mind about the Tick Count VI: it uses the millisecond clock from the CPU, which rolls over back to zero once it reaches its upper limit. You might end up getting very large negative values if you subtract the previous value from the current one across a rollover. The safer option is to use Get Date Time in Seconds, which outputs a timestamp. Don't worry, you can still subtract timestamps, and they also have millisecond resolution.
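For comparison outside LabVIEW, here is a rough C analogue of the timestamp approach; now_seconds() is a made-up helper standing in for Get Date Time in Seconds, backed here by the POSIX real-time clock:

```c
#include <stdio.h>
#include <time.h>

/* Rough analogue of Get Date Time in Seconds: a timestamp as
   fractional seconds since the epoch. */
static double now_seconds(void) {
    struct timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);
    return (double)ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    double t0 = now_seconds();

    struct timespec nap = {0, 50 * 1000000};  /* ~50 ms of "work" */
    nanosleep(&nap, NULL);

    double t1 = now_seconds();
    /* Timestamps subtract directly, and a double has far more range
       than a 32-bit millisecond counter, so rollover is not a concern. */
    printf("elapsed: %.3f s\n", t1 - t0);
    return 0;
}
```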
Jarrod S.
National Instruments
Message 4 of 7

"

One thing to keep in mind about the Tick Count VI. It uses the ms clock from the CPU which rolls over back to zero after it reaches an upper limit. You might end up getting very large negative values if you subtract the current value from the last value.

"

Once every ~49.7 days (2^32 ms) that can happen, but are you sure about the large negative thing?

I seem to remember Rolf saying that the math takes care of itself. A quick experiment confirms that memory.
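The same quick experiment is easy to reproduce in any language with 32-bit unsigned arithmetic; here is a minimal C version, with tick values made up to straddle the rollover:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* Simulated tick readings just before and just after the counter
       wraps from 0xFFFFFFFF back to 0. */
    uint32_t before = 0xFFFFFFF0u;   /* 16 ms before the wrap */
    uint32_t after  = 0x00000010u;   /* 16 ms after the wrap  */

    /* Unsigned subtraction is modulo 2^32, so the difference is the
       true elapsed time even across the rollover. */
    printf("u32 difference: %u ms\n", after - before);  /* prints 32 */

    /* A "large negative value" only appears if the readings are
       widened to a signed type before subtracting. */
    printf("naive signed:   %lld ms\n",
           (long long)after - (long long)before);       /* -4294967264 */
    return 0;
}
```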

Ben

Message 5 of 7
Touché! 🙂 The U32 math does indeed take care of itself. People have always warned me about the rollover. I suppose I should have double-checked it for myself.
Jarrod S.
National Instruments
Message 6 of 7

"People have always warned me about the roll over. "

That warning is a good note if we are comparing magnitudes directly (say, checking whether one tick value is greater than another) rather than taking differences.

Ben

BTW: Is that Point Park in Pgh?

I am in McMurray PA


Message 7 of 7