LabVIEW

Can anyone explain the concept behind introducing delays in loops?

What's the in-depth reasoning behind introducing delays in loop execution? How does it improve the performance of VIs?
Message 1 of 4
I know of the following reasons:
1. A delay prevents the loop from tying up computer resources; the computer can perform other tasks during the waiting period (a rough sketch follows this list).
2. It improves UI responsiveness, especially where you update the UI with large amounts of data, as in a graph. Updating the UI involves data copying, so the less often you update, the more time the system has to perform other tasks.
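
LabVIEW code is graphical, so there is no exact text equivalent, but here is a rough Python analogy of point 1. The sleep call plays the role of LabVIEW's Wait (ms) function; stop_requested is just a placeholder for whatever the loop actually polls:

import time

def stop_requested():
    # Placeholder for whatever the loop polls in LabVIEW
    # (a front-panel Boolean, a hardware line, a queue, ...).
    # Here it simply stops the demo after 2 seconds.
    return time.monotonic() - start > 2.0

start = time.monotonic()
iterations = 0
while not stop_requested():
    iterations += 1
    # Without this sleep the loop spins as fast as the CPU allows and
    # monopolizes a core; with it, the loop wakes only ~100 times per
    # second and the OS can run other tasks during the waits.
    time.sleep(0.01)          # analogous to a Wait (ms) of 10 ms

print(iterations, "iterations in 2 s")   # ~200 with the sleep in place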
Message 2 of 4
Adding to what Aderogba said,

1) Re: screen updates. The eye can only register about 30 updates per second, so updating a screen more often than that is a waste of resources (see the sketch after this list).

2) When more than one operation is taking place at the same time, the delay allows other LabVIEW code to use the CPU.
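
As a rough Python sketch of point 1 (acquire_point and update_graph are placeholders, not real APIs): data can be acquired as fast as needed while the display is redrawn at most ~30 times per second.

import time

UPDATE_INTERVAL = 1.0 / 30          # ~30 screen updates per second is plenty
last_update = 0.0

def acquire_point():
    # Placeholder for the real acquisition (DAQ read, serial read, ...).
    return time.monotonic()

def update_graph(value):
    # Placeholder for the actual indicator/graph update.
    print("redraw with", value)

start = time.monotonic()
while time.monotonic() - start < 1.0:           # run the demo for one second
    value = acquire_point()                     # acquire as fast as needed
    now = time.monotonic()
    if now - last_update >= UPDATE_INTERVAL:    # but redraw at most ~30x/s
        update_graph(value)
        last_update = now
    time.sleep(0.001)                           # small wait frees the CPU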

Summarized:
The delay prevents the wasting of resources.

Ben
Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper
Message 3 of 4
> What's the in-depth reasoning behind introducing delays in loop execution?
> How does it improve the performance of VIs?

Other posts gave you the reasons, but here are a few more details.

If you build a loop that does something simple like polling some
Booleans and you place a Wait of 1 ms in the loop, the loop will run
about 1000 iterations per second. If you use a 10 ms delay, it runs
about 100 times per second. If you take the delay out of the loop and
watch its iteration count, you will find that it runs somewhere around
20,000,000 times per second. In other words, no delay means run as
fast as possible.

Doing the math, the overhead for checking a couple of Booleans is about
50 nanoseconds. Doing that 20,000,000 times per second totally consumes
the CPU, using 1 second of CPU time per second. A 1 ms delay, or 1000
iterations per second, uses 50 microseconds of CPU time each second,
which is 5/1000ths of a percent, or virtually no CPU time. A 10 ms delay
uses even less, yet the loop still runs faster than the monitor updates
and way faster than the human eye or human reflexes.
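
If you want to sanity-check those numbers, here is the same back-of-envelope arithmetic in Python (nothing LabVIEW-specific, just the figures from the paragraph above):

# ~50 ns per iteration, taken from the 20,000,000 iterations/s free-running rate
per_iteration = 1.0 / 20_000_000

for wait_ms in (0, 1, 10):
    iters_per_sec = 20_000_000 if wait_ms == 0 else 1000 / wait_ms
    cpu_seconds = iters_per_sec * per_iteration
    print(f"Wait {wait_ms:2d} ms: {iters_per_sec:>12,.0f} iterations/s, "
          f"{cpu_seconds * 100:.4f}% of one CPU")

# Output: 100% of a CPU with no wait, 0.0050% with a 1 ms wait,
# and 0.0005% with a 10 ms wait.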

Greg McKaskle
Message 4 of 4