LabVIEW


Timed while loop vs normal while loop in RT for DAQmx


Hi,

 

I wanted to understand the benefit of a timed while loop for data acquisition in RT.

If I am acquiring an analog input at a sampling rate of 1 kS/s and reading 1000 samples per channel, I know that each loop iteration will take 1 sec to execute. In such a case, what is the importance of a timed while loop?

In the image below, what extra benefit will I get by using a TWL? Being data driven, the normal while loop will complete the acquisition in 1 sec anyway.

Won't a TWL consume more processor resources?

Message 1 of 11

If you know you want to read a fixed # of samples every iteration, use a regular While Loop (like in your pic).   A Timed Loop would be not only unnecessary but a bona fide bad idea due to timing *overconstraint*.

 

In general, it's important to choose 1 way to control loop timing rather than 2.  I've seen a lot of posters run into trouble where they'd both request 1 sec worth of samples from DAQmx Read as their 1st timing constraint and *also* add a 1000 msec wait in the same loop as a 2nd timing constraint.  If any given iteration was ever a little late, there'd be more than 1 sec worth of samples in the DAQmx buffer, but only 1 sec worth would get read.  This small backlog would tend to grow slowly and never shrink, eventually erroring out the task.

    Even worse (in many cases), up until that error the app kept acting on staler and staler data, probably unknowingly.
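
For anyone who prefers a text API to a block diagram, here is a minimal sketch of that single-timing-source pattern, expressed with the nidaqmx Python API (device and channel names are hypothetical): the hardware-clocked DAQmx Read is the only thing pacing the loop, with no extra wait and no Timed Loop.

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    # Hypothetical device/channel; 1 kS/s hardware-clocked continuous acquisition.
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        task.timing.cfg_samp_clk_timing(rate=1000.0,
                                        sample_mode=AcquisitionType.CONTINUOUS,
                                        samps_per_chan=10000)  # buffer size hint
        task.start()
        for _ in range(60):
            # The read itself paces the loop: it blocks until 1000 samples
            # (1 sec worth) are available. No wait timer, no Timed Loop.
            data = task.read(number_of_samples_per_channel=1000)
            # ...process the fixed-size chunk here...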

 

(Note: there *can* be a valid use case for setting timing with a Timed Loop or a msec wait timer.  In those cases, you should call DAQmx Read with the magic # -1, which means "all available samples".   This approach leads to variable-sized packets of DAQ data, but every packet will always include the most recent samples.)
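
A comparable sketch of that "all available samples" variant, again with the nidaqmx Python API and hypothetical names, where a software timer is the single timing constraint and the -1 read empties the buffer each iteration:

    import time
    import nidaqmx
    from nidaqmx.constants import AcquisitionType, READ_ALL_AVAILABLE

    # Hypothetical device/channel; the sleep sets the loop rate, and the read
    # drains whatever has accumulated since the previous iteration.
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        task.timing.cfg_samp_clk_timing(rate=1000.0,
                                        sample_mode=AcquisitionType.CONTINUOUS)
        task.start()
        for _ in range(60):
            time.sleep(1.0)  # the software timer is the only timing constraint
            # READ_ALL_AVAILABLE (-1): packet size may vary, but the newest
            # samples are always included and no backlog can build up.
            data = task.read(number_of_samples_per_channel=READ_ALL_AVAILABLE)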

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 11

The Timed Loop has a lot of features compared to the normal While Loop. The timing features of the Timed Loop will not help you much in your case, since the timing is done by the DAQmx Read function. The Timed Loop executes the code inside it in a single thread to reduce jitter, and it has a higher execution priority than normal While Loops. So if your RT system has multiple modules running in parallel and has to acquire data without jitter, it would be better to use a Timed Loop for the data-acquisition module and normal While Loops for the other modules.

Due to its additional features, the Timed Loop has slightly higher overhead than the normal While Loop.

Lucian
CLA
Message 3 of 11

I have 7 acquisition loops in my RT application, and one loop for data transfer to the host.

Do you think that using multiple timed loops will not use more processor resources? The sampling rate of the loops is 1 kS/s.

Message 4 of 11
Solution
Accepted by topic author falcon98

I'm assuming that your tasks use a hw sample clock to achieve 1 kHz sampling.  I further suppose that you want your 7 tasks to start and remain in sync, both in terms of sampling and also in terms of data transfer.

 

If so, I think you'll be much better off choosing a fixed # of samples to request from DAQmx Read and then iterating with a regular While Loop.  Fixed-size data structures are generally a good idea in RT, and they may be helpful over on the host side too.

 

Methods that use Timed Loops will be at risk either of timing overconstraint (as I described earlier) or variable-sized packets.  Either would affect the correlation of the data you transfer from the 7 independent tasks & loops.

 

You'll still need to take some care to make the hardware-level sampling for those 7 tasks both start and remain in sync.   That's more of a general DAQmx thing though, separate from Loop type and RT concerns.
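
One common way to handle that hardware-level sync is to have one task own the sample clock and start trigger and route them to the others. Here is a rough sketch with the nidaqmx Python API (hypothetical device names; the actual signal routing depends on your RTSI cabling or PXI backplane):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    # "Master" task owns the 1 kHz sample clock and the start trigger.
    master = nidaqmx.Task()
    master.ai_channels.add_ai_voltage_chan("Dev1/ai0:15")
    master.timing.cfg_samp_clk_timing(1000.0, sample_mode=AcquisitionType.CONTINUOUS)

    # Each additional task samples on the master's clock and waits for its trigger.
    follower = nidaqmx.Task()
    follower.ai_channels.add_ai_voltage_chan("Dev2/ai0:15")
    follower.timing.cfg_samp_clk_timing(1000.0, source="/Dev1/ai/SampleClock",
                                        sample_mode=AcquisitionType.CONTINUOUS)
    follower.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/ai/StartTrigger")

    follower.start()  # arm the followers first...
    master.start()    # ...then starting the master releases everyone together

Repeat the follower setup for each of the 7 tasks, then read a fixed number of samples from every task in the same While Loop iteration.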

 

 

-Kevin P

Message 5 of 11

Hi Kevin,

 

Thanks for your detailed answer. I understand that since DAQmx itself is sufficient for handling the timing of the loop, using a TWL should be avoided.

 

In my system I have five 6259 cards and seven 6225 cards, which are acquiring 160 channels at 1 kS/s and 400 channels at 50 S/s. I was wondering whether I really needed an LV RT-based system in this case, or whether a Windows-based system could have handled such data acquisition.

Since the system is already developed using LV RT I don't plan to change it, but I wanted to clarify this for my understanding.

 

Can you provide your comments on this?

Message 6 of 11

That kind of acquisition sounds readily do-able under Windows.  Even Windows provides hardware timing determinism when you do buffered acquisition using a hardware sample clock.

 

RT becomes important when you need software timing determinism.  Examples:

 

- control loops, especially for moderately high-speed processes where Windows' timing uncertainty could cause significant perturbations

- low latency decision-making logic.  "When set of conditions X is satisfied, immediately deactivate thingy A and activate gizmo B"
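
As a rough illustration of that second case (nidaqmx Python API, hypothetical channel names): a software-timed read/compare/write loop like the one below has its iteration latency at the mercy of the OS scheduler on Windows, which is exactly the jitter an RT target keeps bounded.

    import nidaqmx
    from nidaqmx.constants import LineGrouping

    # Hypothetical channels: one analog input to watch, two digital lines to drive.
    ai = nidaqmx.Task()
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # on-demand (software-timed) reads

    do = nidaqmx.Task()
    do.do_channels.add_do_chan("Dev1/port0/line0:1",
                               line_grouping=LineGrouping.CHAN_PER_LINE)

    while True:
        level = ai.read()            # one software-timed sample per iteration
        if level > 4.5:              # "set of conditions X"
            do.write([False, True])  # deactivate thingy A, activate gizmo B
            break

    ai.close()
    do.close()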

 

 

-Kevin P

Message 7 of 11

@Kevin_Price wrote:

RT becomes important when you need software timing determinism.  Examples:

- control loops, especially for moderately high-speed processes where Windows' timing uncertainty could cause significant perturbations

- low latency decision-making logic.  "When set of conditions X is satisfied, immediately deactivate thingy A and activate gizmo B"


If you don't do any of the stuff @Kevin_Price mentioned and are just doing straight data acquisition, you may want to rethink RT. (Just opinions here; I have never used DAQmx on RT, only RT and FPGA.)

  1. DAQmx for RT is relatively new; I do not know whether that means it is a bit "buggier" than the Windows version, or whether it is feature-complete compared to the Windows version.
  2. Disadvantages of RT (I haven't used it in a while, so these may have changed):
    1. Less sophisticated GUI handling, if any at all.
    2. Disk sizes may be limited. I had to use an external volume formatted with exFAT.
    3. Need to build two sets of VIs: one for the host, one for the target.

To give you an idea of what Windows systems can do: I have written a program that was able to stream 8 channels at 2 MS/s over a USB bus (NI 6366), stream that data to an external Thunderbolt disk (a USB disk would not keep up), while the user was able to see live data snapshots, FFTs of the data, etc. (@Kevin_Price is effectively a developer of that program through his contributions here on the forum, so take his advice seriously.) This program was able to run continuously for days at a time on a locked-down company computer. (It was not plugged into the network at the time, but it had AV among other intrusive services that were impossible to stop.)
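
For a sense of scale, the core read-and-write-to-disk loop of a program like that can be sketched in a few lines of the nidaqmx Python API (hypothetical device name and file path; a production version would use a producer/consumer structure and TDMS logging rather than a raw binary dump):

    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    CHUNK = 200_000  # samples per channel per read (0.1 s at 2 MS/s)

    with nidaqmx.Task() as task, open("stream.bin", "wb") as f:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")
        task.timing.cfg_samp_clk_timing(2_000_000,
                                        sample_mode=AcquisitionType.CONTINUOUS,
                                        samps_per_chan=10 * CHUNK)  # roomy buffer
        task.start()
        for _ in range(600):  # roughly one minute of data
            chunk = np.asarray(task.read(number_of_samples_per_channel=CHUNK))
            chunk.astype(np.float32).tofile(f)  # raw dump just for illustration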

 

Your program's data requirements are not large, and your snapshot of your program looks suspiciously like an NI example program (there are similar examples for the Windows side). My advice is to rewrite it for Windows, as you will end up having more options, unless you need any of the things @Kevin_Price previously mentioned.

 

mcduff

Message 8 of 11

Thanks @mcDuff for the reply.

 

But using multiple loops on Windows will slow down the OS. And since the acquisition duration is long, with critical data, Windows may hang at times.

 

Maybe for this reason RT is the suitable choice. The OS gives full resources to the program's execution and does not hang. Even if the host is down, the data is safe on the RT target.

 

Message 9 of 11

@falcon98 wrote:

But using multiple loops on Windows will slow down the OS. And since the acquisition duration is long, with critical data, Windows may hang at times.


Why would it hang with multiple loops? Unless the loops are greedy, in which case anything can hang. My programs have a lot of loops in them; my guess is most programs do. (Computers are good at doing repetitive tasks.)

 


@falcon98 wrote:

Maybe for this reason RT is the suitable choice. The OS gives full resources to the program's execution and does not hang. Even if the host is down, the data is safe on the RT target.


This is not quite true. The OS does give full resources to the program, but the program can still hang. That is why there are "Watchdogs" for RT.

 

If the host is down and data is being saved on the RT target, then that is true.

 

Let me assume you are running 1000 channels at 1 kS/s; I believe that is a conservative high-end number from your earlier posts. Your cards are 16-bit, so that is 2 bytes per sample, and your data rate is 1000 channels * 1000 S/s * 2 bytes = 2 MB/s. That is nothing. (You do not need the full resources of a PC for that type of acquisition.) I have taken data for up to a week continuously, with Windows, over a USB bus (much worse than a PXI backplane), at a 32 MB/s data rate.

 

Do what you are comfortable with. RT is great for some things, but if its real-time capability is not needed, then it can be limiting. Windows has a bad reputation, but since Windows 7 it has been extremely stable. Turn off automatic updates, power-saving modes, etc., and its uptime can be quite good. The DAQmx API is quite mature for Windows, well written, and extremely stable when programmed correctly.

 

mcduff

Message 10 of 11