LabVIEW


How to accelerate the sampling rate

I use hardware-timed single point mode to do some operations.
I want to achieve a sampling rate of 5 kS/s, but currently I only get 1 kS/s and error -200714 occurs.
Error -200714 tells me: Reduce your sample clock rate, the number of channels in the task, or the number of programs your computer is executing concurrently.
Is there something wrong with my program?
Is there a way to make the sampling rate faster?

Thanks!

[Attachments: block.jpg, error.jpg]

Message 1 of 6

Hi Max,

 


@MaxPan wrote:

I use hardware-timed single point mode to do some operations.
I want to achieve a sampling rate of 5 kS/s, but currently I only get 1 kS/s and error -200714 occurs.


On a default Windows computer it will be hard, if not impossible, to reach a sample rate of 5 kS/s by reading single samples!

And when you want a 5 kS/s sample rate, you should not set it to just 1 kS/s…

 


@MaxPan wrote:

Is there something wrong with my program?
Is there a way to make the sampling rate faster?


Reading just one sample after the other is the wrong approach for such sample rates.

You can get higher sample rates when you read multiple samples per DAQmx Read call…

(Did you examine the DAQmx example VIs that come with LabVIEW?)
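As a textual sketch of this advice (LabVIEW itself is graphical, so this uses the NI-DAQmx Python API instead; the device name "Dev1" and the 500-sample block size are assumptions for illustration, not from the original posts):

```python
# Buffered, hw-clocked AI with multi-sample reads, instead of
# "hw-timed single point". The DAQmx part requires the nidaqmx package
# and real hardware ("Dev1" is a placeholder); the helper runs anywhere.

def reads_per_second(sample_rate_hz, samples_per_read):
    """How often the loop must iterate when reading blocks of samples."""
    return sample_rate_hz / samples_per_read

try:
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        # hw-clocked, continuous, buffered acquisition at 5 kS/s
        task.timing.cfg_samp_clk_timing(
            rate=5000, sample_mode=AcquisitionType.CONTINUOUS)
        for _ in range(10):
            # 500 samples per call -> the loop only needs to run at 10 Hz
            data = task.read(number_of_samples_per_channel=500)
except Exception:
    pass  # nidaqmx not installed or no DAQ hardware present
```

The point is that the hardware clock paces the sampling, so the software loop no longer has to keep up with every individual sample.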

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 6

I have referenced the DAQmx examples,
but error -200714 still happens.

[Attachments: block.jpg, error.jpg]

Message 3 of 6

Hi Max,

 

Did you read the error explanation?

Did you try some lower sample rates?

 

Which kind of hardware do you use?

What happens when you use "1 sample on demand" software timing?

What's the reason to choose "hw timed single point" mode?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 4 of 6

If you're running in Highlight Execution mode, you'll fill the buffer and get an error asking you to slow down…

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 5 of 6

Very general thoughts & suggestions:

 

1. "Hardware Timed Single Point" is almost never the right choice when running under Windows.  It's probably only worth considering under RT.

 

2. It looks like you're trying to run a control loop at 5 kHz.  What's the process you're trying to control?  Most commonplace things don't have the inherent response bandwidth to *need* 5 kHz control.

 

3. You could perform AI *sampling* at >5 kHz, but you shouldn't expect to *iterate* your control loop at 5 kHz.  Higher-speed AI sampling is done with a hw-clocked buffered task.  You request a block of samples at a time (i.e., you don't *iterate* at the same rate you *sample*) and this adds *latency*.   The first data in the block represents what was going on a little while earlier.
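To put rough numbers on that latency point (the block sizes here are illustrative assumptions, not from the original posts):

```python
# Latency of a hw-clocked buffered read: with blocks of N samples at rate R,
# each read completes every N/R seconds, and the first sample in the block
# is about N/R seconds old by the time your loop sees it.

def block_read_timing(rate_hz, block_size):
    loop_period_s = block_size / rate_hz
    oldest_sample_age_s = loop_period_s  # first sample waited out the block
    return loop_period_s, oldest_sample_age_s

# 5 kS/s with 100-sample blocks: the loop runs at 50 Hz with ~20 ms latency
period_s, age_s = block_read_timing(5000, 100)
```

Larger blocks make the loop slower and the data older; smaller blocks reduce latency but demand faster, more Windows-sensitive loop iteration.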

 

4. This will be where you make tradeoffs.  A hw-clocked buffered task gives you regular sample timing but more latency.  A software-timed unbuffered task gives you sample times that are subject to Windows timing irregularity, but generally lower latency.

 

5. The output task should change to on-demand software timing.  (You do this by removing the call to DAQmx Timing.)

 

6. So now your control loop will run:

- slower but at a more consistent rate if you read same-sized blocks of data from a hw-clocked buffered AI task

- faster but at a more variable rate if you read single samples from a software-timed AI task
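A toy model of this tradeoff (the 0-2 ms Windows jitter figure and the 500-sample block are assumptions for illustration, not NI specifications):

```python
import random

# Point 6 as a simulation: a software-timed loop under Windows sees
# per-iteration jitter, so its timing is irregular; a buffered hw-clocked
# read iterates more slowly but at a rock-steady period.

random.seed(0)  # deterministic for demonstration

def software_timed_periods(n_iters, target_period_s=0.0002):
    # 5 kHz target -> 0.2 ms period; the OS adds 0..2 ms jitter per iteration
    return [target_period_s + random.uniform(0, 0.002) for _ in range(n_iters)]

def buffered_periods(n_iters, rate_hz=5000, block=500):
    # the hw clock paces the data: each read returns after exactly block/rate s
    return [block / rate_hz] * n_iters

sw = software_timed_periods(1000)
hw = buffered_periods(1000)
# hw periods are all identical (0.1 s); sw periods vary and overshoot target
```

This is only a caricature of Windows scheduling, but it captures why you pick between "slower and consistent" and "faster and variable".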

 

 

-Kevin P

 


CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 6 of 6