
Multifunction DAQ


When am I going to lose data during acquisition?


Hello,

I'm using an NI 9220 module in a cDAQ chassis. I have searched the various manuals and support pages, but I couldn't find the technical information I'm looking for.

I don't understand how DAQmx (in continuous mode) could lose data during communication between the cDAQ and the PC (if that is even possible). I know that my 16 ADC channels each have a sample rate of 100 kS/s. If all channels are acquiring, could the data buffer on the cDAQ lose data because the communication with the PC is too slow? Is that possible, or does the cDAQ pause the acquisition until the buffer has room again? And what is the maximum size of the cDAQ's memory buffer?

 

Thanks a lot, Nicola.

Message 1 of 6

First, you should not try to read faster than your communication bus can support.

If you do not read fast enough, the buffer on the host will be overwritten with new data, and you lose the overwritten samples.
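As an illustration of that overwrite behavior, here is a toy model in plain Python (no DAQmx involved): a fixed-size circular buffer where a slow reader loses the oldest samples when the writer wraps around.

```python
from collections import deque

# Toy model of a host-side circular buffer: when it is full, the
# oldest samples are silently overwritten by new ones (maxlen wraps).
buffer = deque(maxlen=8)

# The device writes 12 samples, but the host never reads in time:
# the first 4 samples (0..3) are overwritten and lost.
for sample in range(12):
    buffer.append(sample)

print(list(buffer))  # only the 8 newest samples remain; 0..3 are gone
```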

Santhosh
Soliton Technologies

New to the forum? Please read community guidelines and how to ask smart questions

Only two ways to appreciate someone who spent their free time to reply/answer your question - give them Kudos or mark their reply as the answer/solution.

Finding it hard to source NI hardware? Try NI Trading Post
Message 2 of 6
Solution
Accepted by topic author Nixe44

Just to add a little note:

 

The standard behavior of DAQmx is to latch itself into an error state (which it will report to you if you look for it) when the host-side buffer is about to be overwritten.   So you won't lose data without knowing about it.   That's the main thing I wanted to emphasize -- for as long as you are able to keep retrieving data from the task, you can be sure that none has been missed.

 

(There are ways to override this behavior and *allow* overwriting without latching an error, but you have to go out of your way to configure that behavior explicitly.)
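A rough sketch of the difference in plain Python (this is a toy model, not the real DAQmx API): by default an overflow latches an error that surfaces on the next read, and only an explicit opt-in flag allows silent overwriting.

```python
class TaskBuffer:
    """Toy model of DAQmx's host-buffer behavior (not the real API)."""

    def __init__(self, size, allow_overwrite=False):
        self.size = size
        self.allow_overwrite = allow_overwrite
        self.samples = []
        self.error = None  # latched error state

    def write(self, sample):
        if len(self.samples) >= self.size:
            if self.allow_overwrite:
                self.samples.pop(0)  # explicitly allowed: drop oldest
            else:
                self.error = "buffer overflow"  # latch; reads will fail
                return
        self.samples.append(sample)

    def read(self):
        if self.error:
            raise RuntimeError(self.error)  # you find out when you read
        return self.samples.pop(0)

# Default behavior: the overflow latches an error, so a later read
# raises instead of silently handing you data with a gap in it.
buf = TaskBuffer(size=2)
for s in (1, 2, 3):
    buf.write(s)
try:
    buf.read()
except RuntimeError as e:
    print("read failed:", e)
```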

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 3 of 6

Thanks a lot for your answer.

 

On the first point, is there any information about the size of the waveform data? I know it's an array, but how can I calculate the total size of all the samples being sent?
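(For reference, the arithmetic works out as follows, assuming the 16-channel 100 kS/s figures from earlier in the thread, scaled samples stored as 8-byte doubles, and raw ADC samples as 2 bytes -- the byte sizes are my assumptions, not from a manual.)

```python
channels = 16
rate = 100_000        # samples per second per channel (NI 9220 maximum)
bytes_scaled = 8      # assumption: scaled data as 64-bit floats
bytes_raw = 2         # assumption: raw 16-bit ADC codes

bytes_per_second = channels * rate * bytes_scaled
print(bytes_per_second / 1e6, "MB/s scaled")          # 12.8 MB/s
print(channels * rate * bytes_raw / 1e6, "MB/s raw")  # 3.2 MB/s
```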

 

Another point: is there a best way to set the number of samples to read? My question is about the DAQ settings. My sample frequency is fixed by my application and by the NI 9220 that I use. If I set a large number of samples, each data packet from the device to the PC is big, but there is more time between packets (my VI may run more slowly); with a small number of samples I get many packets from the instrument, but each one is smaller. How do I choose the best number of samples? Is there any guide for choosing it?

Message 4 of 6
Solution
Accepted by topic author Nixe44

I'll answer from a LabVIEW perspective as that's what I know.

 

The fact that you aren't sure about the # samples leads me to suppose you're doing Continuous Sampling.  And it turns out that the "# samples" input for DAQmx Timing is only treated as a suggestion.  DAQmx will make a host-side task buffer that's no smaller than that # samples, but it might be considerably larger.  See this article for details.

 

Part of the take-away lesson is that for Continuous Sampling, there's generally no need to optimize for a specific exact buffer size.  Just make it "big enough", and if you choose a size that's bigger than you really needed, it won't have any impact on you.

 

I personally will generally set my buffer size somewhere in the 2-10 second range, *knowing* that I don't really need it to be that big, but liking the safety of having a big margin of error for any PC CPU-starvation hiccups due to Windows doing unexpected and unasked-for things behind my back.
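That sizing rule is simple arithmetic. For example, assuming the 100 kS/s per-channel rate from earlier in the thread:

```python
rate = 100_000         # samples/s per channel
seconds_of_margin = 5  # anywhere in the 2-10 s range is fine

# This per-channel count is what you'd request as the buffer size;
# DAQmx may round it up, but never down.
samples_per_channel = rate * seconds_of_margin
print(samples_per_channel)  # 500000
```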

 

 

-Kevin P

Message 5 of 6

Thanks a lot! I have read the article you linked, and now everything is much clearer!

Message 6 of 6