AI Read Returning Empty Sets

Solved!

All,

 

I have the attached code in one loop of a VI. I am running this code approximately once per millisecond during short (~30 s) tests. It usually works fine, but sometimes the AI Read function returns an empty array. When it does this, "AvailSampPerChan" returns 0. Why does it do this? Shouldn't it be waiting for available samples before reading? Is there a simple way to make it stop doing this and wait for real samples to become available?

 

It is not returning an active error when this happens.

 

System Specs:

 

USB-6009 DAQ reading 8 AI lines at a sample rate of 3000 S/s.
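For anyone who can't open the attached VI, the loop is doing roughly the following (sketched in the Python nidaqmx API as a text stand-in for the G code; the device name "Dev1" and the channel range are placeholders, not values from the VI):

import time
import nidaqmx
from nidaqmx.constants import AcquisitionType, READ_ALL_AVAILABLE

task = nidaqmx.Task()
task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")   # 8 AI channels (placeholder device name)
task.timing.cfg_samp_clk_timing(
    rate=3000,                                        # 3000 S/s, continuous acquisition
    sample_mode=AcquisitionType.CONTINUOUS,
)
task.start()

for _ in range(30000):                                # ~30 s of ~1 ms iterations
    # Read whatever is available right now (-1, "all available").
    data = task.read(number_of_samples_per_channel=READ_ALL_AVAILABLE)
    # Sometimes data comes back as empty arrays, with AvailSampPerChan = 0 and no error.
    time.sleep(0.001)

task.close()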

 

Any help would be greatly appreciated.

 

 

 

Forbes Black
Lapsed CLAD, LV 5 - LV 2022 (Yeah, I'm that old...)
Message 1 of 9

How is your task configured?  What sample rate do you have on the task?  Is it sampling a finite number of samples or continuously?


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 2 of 9

I have it set up to read samples continuously. I can change the sample rate. It seems that roughly 2000 samples per second on 8 channels gives me the lowest occurrence of this phenomenon, somewhere around 2.2% or 2.3%. As I increase (or decrease) the sample rate from there, that number goes up. My default input is to read 100000 samples per channel.

 

At this point, I am thinking that the continuous sample read "overrides" the timeout on the AI Read function.  I probably don't have that phrased correctly.


Thanks for your help.

Forbes Black
Lapsed CLAD, LV 5 - LV 2022 (Yeah, I'm that old...)
Message 3 of 9

When you set up an analog input task to run continuously, the samples per channel input of the DAQmx Timing VI doesn't determine how many samples to read. It sets the buffer size. In general, you should just leave that unwired when you are using continuous sampling. I'm pretty sure the DAQmx Read defaults to reading all available samples (-1 for number of samples to read).

 

Are you getting any errors? I would kind of expect to see a buffer overflow error when somebody sets the buffer size and doesn't read the data fast enough.
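In text form, the distinction looks something like this (Python nidaqmx used here as a rough stand-in for the G code; the device and channel names are placeholders):

import nidaqmx
from nidaqmx.constants import AcquisitionType, READ_ALL_AVAILABLE

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")
    # In continuous mode, samps_per_chan does NOT set how many samples each
    # read returns -- it only sizes the acquisition buffer.
    task.timing.cfg_samp_clk_timing(
        rate=3000,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=100000,
    )
    task.start()
    # Reading "all available" (-1) returns whatever is in the buffer at the
    # moment of the call -- possibly nothing at all, and that is not an error.
    data = task.read(number_of_samples_per_channel=READ_ALL_AVAILABLE)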


GCentral
Message 4 of 9

Thanks.  No, not getting any errors.  I think I may be able to solve my problem by simply eliminating the timing function.  I might not need it in this program.

Forbes Black
Lapsed CLAD, LV 5 - LV 2022 (Yeah, I'm that old...)
Message 5 of 9
Solution
Accepted by diarmaede

No, you really should have the timing VI. Otherwise your DAQ will just sample when you ask it to in software. If you really need the 3 kS/s, it needs to be hardware timed and set to continuous samples.

 

What determines your loop rate for when you read the DAQ data? Just thinking you may be trying to get data more often than data is actually coming in. You say you check for data about every 1 ms. At a 3 kS/s rate, you should have 3 samples available. But that is assuming only 1 channel. The input rate is actually Samples/Channel/Second. So a full scan will be done in 8 S / (3 kS/s) ≈ 2.7 ms. You are trying to read that data too fast. Slow down your loop, or set the sample rate to 24 kS/ch/s.

 

Or tell the DAQmx Read to get X samples when you try to read the data.  As I said before, the default is to just grab all available data (-1).  If there is no data, you get no data.  But if you specify a set number of points to read at the DAQmx Read, it will wait for that number of samples to come in.
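Roughly, the fixed-count read looks like this (again in Python nidaqmx as a stand-in for the G code, with placeholder device and channel names):

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")
    task.timing.cfg_samp_clk_timing(rate=3000, sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    for _ in range(100):
        # Asking for a specific count makes the read wait until that many
        # samples per channel have arrived (or the timeout expires), so it
        # never silently returns an empty array.
        data = task.read(number_of_samples_per_channel=300, timeout=10.0)
        # ...process the 300 samples per channel here...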


GCentral
Message 6 of 9

I set it up to check available samples. If there are samples available, the AI Read function reads the corresponding number of samples. Otherwise, it waits for a sample to become available. This seems to have solved the problem, no matter what sample rate I set. Thanks!
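In rough text form (Python nidaqmx standing in for the G code, device and channel names being placeholders), the loop now does something like:

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")
    task.timing.cfg_samp_clk_timing(rate=3000, sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    for _ in range(30000):
        avail = task.in_stream.avail_samp_per_chan      # same idea as AvailSampPerChan
        if avail > 0:
            # Read exactly the number of samples known to be available.
            data = task.read(number_of_samples_per_channel=avail)
        else:
            # Nothing there yet: wait for at least one sample instead of
            # returning an empty array.
            data = task.read(number_of_samples_per_channel=1, timeout=10.0)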

Forbes Black
Lapsed CLAD, LV 5 - LV 2022 (Yeah, I'm that old...)
Message 7 of 9

Thank you for this thread. I was encountering the same issue while setting up a logging study similar to what the OP posted.

 

What seems to work for me is to "stop" and "start" the tasks after each acquisition loop.

[screenshot: Nidish96_0-1607033783532.png]

However, I would like to know if doing this might affect the speed of my code and give me results that are not exactly sampled at the frequency I set up.

 

Thanks,
Nidish

 

Edit: I previously suggested something that doesn't work at the higher sampling rates I had.

Message 8 of 9

This is a 6-1/2-year-old thread that is already marked as solved. And I don't believe what you are describing for your situation is related that closely to the original thread.

 

Stopping and starting an acquisition every loop iteration shouldn't be needed.

 

I suggest that if you are having problems with your code, you start a new thread with your question, and be sure to attach an actual VI.  A partial screenshot really doesn't show anything.

Message 9 of 9