08-05-2011 11:01 AM
Hi, I am new to LabVIEW, so I know this is probably about to sound like a novice issue, but I am trying to do some simple voltage reading using a USB-6009 card and am having some problems.
What I would like to do is read voltages from two AI channels, one sample at a time (eventually I would like the code to do something as soon as a voltage crosses a threshold, which is why I am reading samples one at a time, but I am not there yet; right now I am just writing the data to a chart). I have a multichannel task, and I am reading one sample per channel each iteration of a while loop. I have attached the code so you can see what I am doing. 2AI_Dev2 denotes a task created using the first two AI channels on the DAQ (differential recording, -10 to +10 V on each channel).
The problem is that the while loop rapidly falls behind the data acquisition. What I would like to do is sample the data at 100 Hz and have the loop execute every 10 ms (basically getting each sample as it becomes available), but the while loop will not keep up. Within the first second of acquisition, the data buffer shows 3-4 samples available, and the number grows constantly as the program runs, reaching a few hundred after a minute.
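For readers who cannot open the LabVIEW attachment, here is roughly what the loop does, sketched in text form with the nidaqmx Python API (the real code is a LabVIEW block diagram; the 2AI_Dev2 task is approximated here by configuring the channels and timing in code):

```python
# Rough text-form sketch of the attached VI, using the nidaqmx Python API.
# Assumption: the saved 2AI_Dev2 task is replaced by in-code configuration.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    # Two AI channels on the USB-6009, -10 V to +10 V each (differential
    # terminal configuration is part of the saved task in the original post).
    task.ai_channels.add_ai_voltage_chan("Dev2/ai0:1", min_val=-10.0, max_val=10.0)
    # Hardware-timed, continuous acquisition at 100 S/s per channel.
    task.timing.cfg_samp_clk_timing(rate=100.0, sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    while True:
        # One sample per channel per iteration. If the loop body (chart
        # update, etc.) ever takes longer than 10 ms, unread samples pile up
        # in the buffer -- the growing backlog described above.
        sample = task.read(number_of_samples_per_channel=1)
```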
I cannot figure this out. I do have a few more pieces of information that may be of help to you.
1. The program has been built into an executable. It runs fine on my development machine (Windows 7, 3.2 GHz i5, 8 GB of RAM). The computer I am trying to run the executable on is an XP machine (XP SP2, 3.0 GHz P4, 2 GB of RAM).
2. Both computers have DAQmx 9.3 installed.
3. I have read about other processes slowing down loop execution, so I reformatted the XP drive and reinstalled XP and the NI driver bundle (DAQmx, MAX, etc.). I turned off the firewall and Windows Updates (this computer is to be used as a data acquisition machine and will not be on any network), and no antivirus is installed. Basically, I got the machine to the point where there is XP, NI software, and my program, and that is it. Still the problem occurs.
4. I cut out both charts (I eventually put one back in, so you see I am only indexing one element from the array following each buffer read). That did not fix the issue.
5. I tried the DAQ device on all USB ports. I took the keyboard and mouse off, found an old PS/2 keyboard and mouse and put them on, so nothing is on the USB ports but the device (yes, it is USB 2.0). Still no fix.
Do I need a faster machine? I would think that given my slow sampling rate (100 Hz), even this old machine could easily keep up, but it doesn't. It runs fine on my other machine, though.
Could it be a problem with the executable? I only have one VI in the build. I have built other VIs that work, so I think I am doing that correctly (well, and it runs fine on the other computer).
Sorry for such a long post, but I wanted you to have as much information as possible (and to know that I have tried a number of things before coming to you and have run out of options).
Thank you in advance for your help.
08-09-2011 11:46 AM
Software-timed DAQ is usually not the best way to go.
Since you set the rate to 1000 samples per second and are reading them at no more than 100 per second, it is clear that you will never keep up. In addition, writing to the chart takes time for the OS to update the display.
I suggest a two-loop system. In one loop, read the data fast enough to keep up. Pass the data to the second loop via a queue. In the second loop do all the processing (your threshold test, for example) and the display. Since your data rate is 1000 samples per second, read about 10 samples at a time in the DAQ Read loop. You do not need a separate wait in that loop; the DAQ Read will wait until 10 samples are available. In the analysis loop, dequeue the data and put it into an array in a shift register. Analyze when data is available. This loop could have a 10 ms delay or less, depending on the time your analysis takes. If no data (or an insufficient number of samples) is available, do nothing until more data has accumulated. This is called the Producer/Consumer design pattern. Examples ship with LabVIEW, and it has been widely discussed on the forums.
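In text form, a minimal sketch of that Producer/Consumer structure, written with the nidaqmx Python API and a queue (in LabVIEW this would be two parallel while loops joined by a queue, with the accumulated array in a shift register; the channel names, rate, and threshold value here are placeholders, not from the original post):

```python
# Minimal Producer/Consumer sketch: one thread reads from the DAQ in chunks,
# the other accumulates, analyzes, and would drive the display.
import queue
import threading

import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLES_PER_READ = 10      # ~10 ms of data at 1000 S/s per channel
THRESHOLD = 5.0            # volts, placeholder
data_q = queue.Queue()

def producer():
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev2/ai0:1", min_val=-10.0, max_val=10.0)
        task.timing.cfg_samp_clk_timing(rate=1000.0, sample_mode=AcquisitionType.CONTINUOUS)
        task.start()
        while True:
            # No separate wait needed: the read blocks until 10 samples per
            # channel are available, which paces this loop.
            data_q.put(task.read(number_of_samples_per_channel=SAMPLES_PER_READ))

def consumer():
    history = []           # plays the role of the array in a shift register
    while True:
        try:
            chunk = data_q.get(timeout=0.01)   # [[ai0 samples], [ai1 samples]]
        except queue.Empty:
            continue                           # nothing to do until data arrives
        history.extend(zip(*chunk))            # one (ai0, ai1) pair per sample
        if any(v > THRESHOLD for v in chunk[0]):
            print("threshold crossed on ai0")  # analysis / display goes here

threading.Thread(target=producer, daemon=True).start()
consumer()
```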
Lynn
08-09-2011 01:07 PM
The 1000 was a typo left over from playing with the program to see how fast or slowly unobtained samples accumulated as a function of rate. The original problem occurred when the rate was set to 100 (and those are the parameters I am working with). Sorry for any confusion.
Thank you for your input. I have been reading about this kind of construction since my post and hope to implement it soon in my code. Thank you for your time in helping me solve this problem.
08-09-2011 01:14 PM
Of course you could go single sample on demand and avoid the problem entirely.
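In text form, a sketch of that approach with the nidaqmx Python API: no sample clock is configured, so each read returns the current value of both channels immediately, and the loop's own wait sets the pace (which is exactly the timing trade-off raised in the next post). Channel names and the 10 ms pacing are assumptions carried over from the original description.

```python
# Single sample on demand: no hardware sample clock, no buffer to fall
# behind. Each read grabs the present value of both channels.
import time
import nidaqmx

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev2/ai0:1", min_val=-10.0, max_val=10.0)
    while True:
        volts = task.read()      # one on-demand sample per channel
        # ... threshold test / chart update would go here ...
        time.sleep(0.01)         # software-timed pacing, roughly 100 Hz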
08-09-2011 04:50 PM
This is an interesting idea. I was thinking along those lines, but was not sure I could ensure the integrity of the timing of samples. It is for that reason I was using the hardware timed samples and reading them into a buffer.
Can I use the single sample on-demand and still make sure samples are generated at the right time? Wouldn't they be timed by a Wait function or run at the speed of the acquisition loop?
Again, being new to LabVIEW, I very much appreciate your help and suggestions.
Cheers,
08-09-2011 07:39 PM
@Pinky2011 wrote:
This is an interesting idea. I was thinking along those lines, but was not sure I could ensure the integrity of the timing of samples. It is for that reason I was using the hardware timed samples and reading them into a buffer.
Can I use the single sample on-demand and still make sure samples are generated at the right time? Wouldn't they be timed by a Wait function or run at the speed of the acquisition loop?
Again, being new to LabVIEW, I very much appreciate your help and suggestions.
Cheers,
Put this right after the DAQ Read with a 0 input (it causes a thread release but isn't too much overhead). You don't sound like you need more than millisecond accuracy for the timestamp.
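The attached snippet is not reproduced here; assuming it shows a zero-millisecond wait plus a timestamp taken next to the read, the idea in text form (Python again, extending the on-demand sketch above) would be roughly:

```python
# Assumed reading of the advice above: stamp each on-demand sample and yield
# with a zero wait right after the read.
import time
import nidaqmx

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev2/ai0:1", min_val=-10.0, max_val=10.0)
    while True:
        volts = task.read()     # on-demand sample from both channels
        stamp = time.time()     # millisecond-level accuracy is plenty here
        time.sleep(0)           # zero wait: yields the thread, negligible cost
        # ... store / compare (stamp, volts) against the threshold ...
```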