Multifunction DAQ


USB-6008/6009 Analog Output Speed

I ran an experiment where I set up a simple 5V, 0V, 5V, 0V ... pattern using a USB-6008.  Whether I used an analog output or digital output, it generated approximately a 1 kHz pulse pattern on my scope.  How is that possible if it only updates at 150 samples/second according to the spec sheet?  I'm wondering if there's any way to make it go faster.

George

 

0 Kudos
Message 1 of 9
(4,396 Views)

George,

 

The 6008/6009 use software-timed outputs for AO and DO. They will update as fast as your program can run, and I typically see that around the 1 kHz rate, so your result seems normal to me. I believe the 150 Hz maximum update rate in the specs is because of the settling time on the AO, but I can't remember exactly; it might just be the fastest "guaranteed" rate that NI wants to use. Either way, if you need to go faster, then you need to look at another device. My suggestion would be based on the following:
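To make the "as fast as your program can run" point concrete, here is a minimal sketch of a software-timed output loop in Python. The `write_sample` function is a hypothetical stub standing in for a single-point DAQ write (on real hardware it would be a DAQmx single-sample write); the timing harness around it is what determines the achieved rate:

```python
import time

def write_sample(value):
    # Stub for a single-point AO/DO write (e.g. a DAQmx Write call).
    # On a USB-6008/6009 this call is software timed: it returns when
    # the host/USB transaction completes; there is no hardware clock.
    pass

def measure_update_rate(n_samples=2000):
    """Toggle 5 V / 0 V as fast as the loop will run and report the rate."""
    start = time.perf_counter()
    level = 0.0
    for _ in range(n_samples):
        write_sample(level)
        level = 5.0 - level          # alternate 0 V / 5 V
    elapsed = time.perf_counter() - start
    return n_samples / elapsed       # achieved software-timed rate, S/s

rate = measure_update_rate()
print(f"achieved update rate: {rate:.0f} S/s")
```

With a real USB write in the loop, the per-call USB round trip dominates, which is why the loop settles near 1 kHz regardless of host CPU.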

 

1) Do you need hardware-timed Digital Output and Analog Output? If so, look at something like the USB-63XX family (e.g., the USB-6341) or a USB-622X, 625X, or 628X device. These are externally powered and will do both hardware-timed AO and DO.

 

2) If you only need hardware-timed AO and are fine with software-timed DO, then look at the bus-powered USB M-Series modules (USB-621X). They are a little cheaper and only need a USB connection.

 

This knowledgebase article explains the difference between the externally powered and the bus-powered devices.

 

 

Aaron W.
National Instruments
CLA, CTA and CPI
0 Kudos
Message 2 of 9
(4,394 Views)

Thanks, Aaron.  I understand that it's software timed, but I'm curious what's controlling that timing.  Is it the USB connection?  MAX?  The 6008/6009's firmware?  It can't possibly be my desktop CPU, because I see the same 1 kHz with different CPU architectures & RAM.  We also have a non-NI DAQ device that advertises a maximum of 100 samples/second, yet I can perform the same experiment with it & get a 2 kHz pulse.  I'll check with their tech-support dept. & see how they explain it.

George

 

0 Kudos
Message 3 of 9
(4,380 Views)

Hello George,

 

Also, as Aaron said, the device you are using is actually software timed, so I believe the 150 S/s spec is just for reference. Theoretically, the better the PC, the faster the rate; this also depends on the processes the computer is handling, the processor/memory specs, the OS architecture (32/64-bit), and the share of resources the computer allows LabVIEW to work with (again, if you are running just LabVIEW, it should be faster than if it is running alongside many memory- or processor-hungry processes).

 

I ran a quick test, and the maximum analog output frequency for a square pattern was about 250 Hz (I did have a lot of processes running at the same time). Could you share more details about how you ran your experiment? An image of your results and/or your VI would be useful, so I can see whether I can actually get up to a similar value.

Camilo V.
National Instruments
0 Kudos
Message 4 of 9
(4,343 Views)

Thank you for opening the topic.

 

I looked at

https://decibel.ni.com/content/docs/DOC-34289

and tried the code from there. It claims 2,000 S/s. On my i7, it ran at 714 S/s. This value was fairly consistent across many iterations of the test, under somewhat different conditions.

 

I then wrote other code to generate a sine signal across a thermistor. I found the 6009 DAQ card had 1) too large an offset (some 19 mV, which is also temperature dependent) and 2) a slight non-linearity in the DAC or ADC, and I wanted to get rid of the offset problems. On AI, I can see some minor glitches on the sine wave, normally one or two per period of a 5 Hz signal, but their effect, as well as the slight non-linearity, is spread over the whole spectrum, so it worked. In the code, I got the current time in seconds and generated the sine one point at a time, using this "current time in seconds" timebase to keep the phase consistent over time.
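Deriving each output point from the current wall-clock time, rather than an incrementing sample index, keeps the phase correct even when the software-timed loop rate drifts. A sketch of that timebase idea in Python (the `write_sample` call is a hypothetical stand-in for a single-point AO write; frequency, amplitude, and offset values are illustrative):

```python
import math
import time

F_HZ = 5.0        # sine frequency
AMPLITUDE = 2.0   # volts
OFFSET = 2.5      # center the wave inside the DAC's 0-5 V range

def write_sample(value):
    pass          # stand-in for a single-point AO write

def sine_at(t):
    """Sample value derived from absolute time, so the phase never
    drifts even if individual loop iterations are delayed by the OS."""
    return OFFSET + AMPLITUDE * math.sin(2.0 * math.pi * F_HZ * t)

t_end = time.time() + 0.05           # run briefly for demonstration
while time.time() < t_end:
    write_sample(sine_at(time.time()))
```

A delayed iteration then produces a skipped point, not a phase error, which matches the "one or two glitches per period" behavior rather than accumulating drift.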

 

I thought it might be a good idea to output not one point at a time but a small segment of the sine. However, I wonder about the output sampling rate and its dependence on system parameters.

 

I wonder if NI could shed more light on what determines the rate, as I would like to use this functionality in my application. Even if it is not possible to determine the update rate in advance, I just need to be sure the rate remains the same over time.

 

Thank you

 

Best regards

Albert

 

0 Kudos
Message 5 of 9
(4,307 Views)

Albert,

 

There is no guarantee that your rate stays consistent over time; it's software timed and therefore inherently unstable. If your antivirus kicks in during your program, or Windows runs an update, then your program will likely not update as fast as it otherwise would. There is no getting around this.

 

If you NEED a constant update rate for Analog Output, then you will need to use a device other than the 6008/6009. Try looking at the USB-6211.

Aaron W.
National Instruments
CLA, CTA and CPI
0 Kudos
Message 6 of 9
(4,304 Views)

Dear Aaron

 

Thank you for your prompt reply.

 

In my experience, the extra CPU load leads to short glitches of a few samples, not big trouble. Maybe this is because the CPU has 8 cores and I did not manage to load all of them fully?

Perhaps what I am looking for is some info on how the software update has been implemented, so I can try to estimate the probability of big trouble.

 

Best regards

Albert

 

 

 

0 Kudos
Message 7 of 9
(4,297 Views)

Dear Camilo, I have not had time in the midst of our project to do any additional experiments with the 6008 or 6009 (we have one of each).  However, I did receive a reply from the other vendor's tech-support dept., & they explained that their DLL controls the update rate.  (My previous experiment called their DLL from the same sort of VI I wrote for the 6008.)  So I suppose that means it's controlled by MAX.  Again, my VI was very simple & just updates a 2-point array (5 V & 0 V) to generate the pulse train, or toggles a digital output on/off as fast as possible in a While loop.  Either way, 1 kHz.

 

George

 

P.S.  I also saw something else curious with these devices, which I would put in a different thread if it were a big deal, but it's not, so I'll just mention it here.  The 10 kS/s 6008 can read 2 analog inputs at 5 kS/s; however, when I attempted to read 2 inputs of the 48 kS/s 6009 at 24 kS/s or 23 kS/s, I received an error saying it was too fast for the device's buffer (sorry, I don't have the exact error message here).  I had to reduce it to 22 kS/s to make it work.  I wonder if that's related to the same "software timing" phenomenon as the AO speed.

 

0 Kudos
Message 8 of 9
(4,284 Views)
The details of software timing are right in front of you. There is a function that writes a single point. When it completes, your While loop has to iterate again, and before that happens, the OS will do whatever it wants. The jitter from the OS could range from milliseconds to seconds. You've already been told you can't avoid this, and one solution is changing your hardware. Otherwise, change to a real-time OS.
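That OS-induced jitter is easy to observe directly: time each iteration of such a loop and look at the spread between the shortest and longest period. A minimal Python sketch (the `work` callable is a stand-in for the single-point write):

```python
import time

def measure_jitter(n=500, work=lambda: None):
    """Record the period of each loop iteration and return the extremes.

    On a general-purpose OS the max period can be orders of magnitude
    larger than the min, because the scheduler may preempt the loop
    between any two iterations.
    """
    periods = []
    last = time.perf_counter()
    for _ in range(n):
        work()                       # stand-in for a single-point write
        now = time.perf_counter()
        periods.append(now - last)
        last = now
    return min(periods), max(periods)

lo, hi = measure_jitter()
print(f"min period {lo * 1e6:.1f} us, max period {hi * 1e6:.1f} us")
```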

MAX is not used at all during a DAQmx Write, and the AI section is separate and does not affect AO.
0 Kudos
Message 9 of 9
(4,277 Views)