Slow digital output on USB-6009 using NIDAQmx

I have a USB-6009 device that I am controlling with the .NET NI-DAQmx interface.

It appears that each call to "WriteSingleSamplePort" takes exactly 1 millisecond. For those not familiar with the .NET interface, this call writes an integer value to a digital port, thereby setting the digital output state for all pins in a DO port at the same time.
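
Roughly, the pattern I am using looks like this (just a sketch; "Dev1" is a placeholder and the exact overloads are from memory):

using NationalInstruments.DAQmx;

using (var doTask = new Task("doPortTask"))
{
    // One channel covering all of port 0, so a single write sets every line at once.
    doTask.DOChannels.CreateChannel("Dev1/port0", "",
        ChannelLineGrouping.OneChannelForAllLines);

    var writer = new DigitalSingleChannelWriter(doTask.Stream);

    // 0x05 = binary 00000101: P0.0 and P0.2 high, all other lines low.
    // This single call is what consistently takes ~1 ms.
    writer.WriteSingleSamplePort(true, (uint)0x05);
}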

Now, I assume that the NI-DAQmx software interface is what is limiting the output speed. Is this true, and if so, can anyone provide a reference to this limitation? (In the USB-6009 spec, I can see rate limits for analog input and output, but not for digital output.)

 

Is this 1-ms limit only true for the USB-600x devices, or is it true for digital output with NI-DAQmx in general?

Is there any way to remove this limitation? Is there a way to increase DO speed by e.g. writing to a single pin instead of the whole port?

 

Is this speed limitation in place because the hardware itself can take up to 1 ms to respond? If so, why is the DO operation so slow? Even if digital output is slow, I don't like the fact that the software locks the thread for the full millisecond. I would like to perform continuous analog input at the same time as digital output, and it appears I will only be able to do this with a multi-threaded approach, where analog input runs on a separate thread, because the digital output locks its own thread for a full millisecond during each pin change. (Edit: I suppose I can avoid explicit multithreading by using the async BeginWriteMultiSamplePort function, etc.)
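
For example, something along these lines would keep the blocking call off the acquisition thread (sketch only; 'writer' is the DigitalSingleChannelWriter from the snippet above, and the queue is plain .NET, nothing DAQmx-specific):

using System.Collections.Concurrent;
using System.Threading;

// Queue of requested port values; a dedicated worker thread drains it and
// absorbs the ~1 ms that each WriteSingleSamplePort call blocks for.
var pendingWrites = new BlockingCollection<uint>();

var doWorker = new Thread(() =>
{
    foreach (uint portValue in pendingWrites.GetConsumingEnumerable())
        writer.WriteSingleSamplePort(true, portValue);
});
doWorker.IsBackground = true;
doWorker.Start();

// Called from the analog-input loop; returns immediately.
pendingWrites.Add(0x01);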

 

Thank you for any help or advice you can provide regarding this.

Message 1 of 18

The 6009 has static DIO.  This means that the lines only change when software issues an actual command to change them.  It is very likely that 1 ms is as fast as the software can change the settings.

 

If you need something faster, then you need to look for a device whose digital output is hardware timed.  You could then load up a digital waveform and have the device output it.
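
The rough pattern on a hardware-timed device looks something like this in the .NET API (a sketch from memory, so treat the names as approximate; "Dev2" is a placeholder, and the 6009 cannot do this):

using NationalInstruments.DAQmx;

using (var doTask = new Task("doWaveformTask"))
{
    doTask.DOChannels.CreateChannel("Dev2/port0", "",
        ChannelLineGrouping.OneChannelForAllLines);

    // Let the device's own sample clock pace the output at 100 kS/s.
    doTask.Timing.ConfigureSampleClock("", 100000.0,
        SampleClockActiveEdge.Rising,
        SampleQuantityMode.FiniteSamples, 1000);

    // 1000-sample square wave on line 0 of the port.
    var pattern = new uint[1000];
    for (int i = 0; i < pattern.Length; i++)
        pattern[i] = (uint)(i % 2);

    var writer = new DigitalSingleChannelWriter(doTask.Stream);
    writer.WriteMultiSamplePort(false, pattern);

    doTask.Start();
    doTask.WaitUntilDone();
}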


Message 2 of 18

Hi 

Message 3 of 18
I believe the limitation is partly the software (DAQmx) and partly the hardware (6009). All of the DAQ devices with software-timed digital I/O are simple designs for applications that do not require accurate timing. No matter the USB speed, you are going to be subject to considerable jitter with single-point writes and reads. That is a limitation of the OS. NI provides a solution in the form of DAQ devices with hardware timing. You just have to pay a little more for advanced capabilities.
Message 4 of 18

Thanks Dennis,

 

I just saw your response. You may notice I also posted a reply on another thread you had participated in called "WriteMultiSamplePort is slow".  I posted that before I saw your response here.

 

I appreciate your taking the time to review and respond to this question. You are right that the 6009 with software timing is not an adequate solution for precise timing, but that doesn't mean it has to be arbitrarily slowed down to 1 ms. As I described in my reply to "WriteMultiSamplePort is slow", the 1 ms delay is a precise delay: it is not caused by the operating system or the speed of the application code, but rather by the driver or perhaps the hardware.

 

Let me explain what I mean about the difference between precise timing and fast output. You are right that I could not use the 6009 to deliver exactly 1 pulse every 10 microseconds over an indefinite amount of time, because the operating system's thread management would periodically interrupt that. However, if I needed to write a program that would deliver approximately 100,000 pulses in about 1 second, that would be easily achievable on a modern PC. Sure, there would be some time devoted to other threads, and it might take 1.1 seconds to deliver all 100,000 pulses, but it would still happen quickly.

 

But due to this seemingly arbitrary (but precise) 1 ms delay, it takes 100 seconds to deliver 100,000 pulses (100,000 writes at 1 ms per write), regardless of how fast the processor is.

 

Now you might be right that NI doesn't intend these devices to perform faster than one digital output per millisecond, but if that is the case, WHERE IS THIS STATED? I can't find any specification or reference that discusses this limit. The analog input on this device is limited to 48 kS/s. But what about DIGITAL I/O?

Message 5 of 18
The analog input is hardware timed. The analog output is also software timed, and that has a maximum of 150 samples/sec. I don't believe the 1000 samples/sec you are seeing is fixed in any way. Could NI design a board and driver with lower overhead? Possibly, but I'm not sure of the motivation when there are existing solutions.
Message 6 of 18

@Dennis_Knutson wrote:
 I don't believe the 1000 samples/sec you are seeing is fixed in any way. 


Perhaps I explained it better in my reply to "WriteMultiSamplePort is slow".  The 1000 S/s rate is a precise, FIXED delay. It is NOT related to the processor speed or communication speed. It is not approximate, nor does it change with CPU load, etc.  The Windows operating system provides high-resolution timers that prove the call to the NI-DAQmx driver takes exactly 1000 microseconds (within an error of about 10 microseconds). The only way that is possible is if the driver itself (perhaps in response to a signal from the device) uses a high-resolution spin-wait function or a tight loop to enforce the 1 ms delay.
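
Here is essentially how I am timing it (Stopwatch uses the high-resolution performance counter on Windows; sketch only, with 'writer' set up as in my first post):

using System;
using System.Diagnostics;

const int N = 1000;
var sw = Stopwatch.StartNew();

for (int i = 0; i < N; i++)
    writer.WriteSingleSamplePort(true, (uint)(i & 1));   // toggle P0.0

sw.Stop();
Console.WriteLine("Average per call: {0:F1} microseconds",
    sw.Elapsed.TotalMilliseconds * 1000.0 / N);
// This consistently reports ~1000 microseconds per call, regardless of CPU load.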

 


@Dennis_Knutson wrote:
 Could NI design a board and driver with lower overhead? Possibly but I'm not sure of the motivation when there are existing solutions.

This has nothing to do with overhead.  This delay is not caused by any slowdown related to processing speed, and therefore has nothing to do with overhead in the driver. However the delay could be (and probably is) imposed deliberately by the software due to limitations in the hardware. But if this is a limitation of the hardware, that limitation should be included in the specifications.  

Of course NI might not have a motivation to increase the digital output speed of this device when faster devices are available, I agree with that. But NI should nonetheless DOCUMENT the hardware limitations of this device. 

Saying that a digital port is software timed implies that it can be changed as quickly as software can issue the command to change it. But the behavior I am describing is MUCH MUCH slower than software timing. 

 

I really appreciate your help and interest, Dennis, but you have not answered my most basic question, which is: where can I find official documentation regarding the maximum digital output speed of the hardware and/or NI-DAQmx driver? The software can issue commands to the device at speeds of well over 100,000 commands per second (as I have already demonstrated), but I'll reiterate that there is a FIXED, PRECISE, and most likely DELIBERATE delay of 1 ms in the NI-DAQmx function call that writes digital output.  If you require additional proof of this, I will have to attach a retrospective symbolic debugger and/or disassembler to the process, which should be able to locate the specific timing function in the driver that is causing the delay.

 

 

Message 7 of 18
There IS no official document on that. The official documentation states only that the digital I/O is software timed, with no min or max rate associated with it. If you can prove that DAQmx has a fixed delay in it, provide the proof.

I don't understand your statement about writing 100,000 commands a second and then saying you can only write 1,000. How are you measuring this high rate to the 6009?
Message 8 of 18

I commented here.

 

 

Best Regards,

John Passiak
Message 9 of 18

Hi all,

 

I am facing a similar issue with the DO of the USB-6009. In my application, a DO channel just has to switch high for a specified length of time with 1 ms resolution and switch back to low after that, so the 1 ms limitation is not really an issue for me. What gives me a headache is the overhead and jitter of the switching, since I switch the line high and low by writing the corresponding single Boolean value to the channel each time.
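
Expressed with the .NET calls discussed earlier in this thread (just to illustrate the idea; the single-line write method name is my best recollection), the pulse looks roughly like this:

using System.Threading;
using NationalInstruments.DAQmx;

// Drive the line high, hold it for the requested number of milliseconds,
// then drive it low again. Each write blocks for about 1 ms (adding to the
// pulse length), and Thread.Sleep is only as accurate as the OS scheduler,
// which is where the jitter comes from.
static void Pulse(DigitalSingleChannelWriter writer, int milliseconds)
{
    writer.WriteSingleSampleSingleLine(true, true);    // high
    Thread.Sleep(milliseconds);
    writer.WriteSingleSampleSingleLine(true, false);   // low
}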

 

I've also tried to write a waveform that contains as many highs as the specified length of time and, at the end, one low, hoping that it would work. But what I then saw on the oscilloscope was that the USB-6009 switches high only for the first millisecond and after that goes back and stays low, no matter how many highs I put in the digital waveform. As a comparison, I've done the same test on a USB-6341, and it did everything as expected. To me, this behavior points to the USB-6009's lack of a hardware timer. However, when I tried to generate a ramp waveform using the digital pattern generator VI and write it to the DO channel, I could see on the oscilloscope that the signal toggles between high and low with exactly a 1 ms period and without any jitter or overhead. I am wondering how this could be achieved and made use of to solve my problem, but I haven't been able to work it out yet. Does anyone have an idea about it?

 

I've called National Instruments technical support, and they recommended the USB-621x from the higher-end M Series. Could anyone tell me if this module is capable of solving my issue? Thanks!

Message 10 of 18