04-10-2014 01:02 PM - edited 04-10-2014 01:07 PM
I have a USB-6009 device that I am controlling with the .Net NI-DAQmx interface.
It appears that each call to "WriteSingleSamplePort" takes exactly 1 millisecond. For those not familiar with the .Net interface, this is equivalent to writing an integer value to a digital port, thereby setting the digital output state for all pins in a DO port at the same time.
Now, I assume that the NI-DAQmx software interface is what is limiting the output speed. Is this true, and if so, can anyone provide a reference to this limitation? (In the USB-6009 spec, I can see rate limits for Analog input and output, but not digital output.)
Is this 1-ms limit only true for the USB-600x devices, or is it true for digital output with NI-DAQmx in general?
Is there any way to remove this limitation? Is there a way to increase DO speed by e.g. writing to a single pin instead of the whole port?
Is this speed limitation in place because the hardware itself can take up to 1 ms to respond? If so, why is the DO operation so slow? Even if digital output is slow, I don't like that the software locks the thread for the full millisecond. I would like to perform continuous analog input at the same time as digital output, and it appears the only way to do that is a multi-threaded approach where analog input runs on a separate thread, because the digital output locks its own thread for a full millisecond during each pin change. (Edit: I suppose I can avoid explicit multithreading by using the async BeginWriteMultiSamplePort function, etc.)
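The multi-threaded workaround described above can be sketched language-neutrally. Here is a minimal Python illustration of the pattern; `write_port` is a hypothetical stand-in for the blocking driver call (WriteSingleSamplePort in the .NET API), which in this case blocks for about 1 ms:

```python
import queue
import threading
import time

written = []  # record of values actually written (for demonstration)

def write_port(value):
    """Stand-in for the blocking driver call (WriteSingleSamplePort in
    the .NET API); the real call blocks for ~1 ms per write."""
    time.sleep(0.001)          # emulate the observed ~1 ms block
    written.append(value)

def writer_loop(q):
    # Drain queued port values on a background thread, leaving the
    # main thread free for continuous analog input.
    while True:
        value = q.get()
        if value is None:      # sentinel: shut down the writer
            return
        write_port(value)

q = queue.Queue()
t = threading.Thread(target=writer_loop, args=(q,), daemon=True)
t.start()

# Main thread: enqueue digital writes without blocking ~1 ms on each.
for v in (0x01, 0x02, 0x04):
    q.put(v)
q.put(None)
t.join()
print(written)   # [1, 2, 4]
```

The same idea carries over to .NET either with an explicit worker thread or with the driver's async Begin/End call pattern.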
Thank you for any help or advice you can provide regarding this.
04-10-2014 04:05 PM
The 6009 has static DIO. This means the lines only change when software issues an explicit command to change them. It is very likely that 1 ms is as fast as the software can change the settings.
If you need faster output, you need to look for a device whose digital output is hardware timed. You could then load up a digital waveform and have the device output it.
04-11-2014 09:08 AM
Hi crossrulz, thanks for your reply. However, the problem is not what your answer addresses. You are indeed correct in your understanding of how the digital output lines are controlled: the device specification describes them as "software timed", meaning the output can only be changed by the USB host (e.g. a PC).
However, it is quite wrong to think that 1 ms is the fastest that software can issue commands. That might have been true in the '90s, when processors ran at tens or hundreds of MHz. But on a modern (e.g. 1.6+ GHz) PC, a few simple benchmarks show that this setup can make around 3700 calls in ONE millisecond to a function that reads voltage from an analog input. That's 3.7 MS/s!!
Now, the USB-6009 hardware is only capable of performing AI at 48 kS/s, so reading the voltage at 3.7 MS/s is overkill and results in duplicated data, but the point is that the 1-millisecond rate limit IS NOT due to the speed at which the software runs or issues commands to the device. Both the execution of software commands and the USB communication operate at speeds far in excess of 1 MS/s, so there is no reason from the PC side of things why digital IO could not occur at 1 microsecond or faster. The 1-millisecond rate limit is imposed by the NI-DAQ driver!!
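The kind of benchmark described above is easy to reproduce. A minimal sketch of the measurement technique follows; `read_voltage` is a hypothetical stand-in, since the real test called the .NET analog-input read against actual hardware:

```python
import time

def read_voltage():
    """Stand-in for a fast driver call; the real benchmark called the
    .NET analog-input read on real hardware."""
    return 0.0

# Count how many calls complete within a one-millisecond window.
deadline = time.perf_counter() + 0.001
calls = 0
while time.perf_counter() < deadline:
    read_voltage()
    calls += 1
print(f"{calls} calls in 1 ms")
```

The exact count varies with the machine, but the point stands: a modern CPU completes thousands of function calls per millisecond, so the call overhead itself cannot explain a 1 ms floor.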
Does anyone know WHERE I can find a reference for this limitation and WHY it is imposed? Is there any way around it?
Thank you very much for taking the time to read this!!
04-11-2014 09:37 AM
04-11-2014 11:13 AM
Thanks Dennis,
I just saw your response. You may notice I also posted a reply on another thread you had participated in called "WriteMultiSamplePort is slow". I posted that before I saw your response here.
I appreciate your taking the time to review and respond to this question. You are right that the software-timed 6009 is not an adequate solution for precise timing, but that doesn't mean it has to be arbitrarily slowed down to 1 ms. As I described in my reply to "WriteMultiSamplePort is slow", the 1 ms delay is a precise delay--not caused by the operating system or the speed of the application code, but by the driver or perhaps the hardware.
Let me explain what I mean by the difference between precise timing and fast output. You are right that I could not use the 6009 to deliver exactly 1 pulse every 10 microseconds over an indefinite amount of time, because the operating system's thread management would periodically interrupt that. However, if I needed to write a program that delivers approximately 100,000 pulses in about 1 second, that would be easily achievable on a modern PC. Sure, some time would be devoted to other threads, and it might take 1.1 seconds to deliver all 100,000 pulses, but it would still happen quickly.
But due to this seemingly arbitrary (but precise) 1 ms delay, it takes 100 seconds to deliver 100,000 pulses -- regardless of how fast the processor is.
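The arithmetic behind those two scenarios, as a quick sketch:

```python
# Per-write time budget needed to emit 100,000 writes in ~1 second,
# versus the total time taken at the observed fixed 1 ms per write.
writes = 100_000
budget_per_write_us = 1_000_000 / writes          # 10 us per write
seconds_at_1ms_each = writes * 0.001              # 100 s total
print(budget_per_write_us, seconds_at_1ms_each)   # 10.0 100.0
```

A 10-microsecond per-call budget is well within what the benchmarks above show the software can do; the 1 ms floor is what stretches the job to 100 seconds.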
Now, you might be right that NI doesn't intend these devices to perform faster than one digital output per millisecond, but if that is the case, WHERE IS THIS STATED? I can't find any specification or reference that discusses this limit. The analog input on this device is limited to 48 kS/s. But what about DIGITAL IO?
04-11-2014 11:55 AM
04-11-2014 01:34 PM
@Dennis_Knutson wrote:
I don't believe the 1000 samples/sec you are seeing is fixed in any way.
Perhaps I explained it better in my reply to "WriteMultiSamplePort is slow". The 1000 S/s rate is a precise, FIXED delay. It is NOT related to the processor speed or communication speed. It is not approximate, nor does it change with CPU load, etc. The Windows operating system provides high-resolution timers that prove the call to the NI-DAQmx driver takes exactly 1000 microseconds (within an error of about 10 microseconds). The only way that is possible is if the driver itself (perhaps in response to a signal from the device) uses a high-resolution spin-wait function or a tight loop to enforce the 1 ms delay.
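The measurement that distinguishes a fixed delay from ordinary overhead can be sketched as follows. `write_port` is a hypothetical stand-in that sleeps 1 ms to emulate the behavior under discussion; in the real test it would be the NI-DAQmx digital write, timed with the OS high-resolution counter:

```python
import statistics
import time

def write_port(value):
    """Stand-in for the digital-write call under test; it sleeps to
    emulate the fixed ~1 ms delay being diagnosed."""
    time.sleep(0.001)

samples_us = []
for _ in range(20):
    t0 = time.perf_counter()
    write_port(0xFF)
    samples_us.append((time.perf_counter() - t0) * 1e6)  # microseconds

# A mean pinned near a constant with a tight spread points to a
# deliberate delay, not scheduling or communication overhead.
print(f"mean={statistics.mean(samples_us):.0f} us "
      f"stdev={statistics.stdev(samples_us):.0f} us")
```

Overhead from scheduling or USB traffic would show a noisy, load-dependent distribution; a delay enforced by a timer shows a narrow spread around a fixed value, which is what I observe.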
@Dennis_Knutson wrote:
Could NI design a board and driver with lower overhead? Possibly but I'm not sure of the motivation when there are existing solutions.
This has nothing to do with overhead. This delay is not caused by any slowdown related to processing speed, and therefore has nothing to do with overhead in the driver. However the delay could be (and probably is) imposed deliberately by the software due to limitations in the hardware. But if this is a limitation of the hardware, that limitation should be included in the specifications.
Of course NI might not have a motivation to increase the digital output speed of this device when faster devices are available, I agree with that. But NI should nonetheless DOCUMENT the hardware limitations of this device.
Saying that a digital port is software timed implies that it can be changed as quickly as software can issue the command to change it. But the behavior I am describing is MUCH MUCH slower than software timing.
I really appreciate your help and interest, Dennis, but you have not answered my most basic question: where can I find official documentation regarding the maximum digital output speed of the hardware and/or the NI-DAQmx driver? The software can issue commands to the device at well over 100,000 commands per second (as I have already demonstrated), but I'll reiterate that there is a FIXED, PRECISE, and most likely DELIBERATE delay of 1 ms in the NI-DAQmx function call that writes digital output. If you require additional proof of this, I will have to attach a retrospective symbolic debugger and/or disassembler to the process, which should be able to locate the specific timing function in the driver that is causing the delay.
04-11-2014 02:11 PM
04-11-2014 02:43 PM
03-10-2015 07:52 AM
Hi all,
I am facing a similar issue with the DO of the USB-6009. In my application, a DO channel just has to switch high for a specified length of time with 1 ms resolution and switch back to low afterwards, so the 1 ms limitation is not really an issue for me. What gives me a headache is the overhead and jitter of the switching, since I switch high and low by writing the corresponding single Boolean value to the channel each time.
I've also tried writing a waveform that contains as many highs as the specified length of time, with a single low at the end, hoping that it would work. But what I then saw on the oscilloscope was that the USB-6009 switches high only for the first millisecond and then goes back to low and stays there, no matter how many highs I put in the digital waveform. As a comparison, I ran the same test on a USB-6341 and it did everything as expected. To me, this behavior could be related to the USB-6009's lack of a hardware timer. However, when I tried to generate a ramp-type waveform using the digital pattern generator VI and write it to the DO channel, I could see on the oscilloscope that the signal toggles between high and low with exactly a 1 ms period and without any jitter or overhead. I am wondering how this is realized and whether it could be made use of to solve my problem, but I haven't been able to work it out yet. Does anyone have an idea about it?
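For reference, the pulse pattern described above (N highs followed by one low) is just a Boolean array; `pulse_waveform` below is a hypothetical helper name, and whether a device plays the whole array back depends on hardware-timed DO support:

```python
def pulse_waveform(high_samples):
    """Build the pattern described above: `high_samples` high states
    followed by a single low. Whether a device plays the whole pattern
    depends on hardware-timed DO support (the USB-6341 has it; the
    USB-6009, with software-timed DIO only, does not)."""
    return [True] * high_samples + [False]

print(pulse_waveform(5))   # [True, True, True, True, True, False]
```

On a software-timed device, each element of such an array still costs one software write, which is why the multi-sample approach did not help on the USB-6009.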
I've called National Instruments technical support and they recommended the USB-621x from the higher-end M Series. Could anyone tell me whether this module is capable of solving my issue? Thanks!