For the change detection function on the PCIe-6509/USB-6509 or similar devices, the manual indicates that:
When an input change occurs, the NI USB-6509 generates an interrupt, and the NI-DAQ driver then notifies the software.
I suppose the hardware interrupt itself is generated very quickly, since it's ASIC-based. But what is the realistic time range for it to pass through the driver and reach the user's software (e.g., a C#/.NET application under low CPU load)? Is the 1 µs level reasonable? Thanks.
This is not a definitive answer, but based on long experience with DAQmx under Windows, I would plan for several tens of microseconds at minimum. I don't think it's realistic to hope for single-digit microseconds consistently.
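One way to get a feel for why single-digit microseconds is optimistic, independent of any NI hardware: even a pure user-space thread wakeup on a general-purpose OS typically takes on the order of tens of microseconds. The sketch below (plain Python, no DAQmx involved — it's a rough proxy for OS scheduling latency, not a measurement of the driver itself) times how long a waiting thread takes to notice an event set by another thread:

```python
# Rough illustration (not NI-specific): measure how long it takes one
# user-space thread to notice an event set by another thread, as a
# proxy for the kind of OS scheduling latency that sits between a
# driver interrupt and a user-mode callback.
import statistics
import threading
import time

def measure_wakeup_latency_us(iterations=200):
    """Return per-iteration event wakeup latencies in microseconds."""
    latencies = []
    for _ in range(iterations):
        evt = threading.Event()
        t_set = [0.0]

        def waiter():
            evt.wait()
            # Latency = time from set() until this thread resumes.
            latencies.append((time.perf_counter() - t_set[0]) * 1e6)

        th = threading.Thread(target=waiter)
        th.start()
        time.sleep(0.001)          # let the waiter block on the event
        t_set[0] = time.perf_counter()
        evt.set()
        th.join()
    return latencies

lat = measure_wakeup_latency_us()
print(f"median wakeup latency: {statistics.median(lat):.1f} us")
```

On a lightly loaded Windows machine the median usually lands in the single to low tens of microseconds — and that's *before* adding the kernel-mode interrupt handling, driver DPC, and .NET event dispatch that a real change-detection notification has to go through, which is why planning for tens of microseconds (and occasional much larger outliers) is the safer bet.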