01-30-2007 09:15 AM
LabVIEW itself has no support for such high timing resolution on a desktop OS, and the OS itself does not guarantee that an application will get it.
To get that kind of resolution you would have to use LabVIEW Real-Time and run the VI on a real-time OS, where you should be able to achieve an accuracy of up to a microsecond (I think).
If you say you can do this in C, you could try writing the function in C and calling it from LabVIEW as a DLL, but be aware that the overhead of the DLL call itself may eat into your timing budget.
01-30-2007 10:20 AM
Ditto what tst already said, but there are some other possibilities IF you have an NI DAQ board. It kinda sounds like you don't...
For anyone else tuning in, one could either:
1. Create a buffered input task with a high sampling rate and then use the DAQmx Read function to pace the loop. Say you sample at 25 kHz and read 5 samples at a time. Your loop will then (try to) run at an average rate of 5 kHz. The OS won't guarantee that you'll always achieve that rate consistently, but this method will tend to catch up for lost cycles.
2. Use a 'Timed Loop' structure where the Timing Source comes from the DAQ board. Timed Loops run at a very high priority and give you better options about how to detect slow loop cycles and what to do about them. This method would probably give you more reliable timing due to the high execution priority, but still no guarantees due to the OS.
-Kevin P.