04-11-2007 08:21 AM
We are currently choosing an NI DAQ card capable of detecting changes in the resistances of a resistor matrix (for example 16x16) at a 1 ms interval. Does anyone know how much time it takes to create the task for one channel, start the task, and stop the task on the PCI-6229 card (LabVIEW 7.1, Windows XP, DAQmx 8.5, Pentium 4 3.2 GHz)?
Does anyone have results from the example (http://zone.ni.com/devzone/cda/epd/p/id/38) that demonstrates how to benchmark the average number of available samples in a single- or multiple-channel analog input buffer?
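For reference, here is a minimal sketch of how such a timing test could be written. Since LabVIEW block diagrams cannot be pasted as text, it uses the NI-DAQmx ANSI C API instead; the channel name "Dev1/ai0" and the voltage range are assumptions, and error checking is omitted for brevity. It simply times the create, start, and stop calls on the host machine.

/* Rough sketch: time DAQmx task create/start/stop for one AI channel.
   Assumes the NI-DAQmx ANSI C API (NIDAQmx.h) and a device named "Dev1";
   adjust the physical channel for your own hardware. */
#include <stdio.h>
#include <windows.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle    task = 0;
    LARGE_INTEGER freq, t0, t1, t2, t3;

    QueryPerformanceFrequency(&freq);

    QueryPerformanceCounter(&t0);
    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);

    QueryPerformanceCounter(&t1);
    DAQmxStartTask(task);

    QueryPerformanceCounter(&t2);
    DAQmxStopTask(task);
    QueryPerformanceCounter(&t3);

    printf("create: %.3f ms\n",
           1000.0 * (t1.QuadPart - t0.QuadPart) / freq.QuadPart);
    printf("start:  %.3f ms\n",
           1000.0 * (t2.QuadPart - t1.QuadPart) / freq.QuadPart);
    printf("stop:   %.3f ms\n",
           1000.0 * (t3.QuadPart - t2.QuadPart) / freq.QuadPart);

    DAQmxClearTask(task);
    return 0;
}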
04-12-2007 10:31 AM
Vpash,
I cannot give you a definitive answer on the amount of time it takes to create, start, and stop a task, as this is system-dependent and can be affected by the amount of code running, other processes on the computer, etc. I have attached the example program, saved back to LabVIEW 7.1 so you can open and run it.
04-12-2007 05:19 PM
Hi David,
Thanks for the reply.
We have not bought the PCI-6229 card yet. Before we do, we need to know its timing parameters. If you have the possibility of testing it in a Windows XP environment, please send us the results.
What interests us most is how much time it takes to create the task for one channel, start the task, and stop the task under Windows XP.
Thanks
Victor
04-13-2007 09:13 AM
Victor,
The amount of time it takes to do this is dependent on the system, along with the sample rate, number of samples, amount of code being run, etc. That said, I used the LabVIEW shipping example 'Acq&Graph Voltage-Int Clk.vi' (slightly modified to calculate the time difference) and have posted a picture of the code and front panel. Running at a rate of 10 kS/s and acquiring 1,000 samples on a 3 GHz Pentium D computer with 2 GB of RAM, the typical time ranged from 108 ms to 113 ms.