Hi Tom,
As you suggested, I tried the "On Demand" AI method in QNX. I have set up the program to run at the system timer interrupt.
With this there is a constant delay between the analog input and the analog output.
However, this delay increases if we increase the sampling rate or the input signal frequency.
Can you suggest any other way to improve the performance?
This is not for a PID or single-point processing loop; we want to implement a real-time application. I tried this to check a performance issue in Linux, as we are not able to access the card from Linux kernel space for data acquisition.
I will check this On Demand method of AI in Linux user space as well.
Thanks for your support.
Regards,
sgm