I have a problem when I call the NI USB-6008 read function from a Python script.
I implemented the read function in C++ and verified that it works when called from C++. When I call it from a Python script, however, the read function always returns 0.0, which is the initial value of the output variable. I use SWIG to bridge Python and C++.
C++ code of the reading function:
mlString task( "AITask" );
TaskHandle handle = 0;
// Channel parameters
const char *chan = "Dev1/ai0";
float64 min = -10.0;
float64 max = 10.0;
// Timing parameters
uInt64 samplesPerChan = 1;
// Data read parameters
int32 pointsToRead = 1;
float64 timeout = 10.0;
float64 data = 0.0;
int32 nrSamps = 0;
int32 rv = 0;
rv = DAQmxBaseCreateTask( task.c_str(), &handle );
rv = DAQmxBaseCreateAIVoltageChan( handle, chan, "", DAQmx_Val_Cfg_Default, min, max, DAQmx_Val_Volts, NULL );
rv = DAQmxBaseStartTask( handle );
rv = DAQmxBaseReadAnalogF64( handle, DAQmx_Val_Auto, timeout, DAQmx_Val_GroupByChannel, &data, 1, &nrSamps, NULL );
rv = DAQmxBaseStopTask( handle );
I get the correct data value when calling this function from C++, but when I call it from a Python script I always get 0.0, the initial value of data.
Has anyone seen a similar problem, and would you be willing to share ideas on how to solve it?
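One thing I suspect is the marshalling of the output pointer. If the SWIG-wrapped read function exposes a float64* out-parameter directly, Python cannot write through it by default, so the caller only ever sees the initial 0.0. A sketch of a SWIG interface with an OUTPUT typemap that might address this — the module name, header, and function signature here are assumptions, not my actual code:

```
/* daq.i -- sketch only; names are hypothetical */
%module daq
%include "typemaps.i"

typedef double float64;
typedef int int32;

/* Map the float64* out-parameter to a Python return value */
%apply double *OUTPUT { float64 *data };

%{
#include "daq_read.h"   /* hypothetical header declaring the read function */
%}

int32 wrapped_read(float64 *data);   /* hypothetical wrapper signature */
```

With the OUTPUT typemap applied, the Python side would receive the measured value as part of the return, e.g. `rv, value = daq.wrapped_read()`.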
Officially, NI does not support questions regarding Python. However, NI provides some information on how to program NI-DAQmx applications using Python.
Please take a look at the following webpage; there you will find some instructions.
I hope this will provide you with a working solution.
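As a side note on why a C out-parameter can come back unchanged in Python: the caller must pass a real pointer that the C code can write through, which plain Python floats cannot provide. A minimal ctypes sketch using the standard C modf function (not the NI API) illustrates the pattern:

```python
import ctypes
import ctypes.util

# modf(x, &ipart) returns the fractional part of x and writes the
# integral part through its double* out-parameter.
libm = ctypes.CDLL(ctypes.util.find_library("m") or None)
libm.modf.restype = ctypes.c_double
libm.modf.argtypes = [ctypes.c_double, ctypes.POINTER(ctypes.c_double)]

ipart = ctypes.c_double(0.0)   # starts at 0.0, like `data` above
frac = libm.modf(2.75, ctypes.byref(ipart))
print(frac, ipart.value)       # 0.75 2.0
```

If you passed a plain float instead of byref(ipart), the C side would write into a temporary and you would keep seeing the initial 0.0 — the same symptom as in the SWIG wrapper.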