Multifunction DAQ


Better without sampleclock?!

Hi there!

I'm trying to program an M-6221 card. Because I want some more analog inputs, I'm trying to add an 8:1 analog multiplexer (MPC508) in front of it.

At first I thought my hardware wasn't good enough: I wasn't able to scan my channels faster than 1 kHz. The problem was that the multiplexer did not switch fast enough between the channels (or didn't switch at all).

But when I disabled my software configuration of the sample clock, suddenly everything worked well. Am I missing something obvious? Can anyone explain this phenomenon to me? I'm writing in VC++.

As soon as I enable the line that configures the sample clock, the multiplexer switches badly and my measurement results are bad.

The writeIntToBinaryArray function just writes a number into an array in binary form:
int writeIntToBinaryArray(uInt8 array[], int lengthofarray, int number, bool beginwithindexzero)
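
A minimal sketch of what such a helper might look like (the bit order and the handling of beginwithindexzero are assumptions, not necessarily the original implementation):

[quote]
// Assumed behavior: write 'number' into 'array' one bit per element,
// LSB first when beginwithindexzero is true, MSB first otherwise.
int writeIntToBinaryArray(uInt8 array[], int lengthofarray, int number, bool beginwithindexzero)
{
    for (int i = 0; i < lengthofarray; i++)
    {
        int bit = (number >> i) & 1;                                // bit i, LSB = bit 0
        int index = beginwithindexzero ? i : (lengthofarray - 1 - i);
        array[index] = (uInt8)bit;
    }
    return lengthofarray;
}
[/quote]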

Here is a code excerpt:

[quote]
    // create the tasks
    DAQmxCreateTask("InputTask", &InputHandle);
    DAQmxCreateTask("OutputTask", &OutputHandle);

    // create the AI channels
    for (i = 0; i < channelcount; i++)
    {
        devicename.Format("/dev1/ai%i", 0*i);
        DAQmxCreateAIVoltageChan(InputHandle, devicename, "", DAQmx_Val_RSE, -10.0, 10.0, DAQmx_Val_Volts, NULL);
    }

    // create the DO channel: three lines driving the multiplexer address inputs
    DAQmxCreateDOChan(OutputHandle, "dev1/port0/line0:2", "", DAQmx_Val_ChanForAllLines);

    // configure the sample clock (enabling these two lines is what breaks the multiplexing)
    //DAQmxCfgSampClkTiming(InputHandle, "", samplerate, DAQmx_Val_Rising, DAQmx_Val_HWTimedSinglePoint, buffersize);
    //DAQmxCfgSampClkTiming(OutputHandle, "", samplerate, DAQmx_Val_Rising, DAQmx_Val_HWTimedSinglePoint, buffersize);

    DAQmxStartTask(InputHandle);
    DAQmxStartTask(OutputHandle);

    while (!_kbhit())
    {
        // set the multiplexer address on the three digital lines
        writeIntToBinaryArray(digitalarray, 3, multiplexin, true);
        DAQmxWriteDigitalLines(OutputHandle, 1, 1, 10.0, DAQmx_Val_GroupByChannel, digitalarray, &digsampswritten, NULL);
        while (digsampswritten < 1);    // the write call blocks, so this never actually spins

        // read the analog samples (timeout -1 = wait indefinitely)
        DAQmxReadAnalogF64(InputHandle, sampstoread, -1, DAQmx_Val_GroupByChannel, analoginputdata, buffersize, &samplecounter, NULL);
        while (samplecounter < sampstoread);    // same here: the read call blocks
        // ...
        // do something with the data
    }
[/quote]
Message 1 of 5
So, for a better explanation of what I'm trying to do:

I want to multiplex 2 of my 16 AI channels, and I'm trying, so far with no luck, to synchronize the digital output with the analog input.
I'm scanning every one of my 16 AI channels, and every time a read is finished I want to step the 3 digital outputs forward by one.

Unfortunately, I can't get it to run with a deterministic sampling rate. Can anyone help me out?
Message 2 of 5

Hi Henrik,
1. You can configure the Convert Clock on the M-Series device directly. By setting the convert clock rate, you control how fast the M-Series device switches between channels of its internal multiplexer. For most applications the default convert clock algorithm works fine, so I am surprised it does not work for you. By default, DAQmx tries to pick the minimum convert clock rate that still provides enough settling time for all channels.
 
Note: DAQmx was overly conservative with the convert clock in versions 7.0-7.3, which would sometimes result in the convert clock rate being slower than necessary. We changed the default convert clock algorithm in DAQmx 7.4 to speed it up. What version of NI-DAQmx are you using? Perhaps you are encountering this issue.
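
As a rough sketch of what I mean in the C API, using the InputHandle from your excerpt (80 kHz is just an example value for 16 channels at 5 kHz; error handling omitted):

[quote]
// Sketch: force the AI convert (channel-switching) clock and read back the
// value DAQmx actually coerced it to.
float64 requestedConvRate = 80000.0;    // 16 channels * 5 kHz, example only
float64 actualConvRate = 0.0;

DAQmxSetAIConvRate(InputHandle, requestedConvRate);
DAQmxGetAIConvRate(InputHandle, &actualConvRate);
printf("AI convert clock coerced to %f Hz\n", actualConvRate);
[/quote]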
 
2. If you want to synchronize AI and digital output, you may want to select the Sample Clock Source on your digital task and explicitly tie it to your analog input task.
 
3. The Microsoft Windows XP OS and Linux have a lot of jitter, so the OS itself can sometimes cause big delays that prevent your control loop from keeping up. Neither OS is a real-time OS. The jitter depends on a lot of factors, including what other programs you are running. On Windows, you can sometimes reduce the jitter by elevating the priority of your process. I don't recommend using Hardware Timed Single Point on Windows or Linux; it works very well in the LabVIEW RT environment.
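
For example, on Windows you could raise the priority of your process with the Win32 API (a sketch; how much this actually helps depends on your system and what else is running):

[quote]
// Sketch: elevate the priority of the current process to reduce (not
// eliminate) scheduling jitter. HIGH_PRIORITY_CLASS can starve other
// applications, so use it with care.
#include <windows.h>

if (!SetPriorityClass(GetCurrentProcess(), HIGH_PRIORITY_CLASS))
{
    // GetLastError() tells you why it failed
}
[/quote]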
 
Hope this helps,
Jonathan


 
Message 3 of 5
Hello Jonathan!

To 1)
Thanks for your reply. I tried setting the convert clock with DAQmxSetAIConvRate (I disabled the DAQmxCfgSampClkTiming call beforehand), but unfortunately it didn't help much. My oscilloscope shows that my digital outputs only switch at about 4 kHz, and that doesn't match my sample rate setting of 5 kHz or an AI convert clock of 80 kHz (16 AI channels * 5 kHz).

Now I saw in the description of the AI convert clock that the driver waits 10 µs per channel before acquiring a sample. That would allow a sample rate of at most 6250 Hz (16 channels * 10 µs = 160 µs => 6250 Hz), less the time needed to read the samples.

[quote]
By default, NI-DAQmx selects the maximum convert rate supported by the device, plus 10 microseconds per channel settling time. Other task settings, such as high channel counts or setting Delay, can result in a faster default convert rate.
[/quote]

Is my device not able to scan 16 channels at 5kHz / channel?

To 2)
Could you give me an example of how to sync the digital output and the analog input?

To 3)
You're right that neither Windows nor Linux is a real-time OS. I see that hardware-timed single point isn't really suited to my measurement, but it has worked fine so far. Unfortunately I don't get a deterministic sample rate, which I need because I want to sample an audio signal of 2 kHz, so I need (and want) a buffered read. But when I configure my sample clock, the digital task no longer switches in sync with my read task.

Message Edited by HenrikJ on 04-28-2006 05:04 AM

Message 4 of 5

1) You should be able to scan 16 channels at 5 kHz. That is just 80 kHz total, which is well within spec for the device. You could double-check by reading back the Sample Rate property to see whether the coerced value matches what you set.
My guess is that the software is not keeping up, so you are only getting updates at about 4 kHz.
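
Something like this would show what the driver actually coerced the rate to (a sketch using the InputHandle from your excerpt, after the timing has been configured):

[quote]
// Sketch: read back the coerced AI sample clock rate to verify the hardware
// is really running at the requested 5 kHz per channel.
float64 coercedSampleRate = 0.0;
DAQmxGetSampClkRate(InputHandle, &coercedSampleRate);
printf("coerced sample clock rate: %f Hz\n", coercedSampleRate);
[/quote]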
 
2) To synchronize buffered analog and correlated digital output, you should set your digital task's Sample Clock Source to the Analog Sample Clock terminal. 
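
A sketch of what that could look like in the C API (the device name "Dev1", the rate, and the buffer sizes are placeholders; error handling omitted). As far as I know, the correlated DIO on M-Series has no sample clock of its own, so borrowing the AI sample clock like this is the usual approach:

[quote]
// Sketch: clock the digital output task from the AI sample clock so that one
// digital update (one multiplexer address) is produced per AI sample clock tick.
#include <NIDAQmx.h>

TaskHandle InputHandle = 0, OutputHandle = 0;
float64 samplerate = 5000.0;           // per-channel AI rate
uInt64  aiBufferSize = 1000;           // AI buffer size in samples per channel
uInt64  patternLength = 8;             // one full multiplexer address cycle (0..7)

// Buffered analog input task, 16 channels, internal sample clock.
DAQmxCreateTask("InputTask", &InputHandle);
DAQmxCreateAIVoltageChan(InputHandle, "Dev1/ai0:15", "", DAQmx_Val_RSE,
                         -10.0, 10.0, DAQmx_Val_Volts, NULL);
DAQmxCfgSampClkTiming(InputHandle, "", samplerate, DAQmx_Val_Rising,
                      DAQmx_Val_ContSamps, aiBufferSize);

// Digital output task on the three multiplexer address lines, clocked by the
// AI sample clock terminal instead of by software calls in the loop.
DAQmxCreateTask("OutputTask", &OutputHandle);
DAQmxCreateDOChan(OutputHandle, "Dev1/port0/line0:2", "", DAQmx_Val_ChanForAllLines);
DAQmxCfgSampClkTiming(OutputHandle, "/Dev1/ai/SampleClock", samplerate,
                      DAQmx_Val_Rising, DAQmx_Val_ContSamps, patternLength);

// Preload one full address cycle (0..7). With regeneration enabled (the
// default) the pattern repeats forever, advancing the multiplexer by one
// address on every AI sample clock edge.
uInt8 pattern[3 * 8];                  // 3 lines * 8 samples, grouped by channel
int32 written = 0;
for (int s = 0; s < 8; s++)
{
    pattern[0 * 8 + s] = (uInt8)( s       & 1);   // line0 = address bit 0
    pattern[1 * 8 + s] = (uInt8)((s >> 1) & 1);   // line1 = address bit 1
    pattern[2 * 8 + s] = (uInt8)((s >> 2) & 1);   // line2 = address bit 2
}
DAQmxWriteDigitalLines(OutputHandle, 8, 0, 10.0, DAQmx_Val_GroupByChannel,
                       pattern, &written, NULL);

// Start the digital task first so it is armed and waiting for AI clock edges,
// then start the analog input, which also starts clocking the digital output.
DAQmxStartTask(OutputHandle);
DAQmxStartTask(InputHandle);
[/quote]

If you want the multiplexer to advance only after each block of 16 samples rather than on every sample clock tick, you could instead derive a slower clock (for example from a counter output) and use that as the digital Sample Clock Source; the idea is the same, the digital updates are driven by hardware timing rather than by software.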
 
Hope this helps,
Jonathan
 
Message 5 of 5