Multifunction DAQ


NI-6351: Synchronize AO and AI with oversampling

Hello

 

I have an NI 6351 and am trying to build the following application:

 

An AO channel outputs a ramp from 2.2 V down to 0 V with a variable ramp time and sample rate. For example, it can go from the maximum to the minimum voltage in 10 ms using 1000 samples, which results in a sample rate of 100 kS/s.

Simultaneously, I want to sample two analog input channels at a higher sample rate. The sampling must be synchronized with AO, so that for each AO value I get 2-4 AI values. The timing should be as follows:

 

AO outputs a value => the AIs wait a few µs for the voltage to settle and read 2-4 samples (depending on the oversampling setting), then AO outputs a new value, and so on...

 

The whole process is triggered by an analog trigger input (APFI0).

So far so good. My design works only if the AI and AO sample rates are equal. If they are different, I get glitches and overall very strange results.

 

My idea was to use one clock source (for example, the 20 MHz timebase) and derive different sample clocks from it using a TimebaseDivider property node. But somehow it didn't work.


Can someone please help me? Thank you.

Message 1 of 7

I'm still on LV 2016 and can't look at the code.

 

The approach I'd take would look like this:

 

1. Use a DAQmx Timing property node to configure the "AI Convert Clock".  This is the multiplexing clock used to time the individual channels within one sample.  (See below).  You can set a delay value and a clock rate.

 

2. Configure your AI channel list to repeat the same channels 2-4 times.  You now have a multiplexed task where you end up doing 2-4 AI conversions per channel per sample clock.

 

3. Configure the AI task to use the AO sample clock as its own sample clock.  (A sketch of steps 1-3 follows after this list.)

 

4. I'm not quite sure how all this fits with the need to start the process with an analog trigger, but if you've got that working the way you want it now, I don't think you should need to change anything.
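Putting steps 1-3 together in text form: a minimal sketch using the nidaqmx Python API, since I can't post runnable LabVIEW here. The device name Dev1, the channel numbers, the rates, and the 2 µs settle delay are placeholder assumptions; the same settings exist as DAQmx Timing property nodes in LabVIEW.

```python
# Sketch of steps 1-3 (nidaqmx Python API; device/channel names are assumptions).
import nidaqmx
from nidaqmx.constants import AcquisitionType, DigitalWidthUnits

ai = nidaqmx.Task()
ao = nidaqmx.Task()

# Step 2: repeat each physical channel in the list (unique virtual names required),
# so every sample clock tick performs several A/D conversions per physical channel.
for i in range(4):
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0", name_to_assign_to_channel=f"ai0_rep{i}")
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai1", name_to_assign_to_channel=f"ai1_rep{i}")

# Step 3: AO generates its own clock; AI runs off the AO sample clock.
ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
ao.timing.cfg_samp_clk_timing(rate=100e3, sample_mode=AcquisitionType.FINITE,
                              samps_per_chan=1000)
ai.timing.cfg_samp_clk_timing(rate=100e3, source="/Dev1/ao/SampleClock",
                              sample_mode=AcquisitionType.FINITE, samps_per_chan=1000)

# Step 1: explicit AI Convert Clock rate plus a settling delay after each AO update.
ai.timing.ai_conv_rate = 1.0e6                              # 1 MHz multiplexing clock
ai.timing.delay_from_samp_clk_delay_units = DigitalWidthUnits.SECONDS
ai.timing.delay_from_samp_clk_delay = 2e-6                  # wait 2 µs to settle
```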

 

 

-Kevin P

 

[Attachment: AI convert clock settings.png]

Message 2 of 7

Hello, thank you for the reply.

 

I converted my project to an older version, so you can take a look at my chaotic draft, which works as-is for now, but without the oversampling feature.

 

If I set the AI sample clock source to AO/SampleClock, the triggering doesn't work. I guess that's because the AO clock only starts after the trigger, and without a clock the task can't detect the trigger.

If I try to alter the AI timing after the trigger, I get an error because the task is already running.

I also tried putting the trigger on the AO task, but it still doesn't work properly...

Message 3 of 7

Somewhere amongst you, me, and your code there's some confusion.

 

I don't fully follow what you really need in terms of timing, triggering, and sync between your AO and AI tasks.  The code you've posted and the text you've written aren't quite making it clear to me. 

 

1. For each AI sample, do you need to be able to know what the generated AO signal was at that instant?   

   If so, then you need a scheme to correlate the data.  Often, this means that you want the AI and AO tasks synchronized by means of hardware timing signals.

 

2. Your use of an analog start trigger seems to be at cross purposes with this.  I suspect that you could get the data you need by running the tasks without any such trigger, and then optionally post-processing to find the same trigger conditions in the data if you need to "filter out" segments that don't meet your criteria.

 

3. To sync and correlate, I'd think that you'd want to use the calculated AO sample rate to determine the AI sample rate rather than letting both be independently manipulated on the front panel.

 

4. You define both AI and AO as finite sampling tasks with a specific # of samples when you call DAQmx Timing.vi, but then read and write a different # of samples when you call DAQmx Read.vi and DAQmx Write.vi.

 

5. You'll probably want to do your DAQmx Read *after* calling DAQmx Wait Until Done, because you'll probably also want to read more than the 20 samples that are currently hardcoded.  (See the short sketch after this list.)

 

6. I like your method for setting an initial AO value before configuring timing.  I'm gonna use that myself.  (I've been using explicit starts and stops; your way is much more compact.)

 

 

7. What's the physical meaning of the signal you generate and the ones you measure?  What's the phenomenon that you want to trigger off of?  Is it one of the signals you're measuring or is it some other process signal that you want only as a trigger but you don't need its time history?

 

8. Do you want both the AO and the AI tasks to be waiting, doing no sampling, until this trigger condition occurs?   If so, then I think I'd still point you toward my previous post in msg #2.   It's possible it may matter whether you drive the AO task with the AI sample clock or vice versa, but I kinda think it should be ok either way.
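To illustrate point 5, here's the call order I have in mind, continuing the Python-style sketch from msg #2 (the ramp values and sample counts are placeholders):

```python
# Finite-task sequencing: write the whole buffer, start, wait, then read it all.
ramp = [2.2 * (1 - i / 999) for i in range(1000)]   # 2.2 V down to 0 V, 1000 points
ao.write(ramp, auto_start=False)    # preload the full ramp before starting
ao.start()
ai.start()
ai.wait_until_done(timeout=10.0)    # block until the finite acquisition completes
data = ai.read(number_of_samples_per_channel=1000)  # all samples, not a hardcoded 20
```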

 

 

-Kevin P

Message 4 of 7

Hi again,

So here is my application in a nutshell:

 

I want to measure the IV curve of a solar cell during a very short (ca. 5 ms) light pulse.

A photodiode acts as the analog trigger signal. When the flash of light occurs, it triggers the whole process: AO outputs a voltage ramp across the solar cell and a current-sense resistor connected in series. For each voltage point, AI 1 samples the voltage across the resistor (=> the current through the solar cell) and AI 2 measures the actual voltage across the solar cell.

By combining the measured voltage and current, which are correlated in time, I can reconstruct the IV curve.

 

My idea was to oversample the analog inputs (measure voltage and current multiple times and average the values) to increase precision.

 

The problem is that if I use different sample rates for AI and AO, I get very weird results (probably because the clocks are not synchronized). If I use your suggested method (setting the AI Convert Rate), I get errors about the clock being too slow and not enough samples (the program just waits until timeout). It only works if the number of requested samples for AI and AO is the same.

 

There must be a simple solution: use one clock (the AO sample clock) as a master and drive the AIs with a synchronized clock that is X times faster than the AO sample clock.

I have built something like this many times on an FPGA using VHDL; it shouldn't be that hard 🙂

 

 

 

Message 5 of 7

There's more than one way to move forward on this.   Whether these are "simple" is in the eye of the beholder.

 

1. The cop-out approach.  Don't worry about syncing the AO and AI tasks.  Instead, just wire the AO signal into a 3rd AI channel and *measure* it along with the current and voltage.  (See the sketch after this list.)

 

2. My AI Convert Clock idea.  Let's try harder.  It *can* work, we just need to configure the task with a compatible set of parameters.

 

3. Your original notion of deriving sample clocks from a common timebase.  This can also be done.  In a way, it always *is* being done because a given DAQ board has one master timebase used to derive all the other clocks.  There's not a real advantage to setting this up explicitly unless you had some kind of super-awesome high accuracy timebase signal.  (Not joking, was just in a related thread yesterday.)
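Approach #1 in sketch form (nidaqmx Python; the channel assignments are assumptions, with ai2 physically wired to the AO output):

```python
# The "cop-out": acquire the AO signal on a spare AI channel, so all three
# waveforms share the AI task's clock and no cross-task sync is needed at all.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ai:
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # current-sense voltage
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai1")   # solar cell voltage
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai2")   # loopback of the AO ramp
    ai.timing.cfg_samp_clk_timing(rate=300e3, sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=3000)
    # DAQmx Read auto-starts the task; 3 channels at 300 kHz stays under the
    # board's 1 MHz aggregate limit.
    data = ai.read(number_of_samples_per_channel=3000, timeout=10.0)
```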

 

Your wish for a way to have AI derive a sample clock from AO but to run it at some multiple of the AO frequency is not possible in literal terms.  My AI Convert Clock idea is, in a way, a kind of sneaky attempt to accomplish something similar.  But it depends on you the programmer to make sure the config parameters are compatible and within board specs.

 

It *would* be possible to have AO derive a sample clock by dividing down the AI sample clock, but that would tend to mess up the phasing.  To illustrate, suppose you wanted to divide by 4.  Every two AI samples, the divided-down AO sample clock would toggle its state.  So your first 2 AI samples happen *before* your first AO value is generated and your next 2 AI samples happen *after*.  You *could* make the AO task react to the *trailing* edge of the divided-down clock, I guess.  Then the 1st AO sample would happen after the first 4 AI samples but before the second 4.
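For completeness, the usual way to build that divided-down clock is a counter task, with AO clocked off the counter's output. A minimal sketch (nidaqmx Python; Dev1/ctr0, the rates, and the divide-by-4 ratio are assumptions):

```python
# Divide the AI sample clock by 4 with a counter; clock AO from the counter output.
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

co = nidaqmx.Task()
# 2 source ticks low + 2 ticks high = one output cycle per 4 AI sample clock ticks.
co.co_channels.add_co_pulse_chan_ticks("Dev1/ctr0",
                                       source_terminal="/Dev1/ai/SampleClock",
                                       low_ticks=2, high_ticks=2)
co.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)
co.start()

ao = nidaqmx.Task()
ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
# Use the *falling* (trailing) edge to get the phasing described above.
ao.timing.cfg_samp_clk_timing(rate=25e3, source="/Dev1/Ctr0InternalOutput",
                              active_edge=Edge.FALLING,
                              sample_mode=AcquisitionType.FINITE, samps_per_chan=1000)
```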

 

But again, I still think the AI Convert Clock method will be the easier way to go overall.  Let's work out the parameters.

 

Suppose you want each AO sample to produce 4 measurements each of current and voltage.  In DAQmx terminology, that'll mean 8 convert clock cycles per AI "sample".

 

To support up to 100 kHz AO sampling, we need to fit 8 convert clock cycles plus any programmable "delay from sample clock" time into a 10 microsec window.  Your board specs out at 1 MHz aggregate for multi-channel sampling, suggesting that convert clock cycles need to be about 1 microsec minimum.  So you've got at least 8 microsec used up for the 8 A/D conversions and your delay will need to be 2 microsec or less.
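A quick sanity check on that arithmetic (plain Python; the 1 MHz aggregate figure is from the board's multi-channel spec):

```python
def settle_delay_budget(sample_rate_hz, conversions_per_sample, conv_rate_hz=1.0e6):
    """Time left over for 'delay from sample clock' once the conversions are done."""
    window = 1.0 / sample_rate_hz                         # one sample clock period
    conversion_time = conversions_per_sample / conv_rate_hz
    return window - conversion_time

# 100 kHz sampling, 4 readings each of 2 channels = 8 conversions per sample:
print(settle_delay_budget(100e3, 8))   # 2e-06 -> at most 2 µs of settling delay
```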

 

WAITAMINUTE!

 

Actually, as I think about your overall app, maybe you don't need to mess with the AI Convert Clock explicitly at all.  Just creating a channel list where both the current and voltage terminals are repeated multiple times may be enough.  DAQmx can figure out the convert clock stuff automatically.

 

If you just make a channel list that alternates the voltage and current measurement no more than 5 times each, you should be able to operate all the way up to 100 kHz sample rates for AO and AI running off the same shared AI sample clock.  Any 1 AI "sample" would contain up to 5 A/D conversions each of voltage and current.  You could average them, or do more sophisticated post-processing things like looking for trends from the 1st to the 5th conversion in each sample (to investigate response settling effects).

 

One other important thing you *need* to do is make sure you sequence the AO task to start *before* the AI task.  This will make sure it's already started and waiting for AI sample clock pulses before the AI task can detect the trigger condition and start generating them.
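Here's how the whole recommended setup hangs together, as a sketch in the nidaqmx Python API rather than your actual LabVIEW code. The device name, channels, trigger level, and sample counts are all assumed; adjust them to your wiring.

```python
# Repeated channel list + shared AI sample clock + AO armed before AI.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge, Slope

SAMPS = 1000
ai = nidaqmx.Task()
ao = nidaqmx.Task()

# Alternate current/voltage channels 5 times each; DAQmx picks a legal convert
# clock automatically for the 10-conversion multiplexed "sample".
for i in range(5):
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0", name_to_assign_to_channel=f"current_{i}")
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai1", name_to_assign_to_channel=f"voltage_{i}")
ai.timing.cfg_samp_clk_timing(rate=100e3, sample_mode=AcquisitionType.FINITE,
                              samps_per_chan=SAMPS)
# Analog start trigger from the photodiode on APFI0 (level is an assumption).
ai.triggers.start_trigger.cfg_anlg_edge_start_trig("APFI0", trigger_slope=Slope.RISING,
                                                   trigger_level=1.0)

# AO runs off the *AI* sample clock: one ramp step per multiplexed AI sample.
ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
ao.timing.cfg_samp_clk_timing(rate=100e3, source="/Dev1/ai/SampleClock",
                              sample_mode=AcquisitionType.FINITE, samps_per_chan=SAMPS)
ao.write(np.linspace(2.2, 0.0, SAMPS), auto_start=False)

ao.start()   # AO first: armed and waiting for AI sample clock pulses...
ai.start()   # ...then AI, which produces that clock once the trigger fires.
ai.wait_until_done(timeout=30.0)
data = np.array(ai.read(number_of_samples_per_channel=SAMPS))   # shape (10, SAMPS)
avg_current = data[0::2].mean(axis=0)   # average the 5 repeated current readings
avg_voltage = data[1::2].mean(axis=0)   # average the 5 repeated voltage readings
```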

 

 

-Kevin P

Message 6 of 7

Hi,

I solved the problem.

50% of the problem was in the software, 50% was in front of the PC 🙂

 

The solution was to leave the AO and AI sample clock sources in peace (they are indeed synchronized by default), remove the sequence structure, and use a digital trigger (with the analog trigger terminal as its source) to start AO.

I had tried this solution before, but still encountered glitches in my measurements, which I unfortunately blamed on the clocks. But those glitches had nothing to do with the clocks: when I reduce the AO sample clock rate, the voltage steps get bigger, and voltage transient effects come into play...
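Roughly, the working configuration looks like this; sketched in the nidaqmx Python API rather than LabVIEW, and with one plausible reading of the trigger wiring (analog trigger on the AI task, AO started by a digital edge on AI's start trigger terminal). The device name, rates, and trigger level are illustrative.

```python
# Working configuration (hedged interpretation): independent sample clocks, both
# derived from the same onboard timebase; analog trigger on AI; AO started via a
# digital edge sourced from the AI task's start trigger terminal.
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge, Slope

ai = nidaqmx.Task()
ao = nidaqmx.Task()

ai.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")           # current + voltage
ai.timing.cfg_samp_clk_timing(rate=400e3, sample_mode=AcquisitionType.FINITE,
                              samps_per_chan=4000)         # 4 AI samples per AO step
ai.triggers.start_trigger.cfg_anlg_edge_start_trig("APFI0",
                                                   trigger_slope=Slope.RISING,
                                                   trigger_level=1.0)  # photodiode

ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
ao.timing.cfg_samp_clk_timing(rate=100e3, sample_mode=AcquisitionType.FINITE,
                              samps_per_chan=1000)
ao.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/ai/StartTrigger",
                                                  trigger_edge=Edge.RISING)
```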

 

Well, at least I have learned a lot about the timing stuff now. Thank you for the help.

 

 

Message 7 of 7