
Multifunction DAQ


NI USB-6210 synchronising sample clock with counter output

Solved!

Hi, 

 

I am trying to modulate a light source with a counter output and measure the photodetector voltage (AI) continuously in both the on and off states. I have also connected the counter output to a digital input to synchronise the AI voltage acquisition. Acquisition starts at the falling edge, which works fine. But I see that, over time, the sample clock and the counter clock are no longer in sync. Any help will be highly appreciated.

 

Code and the snippet are attached.

Message 1 of 10

Code attached

Message 2 of 10

The source input of your DAQmx Sample Clock Timing VI for AI is left unwired. It is generating its sample clock using the onboard clock by default and hence not synchronized with the counter.

 

If you want to synchronize the output pulse with the AI acquisition, you can just export the internal Analog Input Sample Clock.

 

Otherwise, refer to Synchronize Analog Input and Buffered Duty Cycle/Frequency Counter Measurements for the proper way to synchronize analog and counter tasks.
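
For anyone following along outside LabVIEW, a minimal sketch of this suggestion using the nidaqmx Python API might look like the following. The device name Dev1, channel ai0, terminal PFI4, and the 200 kS/s rate are assumptions, not values taken from the attached code.

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")       # assumed channel
    ai_task.timing.cfg_samp_clk_timing(
        rate=200e3,                                            # assumed AI rate
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=10000,
    )
    # Export the internal AI sample clock to a PFI terminal so another
    # task (or external hardware) can be clocked from it.
    ai_task.export_signals.samp_clk_output_term = "/Dev1/PFI4"  # assumed terminal
    ai_task.start()
    data = ai_task.read(number_of_samples_per_channel=1000)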

-------------------------------------------------------
Control Lead | Intelline Inc
Message 3 of 10

DAQmx test2.png

Thanks ZYOng. I had a look and tried to implement the idea. The ADC sampling frequency needs to be much higher than the light modulation frequency, so I couldn't directly use the same source. I tried using one counter at the modulation frequency and another counter at the sampling frequency, but I am having the same issue.

Message 4 of 10

I can't really understand what you're trying to sync, exactly.

 

I see you generating a continuous constant frequency pulse train to drive your LED.  And I see you running a continuous AI acquisition task to collect data while the pulse train is being generated.

 

What exact timing relationship do you want to establish between AI samples and pulses out to the LED?  For example, suppose you wanted to take 60 samples during the time the LED is on and then another 40 during the time it's off.  You could do that by configuring your counter output to use the AI Sample Clock as the timebase for deriving a pulse train with 60 Ticks High and 40 Ticks Low (assuming the LED is on when the pulse output is High).   It could look something like this:

 

Kevin_Price_0-1715096320308.png

 

Then you would start the counter task first (before the AI task has started its sample clock) and the AI task last.   Note that I've shown the case where "Initial Delay" is set equal to "Low Ticks" so that the initial pulse has the same timing as all subsequent ones, something that is often desired.  But you may find that you want to change the "Initial Delay" value to set a particular offset from when you start AI sampling until you first turn the LED on.

 

 

-Kevin P

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 5 of 10

Hi Kevin, thank you for the reply. 

I tried to implement your solution, but it still behaves the same way. Here is a screenshot. Please correct me if I have misunderstood the suggestion.

 

SayakG_0-1715272618966.png

 

Here is a cartoon of what I am looking for. I want the acquired data to start exactly at the falling edge without sacrificing any samples. Hopefully it is clearer now.

 

SayakG_1-1715272819474.png

Thanks in advance...

Message 6 of 10

Your most recent post talks about triggering the AI capture from the falling edge of the counter output.  There are several problems with such an approach.

 

1. Your timing diagram shows signs of being overconstrained.  The time needed to acquire 1 kSamp at 200 kHz consumes every nanosecond of the time between 200 Hz counter falling edges, leaving no time to rearm the trigger circuitry for the next falling edge.

 

2. Also, it ends up being a moot point anyway since the 6210 doesn't support AI retriggering (as far as I know).

 

3. There's a workaround where you'd use your counters to generate a retriggerable finite pulse train that AI could use for a sample clock, but that would consume both your counters.

 

I'm not saying there's NO path forward, but I can't think of a particularly easy one.  Perhaps you can talk through the signal and timing functionality and requirements to help indicate where there's a little room for compromise?

 

On the OTHER hand, approaching it more like I talked about in msg #5 should work out fairly easily.  Here are the changes you'd need to make to the code in the latest screencap:

 

4. REMOVE your "CO Pulse Freq" function call and replace it with the "CO Pulse Ticks" function call, along with inputs wired in.

 

5. Change the Ticks value to 500 each for High, Low, Initial Delay.  That will set the CO output freq to be 1/1000 of the AI sample clock rate.

 

6. Use dataflow to make sure this CO Pulse Ticks task is started *before* you start the AI task.   Your error wire can help make this happen.

 

There may be a few more little touch-up things, but that ought to at least get you pretty close.
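
A rough text-based sketch of changes 4-6, written against the nidaqmx Python API rather than LabVIEW, could look like the example below. The device and channel names (Dev1, ai0, ctr0) are assumptions, and the 200 kS/s AI rate is only an example.

import nidaqmx
from nidaqmx.constants import AcquisitionType

ai_task = nidaqmx.Task()
co_task = nidaqmx.Task()

ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
ai_task.timing.cfg_samp_clk_timing(
    rate=200e3,                                    # example AI sample rate
    sample_mode=AcquisitionType.CONTINUOUS,
    samps_per_chan=10000,
)

# Items 4 and 5: a CO Pulse *Ticks* channel clocked by the AI sample clock,
# with 500 ticks high / 500 ticks low / 500 ticks initial delay, so the
# counter output runs at exactly 1/1000 of the AI sample rate.
co_task.co_channels.add_co_pulse_chan_ticks(
    "Dev1/ctr0",
    source_terminal="/Dev1/ai/SampleClock",
    high_ticks=500,
    low_ticks=500,
    initial_delay=500,
)
co_task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)

# Item 6: start the counter first; it only begins pulsing once the AI task
# starts producing its sample clock.
co_task.start()
ai_task.start()

data = ai_task.read(number_of_samples_per_channel=1000)

co_task.close()
ai_task.close()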

 

 

-Kevin P

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 7 of 10

Thank you all for your help and responses. This is what I found causing the timing issue. 

 

Origin of the problem:

This device can't use arbitrary sampling rates. Although it accepts any value within its limits and doesn't show any error or warning, it internally changes the sampling rate. The actual rate is based on the sample clock timebase divisor, which is an integer: the device takes its 20 MHz onboard clock and divides it by an integer to get the achievable sampling frequency closest to the requested one. For example, if I try to set 240 kS/s, it internally sets the sampling rate to 240963.8554 S/s, which is 20 MHz divided by 83. We can read these values using the timing property node.
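
In case it helps anyone reproducing this outside LabVIEW, a small sketch of reading these values with the nidaqmx Python API might look like the following (the channel name Dev1/ai0 is an assumption):

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")      # assumed channel
    ai_task.timing.cfg_samp_clk_timing(
        rate=240e3,                                           # requested 240 kS/s
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=10000,
    )
    print(ai_task.timing.samp_clk_rate)           # coerced rate, ~240963.8554 S/s
    print(ai_task.timing.samp_clk_timebase_rate)  # 20 MHz onboard timebase
    print(ai_task.timing.samp_clk_timebase_div)   # integer divisor, 83 in this case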

 

Solution / workaround:

I can read this effective sampling rate and set it as the requested sampling rate. But I have noticed that, with this setup, the data still shifts slowly with respect to the counter, so this didn't work 100% for me.

Alternatively, I tweaked the counter's set frequency using this information (with a small calculation), which works better for most frequencies. So in the above example, if I set a sampling rate of 240 kS/s with a 250 Hz counter frequency, the calculation gives a counter frequency of 251.004016064257 Hz, which stays stable over time.
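
One plausible reading of that "small calculation" (an assumption on my part, not taken from the attached code) is to keep the number of AI samples per counter period fixed and rescale the counter frequency by the same factor the AI rate was coerced by, which reproduces the number quoted above:

requested_ai_rate  = 240e3            # S/s, as requested
requested_ctr_freq = 250.0            # Hz, as requested
coerced_ai_rate    = 20e6 / 83        # 240963.8554... S/s (divisor = 83)

samples_per_period = requested_ai_rate / requested_ctr_freq   # 960 samples
adjusted_ctr_freq  = coerced_ai_rate / samples_per_period
print(adjusted_ctr_freq)              # 251.004016064257... Hz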

 

This solution works for me now, but I should mention that there are still limitations to this approach. It still doesn't work well with all counter frequencies, so there must be a relation between the sampling frequency and the counter frequency that has to be satisfied. Based on this information, if someone can help make it work at any frequency, or give the condition under which it will work without drift, that would be helpful.

 

SayakG_0-1716547160887.png

SayakG_1-1716547213475.png

Message 8 of 10
Solution
Accepted by topic author SayakG

I'm still not sure exactly what you're after, but I think you're closer than you realize.  You just need to make the mental shift from floating point frequencies to integer intervals.  For example, you already mentioned the timebase divisor property used for calculating a valid sample rate.  That "divisor" is an integer value that tells you how many 20 MHz *intervals* there are in one AI sample clock interval.  Another way to look at it is that your AI sampling period is that same integer times 50 nanosec.

 

Similarly, you can define your counter output rate in terms of integer Ticks, where Ticks are just integer #'s of intervals of the device's internal master timebase.   For your M-series device, this is probably 80 MHz.  (One of my earlier responses suggested defining your counter output in terms of Ticks.)  So you can only generate counter periods that are integer multiples of 12.5 nanosec.

 

The last consideration is that you further need to make sure your total counter output period (Low Ticks + High Ticks) is a multiple of 4.  That makes sure that an integer # of counter periods will fit exactly within one AI sampling period.

 

Once you approach all these things by thinking in terms of integer values of intervals, it'll be more clear how to make things line up *exactly* rather than the almost-but-not-quite results you keep getting when trying to match floating point frequencies.  What you'll find is that the cases that didn't line up perfectly were ones where the counter period fell in between valid AI sampling periods.  When you constrain it as described above, it'll retain whatever "sync" you start with.

 

The snippet below illustrates a way to get started on this constraint.  This example makes sure the counter freq is at least as high as you request.  You can tweak the rounding choices if you want to constrain differently. 
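
Since the snippet itself is a LabVIEW image, here is a rough Python transliteration of the same constraint as I understand it (the exact rounding details are assumptions and can be tweaked, as noted above): the total period in 80 MHz ticks is rounded down so the resulting frequency is at least the requested one, and forced to a multiple of 4 so it is also a whole number of 50 ns AI-timebase intervals.

TIMEBASE_HZ = 80e6                    # counter master timebase on this device

def constrain_counter_freq(requested_freq_hz, duty_cycle=0.5):
    total_ticks = int(TIMEBASE_HZ // requested_freq_hz)   # round period down
    total_ticks -= total_ticks % 4                        # force multiple of 4
    high_ticks = max(2, int(total_ticks * duty_cycle))    # DAQmx needs >= 2 ticks
    low_ticks = max(2, total_ticks - high_ticks)
    actual_freq = TIMEBASE_HZ / (high_ticks + low_ticks)
    return high_ticks, low_ticks, actual_freq             # actual >= requested

print(constrain_counter_freq(250.0))   # (160000, 160000, 250.0)
print(constrain_counter_freq(251.0))   # coerced slightly above 251 Hz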

 

 

-Kevin P

 

constrain counter freq.png

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 9 of 10

Thanks Kevin. It helped me a lot in understanding the device, and now I have a formula that works for my setup. I also had a small mistake in the code which made things worse. But now I am happy with the solution, and I am a little more knowledgeable about DAQmx 🙂  Thank you...

Message 10 of 10