
Gradual drift on the X-axis whilst using the DAQmx sub-VIs

Solved!
Go to solution

Hi,

 

I am creating a program to read in data from a detector. However, the waveform gradually shifts in phase, even though the acquisition has been triggered with the DAQmx Trigger VI.

 

The data I am acquiring is a waveform. It takes about 40 seconds for the drift to work through one full waveform period.

 

I do not understand where this is coming from. When I view the data on an oscilloscope, the waveform is stable. I have tried a different DAQ card, which produces the same problem.

 

Could anyone help me with this?

 

See below for the setup that I am using.

 

 

Ulas_0-1620994745739.png

 

Message 1 of 14

You've given us a picture of your code; an actual VI would've been better, but the code is simple enough and nothing jumps out as unusual.

 

But you talk about a waveform that shifts through a full phase in about 40 seconds, yet you don't give us anything that shows the waveform you are talking about!

 

What kind of help are you expecting?

Message 2 of 14

Hi RavensFan,

 

You are right; my post was lacking information. I have attached the VI to this reply, along with several screenshots of the VI running at different timestamps. You can see the measurement time in the screenshots.

 

See attached:

- the VI

- 5 screenshots of the graph, taken from t = 0 s to t = 40 s in 10-second increments

Message 3 of 14

What detector are you trying to measure?

 

It looks like you have a noisy signal that somewhat resembles a sine wave. You have approximately one full period in a graph that is 0.001 seconds long, so the signal appears to be approximately 1000 Hz.

 

What is your trigger signal and what does that look like?

 

If your trigger signal is as noisy as your sine wave-like signal, then perhaps the noise is causing it to trigger at slightly different times.

 

Also, 40 seconds is a long time to watch a 1000 Hz signal. Is there any chance that your signal is not exactly 1000 Hz but, say, 1000.01 Hz, so that the real signal is actually drifting in phase relative to your trigger signal?

 

Also, your DAQ task is set up for Continuous Samples, so it starts once upon that first digital edge and runs continuously until you stop it. It is not re-triggering on each and every DAQmx Read; you are getting a continuous flow of data once the task starts.
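To make that behaviour concrete, here is a minimal sketch of the same configuration using NI's nidaqmx Python API instead of the G-code VIs (the device name Dev1, channel ai0, and trigger terminal PFI0 are placeholder assumptions, not taken from the poster's VI): the digital edge is consumed exactly once to start the task, and every later read just pulls more of the same uninterrupted stream.

import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")       # detector input (placeholder name)
    task.timing.cfg_samp_clk_timing(
        rate=400_000,                                       # 400 kS/s, as in this thread
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=400_000)                             # buffer size hint
    # The start trigger is evaluated once; it only gates the start of the task.
    task.triggers.start_trigger.cfg_dig_edge_start_trig(
        "/Dev1/PFI0", trigger_edge=Edge.RISING)

    task.start()
    for _ in range(10):
        # Each read returns the next chunk of one continuous acquisition;
        # phase is locked only to the very first trigger edge.
        chunk = task.read(number_of_samples_per_channel=400)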

Message 4 of 14

The detector is a VIGO System HgCdTe detector (an infrared photon detector).

 

The trigger signal is a square wave coming directly from the function generator that also controls the waveform itself.

 

The signal is exactly 1 kHz. I can confirm this because I have connected the signal to an oscilloscope and triggered on it there as well; it was shown as a clean 1000 Hz signal.

 

I have included the setup in this message.

 

In the PNG you can see that the function generator output is mixed with a current, providing a triangular current ramp. This in turn is fed to the QCL (Quantum Cascade Laser), whose beam then goes through a gas cell and enters the detector.

The detector sends this signal to the USB-6216 DAQ to be read by the VI.

The function generator also sends a square wave, synchronized with the triangle wave, to the USB-6216 as a trigger signal.

 

Message 5 of 14

Are you SURE both the trigger signal and the signal you are acquiring are exactly 1 kHz?

 

An oscilloscope set up for triggering doesn't behave exactly the same way as a DAQ device set for Continuous Sampling with a digital start trigger. Note that I'm not an expert on the higher-level DAQmx functions, as I tend to need only the simpler DAQmx Read functionality, but I'm passing along my understanding of how the DAQmx drivers work.

 

An o-scope set to trigger will basically re-trigger on each and every screen, so the signal will look fixed on that screen. It also may not show you all the data samples it could possibly capture: if your raw signal is a little faster or slower than 1 kHz, the extra data may show up at the end of the screen, and other data may be discarded until the next trigger.

 

Instead of doing Continuous Sampling, try Finite Sampling.  See if your screenshots look different.

 

Also, go back to Continuous Sampling, but collect a longer interval of time: rather than 400 samples (0.001 seconds), try collecting 8 or 9 seconds' worth of samples and run an FFT on that. You should see a peak at around 1 kHz, but you may find it is slightly off.
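If it helps, here is a small sketch of that frequency check (assuming the samples have already been read into a NumPy array named data at 400 kS/s; nothing here comes from the attached VI):

import numpy as np

fs = 400_000.0                                   # sample rate used in the thread
window = np.hanning(len(data))                   # reduce spectral leakage
spectrum = np.abs(np.fft.rfft(data * window))
freqs = np.fft.rfftfreq(len(data), d=1.0 / fs)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin
# Frequency resolution is 1/T, so 8 s of data resolves about 0.125 Hz;
# a much longer capture (or peak interpolation) is needed to see a few-mHz offset.
print(f"dominant frequency ~ {peak_hz:.3f} Hz")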

 

Message 6 of 14

I suspect this is just a typical case of clock tolerances between your external function generator's notion of time and your DAQ device's notion.

 

A fairly typical clock spec on many of NI's DAQ devices is an accuracy of about 50 parts per million. Your function generator will have some kind of spec too; I can't say what it might be. 50 ppm is not bad, but timing error *will* accumulate in a continuous task.

 

In your case, with 400k samples/sec, you're prone to as much as 50 samples of "drift" per 2.5 seconds. A full waveform cycle is 400 samples, so you could drift a full cycle every 20 seconds.
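For reference, the arithmetic behind those figures, worked out explicitly (just a back-of-the-envelope check of the numbers quoted above, in Python):

fs = 400_000                       # samples per second
ppm = 50e-6                        # 50 parts-per-million timebase tolerance
samples_per_cycle = fs / 1_000     # 1 kHz signal -> 400 samples per cycle

drift_per_second = fs * ppm                      # up to 20 samples of drift each second
seconds_per_full_cycle = samples_per_cycle / drift_per_second
print(drift_per_second, seconds_per_full_cycle)  # 20.0 samples/s, 20.0 s per full cycle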

 

You observe only half that allowed drift rate, so I think clock disagreement is the likely cause.  (Remember, the other equipment has a spec too, both sides have some degree of inaccuracy, and your testing only reveals how well they agree with one another.)

 

The suggestion from RavensFan to switch to finite sampling and retriggering will prevent this clock disagreement from *accumulating*, effectively eliminating any drifting effect over the 1 msec capture window you're looking at.  

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 7 of 14
Solution
Accepted by Ulas

A couple of comments. First, you shouldn't need the buffer-allocation VI; DAQmx should handle that for you.

 

Second, the previous comments are right: the triggering is NOT the same as an oscilloscope's, not even close. An oscilloscope will trigger once for every screen and will maintain phase indefinitely. DAQmx will trigger ONCE to start the data acquisition and will maintain phase only relative to that first trigger. Your drift is very, very small: your peak is at about 0.00025 s at t = 0 and moves to 0 by t = 30 s. A drift of 0.25 ms over 30 seconds is not out of the realm of possibility, and your oscilloscope would not be able to capture that (without a very large buffer, which you may have, but I somewhat doubt it).

 

Your signal is nominally 1000 Hz, or a 1 ms period, so over 30 seconds you should see 30 × 1000 = 30,000 periods. You actually see 30,000 periods over 30 − 0.00025 = 29.99975 seconds, for a frequency of 30000 / 29.99975 = 1000.0083 Hz. The error is (1000.0083 − 1000) / 1000 ≈ 0.0008%, or about 8 parts per million. You didn't mention which DAQ you have, but you're sampling at 400 kS/s, so it's at least a decent-grade card. Picking a random X Series card, I see that its clock is accurate to 50 ppm of its 100 MHz base. You can do the math to convert, but this is a VERY small deviation.
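The same numbers in a couple of lines, as a sanity check (values read off the posted screenshots, nothing new):

periods = 30 * 1_000                     # nominal 1 kHz over the 30 s of screenshots
elapsed = 30 - 0.00025                   # the peak arrived 0.25 ms early
f_actual = periods / elapsed             # the frequency that drift implies
error_ppm = (f_actual - 1_000) / 1_000 * 1e6
print(round(f_actual, 4), round(error_ppm, 1))   # 1000.0083 Hz, about 8.3 ppm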

 

Edit: I just saw in your picture that you're using the USB-6216, which also has a 50 ppm base clock accuracy.

 

The issue is that, once your DAQ triggers that one single time, it keeps acquiring data at a constant rate of 400 kS/s (± 50 ppm) indefinitely. Any teeny-tiny phase error adds up over time.

 

When you use an oscilloscope, you trigger on EVERY pulse. Instead of accumulating error over 30-40 seconds, you accumulate error over 1 ms, which is negligible; the timebase is reset every time the oscilloscope triggers. Likewise, your oscilloscope's frequency readout is not fine enough to distinguish 1000.0083 Hz from 1000 Hz. I know the screen said 1000 Hz, but it can't resolve that over the duration of a few signal periods. If you're using a 100 MHz oscilloscope, one period of a 1000 Hz signal spans 100,000 ticks. Your resolution is 1 tick, so suppose you actually measure 99,999 ticks: that works out to 100,000,000 / 99,999 = 1000.01 Hz. That 0.01 Hz resolution is coarser than the 0.0083 Hz offset the DAQ revealed.
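The counting-resolution argument worked out numerically (assuming the hypothetical 100 MHz scope timebase used in the example above):

timebase = 100e6                          # ticks per second (assumed scope timebase)
ticks_per_period = timebase / 1_000       # 100,000 ticks across one 1 kHz period
f_one_tick_short = timebase / (ticks_per_period - 1)
print(round(f_one_tick_short, 3))         # ~1000.01 Hz: one tick of resolution is
                                          # already coarser than the 0.0083 Hz offset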

 

You're dealing with SUCH small time variations that they are only visible over (relatively speaking) VERY long periods. Another way to look at it: you can only detect this change over 30 seconds, i.e. 12 million samples.

 

 

 

So where does that leave you? Well, it depends on what you're actually trying to do. In a system like this you really need a master shared clock, not just a shared start trigger, or you'll see a lot of drift.

If it were me, I'd just use the 6216 as your function generator: set up a counter output task to run at 1 kHz and use it as your start trigger. Since both use the same base clock, you'll see no drift.

Another way to do it would be to set up a retriggerable input task, which would let you operate identically to your oscilloscope. However, I don't think M Series devices (the 62xx series) support retriggerable AI, so you'd need to set up a retriggerable counter task as the sample clock. For a newbie that's pretty tricky, so I'd just fall back to using the card as the function generator itself.
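If it helps to see the shape of the first suggestion outside of LabVIEW, here is a rough sketch using NI's nidaqmx Python API (device, channel and terminal names are placeholders; the same structure maps onto the DAQmx Create Virtual Channel, Timing and Trigger VIs). It only illustrates the clock-sharing idea; driving the QCL ramp is a separate matter.

import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

co = nidaqmx.Task()   # counter output: on-board 1 kHz "function generator"
ai = nidaqmx.Task()   # analog input: the detector signal
try:
    # Both tasks are derived from the 6216's base clock, so they cannot
    # drift relative to each other.
    co.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=1000.0, duty_cycle=0.5)
    co.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)

    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai.timing.cfg_samp_clk_timing(rate=400_000,
                                  sample_mode=AcquisitionType.CONTINUOUS,
                                  samps_per_chan=400_000)
    # Start the acquisition off the counter's internal output terminal.
    ai.triggers.start_trigger.cfg_dig_edge_start_trig(
        "/Dev1/Ctr0InternalOutput", trigger_edge=Edge.RISING)

    ai.start()        # armed, waiting for the first counter edge
    co.start()
    data = ai.read(number_of_samples_per_channel=400)
finally:
    co.close()
    ai.close()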

 

Message 8 of 14

Hi, thank you for your answer and explanation. It made the subject a lot clearer.

 

I have implemented the retriggering method by putting a while loop around the "preamble" of the DAQmx tasks. See the PNG/VI files for more information. However, doing this has significantly reduced the speed at which LabVIEW updates the data.

 

I would like to ask whether there are ways to reduce the time it takes to complete a given number of averaging iterations.

 

I have tried looping just the DAQmx Trigger VI, but that didn't work.

 

Message 9 of 14

Hi Ulas,

 


@Ulas wrote:

See the png/vi file for more information. However doing this has significantly reduced the speed in which labview updates the data. 


Well, that VI is overly complicated…

 

Why do you need two case structures handling the same condition?

Why are there so many front panel elements without a label? NEVER delete the labels, just hide them on the front panel!

There are lots of Rube-Goldberg constructs: you really should remove them…

 

Creating and deleting DAQmx tasks in loops will take some time.

Writing/reading files in loops will take some time…
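A rough sketch of that "configure once, loop only the read" structure, again using NI's nidaqmx Python API as a stand-in for the corresponding DAQmx VIs (all names are placeholders):

import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

with nidaqmx.Task() as task:
    # Task creation and configuration happen exactly once, outside the loop.
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(rate=400_000,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=400)
    task.triggers.start_trigger.cfg_dig_edge_start_trig(
        "/Dev1/PFI0", trigger_edge=Edge.RISING)

    records = []
    for _ in range(100):                  # averaging iterations
        task.start()                      # re-arms the trigger without recreating the task
        records.append(task.read(number_of_samples_per_channel=400))
        task.stop()
    # Averaging, display and file writes belong outside this tight loop.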

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 10 of 14