High Resolution Timestamping of External Clock Pulses

Solved!

I'm using an external hardware clock for analog data acquisition, and I'm having issues with interference causing extra pulses to be registered on the clock. To help troubleshoot, I'm trying to find out when those extra pulses occur so I can narrow down the source of the interference. The clock should normally have relatively evenly spaced pulses, so recording the time each pulse occurs would let me see when the extra ones are happening.

 

I've set up a program to read an analog input voltage level from a reference source using the hardware clock (so I can use this data as a reference for the time of occurrence), but the issue is that I can't figure out how to have LabVIEW add a timestamp to each acquired sample. The waveform output of the DAQmx Read block includes a timestamp, but the dt is constant and not related to the clock timing.

 

The clock is approximately 30 kHz, so I need timestamps with microsecond resolution to be able to see the difference. It doesn't have to be an absolute timestamp; the time elapsed since the last pulse, the time between pulses, or the high and low time of each pulse would all work.
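For reference, once per-pulse timestamps (or periods) are in hand, spotting the extra pulses is just a matter of differencing them. A minimal NumPy sketch with synthetic numbers (the injected 40 us pulse and the 0.75x threshold are made up for illustration):

```python
import numpy as np

# Synthetic example: a clean 30 kHz pulse train (period ~33.3 us)
# with one spurious extra pulse injected at t = 40 us.
nominal_period = 1.0 / 30e3
timestamps = np.arange(10) * nominal_period
timestamps = np.sort(np.append(timestamps, 40e-6))

# Extra pulses show up as abnormally short inter-pulse intervals.
intervals = np.diff(timestamps)
suspect = np.flatnonzero(intervals < 0.75 * nominal_period)
print(suspect)  # -> [1]  (the interval ending at the spurious pulse)
```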

 

The setup is:

cDAQ-9178 with 9215 Analog Input Module

LabVIEW 17 with DAQmx functions

Windows 7

 

Thanks in advance if anyone knows how to approach this.

Message 1 of 7

Sure, this is just the kind of thing counters are great for.

 

For a good head start on things, try out the shipping example "Counter - Read Pulse Width and Frequency (Continuous)".  Set it up for either period or frequency measurement and configure the input terminal to be the external clock signal you've been using for AI.

 

At first, you can just run the example side-by-side while you run your main AI app.  Once you get a handle on things, you can incorporate the counter code into your main app.

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 7

Thanks for the guidance! I was trying to use the CI Period channel before, but was missing the CI.Period.Term property node to assign the input terminal to the internal counter.

 

I also found out that with counter inputs you have to use the Arm Start trigger node rather than a normal trigger (to synchronize with the analog input acquisition on the other channel).

 

Now I'm having an issue with an occasional "Error -200019: ADC conversion attempted before the prior conversion was complete". I think this could be due to the data flow I set up overwhelming the buffer, but I'm not sure.

 

I tried a few different configurations, but can't get it to work reliably. It works fine the first time (even with a large sample size), but subsequent runs trigger the error. If I reduce the sample size, the later runs are fine. I attached a screenshot of my VI; are you able to see what I'm doing wrong? (Sorry, I know it's a bit messy.)

Message 3 of 7
Solution
Accepted by topic author Jenga_G

In general, an actual VI (preferably saved back to a LabVIEW version a few years old) is much better than a screencap.  That said, nothing about the code jumped out as a cause for the error you saw.

 

That particular error sounds more like a hardware-level error.  As you stated in the original post, there's "interference" causing extra (apparent) clock pulses.  The error text suggests that one of these spurious pulses constitutes an (apparent) sample rate higher than the device is capable of supporting.  The datasheet for the 9215 shows a max sample rate of (at least) 100 kHz.  Some additional specs about "conversion time" hint that single-channel tasks have a low enough "conversion time" (4.4 microsec) that higher rates *might* be supported.

 

What did your counter period data show for the run where the AI task threw the -200019 error?  I'd expect you'd see a period < 4.4 microsec somewhere in the data set.
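If it helps, that check is quick to do offline. A NumPy sketch with made-up numbers (the `periods` array here just stands in for whatever your counter task returned):

```python
import numpy as np

conversion_time = 4.4e-6   # 9215 single-channel conversion time (datasheet)
# Hypothetical period measurements (seconds): one spurious 2 us entry.
periods = np.array([33.3e-6, 33.4e-6, 2.0e-6, 31.3e-6, 33.3e-6])

# Indices of periods shorter than the ADC can convert -- any such entry
# would account for a -200019 "conversion before prior complete" error.
too_fast = np.flatnonzero(periods < conversion_time)
print(too_fast)  # -> [2]
```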

 

Assuming you do, that's just the start.  It confirms the theory of spurious clock-like pulses.  But it doesn't solve anything yet.  You still need to either prevent or suppress them.  It's worth putting significant effort into *prevention* before falling back to suppression, which is more of a work-around than a solution.

 

 

-Kevin P

Message 4 of 7

Noted, I'll just upload a VI in the future rather than a screenshot.

 

That's the conclusion I've come to as well: the VI is reporting the error when there is interference on the clock line. When it runs without an error, the data returned is fine, showing consistent periods. Unfortunately, the data isn't written when there is an error (and I don't think it could be, since, as you noted, the sample rate is too high).

 

So my idea of sampling the clock signal is kind of a dead end, since the rate is so high. I thought about recording the clock signal directly on an analog input, but with a 100 kHz sample rate and a 30 kHz signal there wouldn't be enough resolution to make out the interference spikes. For mechanical reasons, we can't slow down the clock rate.
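The arithmetic behind that bears out. A rough back-of-the-envelope in plain Python (the 2 us glitch width is an assumed figure for a microsecond-scale interference spike):

```python
sample_rate = 100e3    # 9215 max sample rate, per datasheet
clock_freq = 30e3      # nominal external clock
glitch_width = 2e-6    # assumed width of an interference spike

samples_per_period = sample_rate / clock_freq   # ~3.3 samples per clock cycle
sample_interval = 1.0 / sample_rate             # 10 us between AI samples

# A ~2 us spike fits entirely between two consecutive 10 us samples,
# so sampling the clock line directly would usually miss it.
print(samples_per_period, sample_interval > glitch_width)
```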

 

The source of the interference is an automotive spark ignition system. We're already using a spark plug with a resistor, and the spark plug wire is shielded. We are going to enclose the ignition coil in a Faraday cage too, so that should help reduce the noise. Beyond that I have some ideas for working around it in software.

Message 5 of 7

There are 3 pretty simple options for seeing the counter period data even in the presence of a DAQ error from the AI task.

 

1. Wire the period data to a front-panel indicator

2. Place a debug probe on the period data wire

3. Don't wire your error into the File write -- that way the write becomes unconditional.

 

 

-Kevin P

Message 6 of 7
I went with option 3, and it worked. I could see the short-period (2 microsecond) pulses of the clock signal in the output file. The AI data cuts off at that point, but it's enough to let me see the timing of when the false pulses are occurring. It looks like they're not all happening with the same timing, though, so I have more digging to do. Thanks for your help.
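For locating when each false pulse happened in the run, the logged periods can be cumulatively summed into edge times. A minimal NumPy sketch, assuming `periods` holds the counter data from the output file (the numbers here are made up):

```python
import numpy as np

# Hypothetical counter period data (seconds) from the output file:
# mostly ~33 us periods with two short spurious entries.
periods = np.array([33.3e-6, 33.4e-6, 2.0e-6, 31.3e-6,
                    33.3e-6, 2.1e-6, 31.2e-6, 33.4e-6])

# Elapsed time of each pulse edge relative to the first pulse.
edge_times = np.cumsum(periods)

# Times at which the false (abnormally short) pulses occurred.
false_pulse_times = edge_times[periods < 4.4e-6]
print(false_pulse_times)
```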
Message 7 of 7