Data Acquisition Idea Exchange

Kevin_Price

Allow "Incomplete Sample Detection" to be disabled for counters

Status: New

The term "Incomplete Sample Detection" comes from DAQmx Help.  It affects buffered time measurement tasks on X-series boards, the 661x counter/timers, and many 91xx series cDAQ chassis.  It is meant to be a feature, but it can also be a real obstacle.

 

How the feature works ideally: Suppose you want to configure a counter task to measure buffered periods of a 1-channel encoder.  You use implicit timing because the signal being measured *is* the sample clock.  The 1st "sample clock" occurs on the 1st encoder edge after task start, but the time period it measures won't represent a complete encoder interval.  Reporting this 1st sample could be misleading as it measures the arbitrary time from the software call to start the task until the next encoder edge.

   On newer hardware with the "Incomplete Sample Detection" feature, this meaningless 1st sample is discarded by DAQmx.  On older hardware, this 1st sample was returned to the app, and it was up to the app programmer to deal with it.
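
For reference, in DAQmx C API terms the scenario is roughly the following. This is only a sketch: "Dev1/ctr0" is a placeholder counter, the min/max period values are arbitrary, and error checking is omitted.

/* Buffered period measurement with implicit timing: the measured signal
   itself acts as the sample clock.  "Dev1/ctr0" is a placeholder. */
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    float64    periods[1000];
    int32      numRead = 0;

    DAQmxCreateTask("", &task);
    /* Each buffered sample is the time between consecutive rising edges
       of the encoder signal at the counter's input terminal. */
    DAQmxCreateCIPeriodChan(task, "Dev1/ctr0", "", 0.000001, 10.0,
                            DAQmx_Val_Seconds, DAQmx_Val_Rising,
                            DAQmx_Val_LowFreq1Ctr, 0.0, 0, NULL);
    /* Implicit timing: no separate sample clock is specified. */
    DAQmxCfgImplicitTiming(task, DAQmx_Val_ContSamps, 1000);
    DAQmxStartTask(task);

    /* On DAQ-STC3 hardware the incomplete first interval never shows up
       here; on older hardware it arrives as the first element. */
    DAQmxReadCounterF64(task, 1000, 10.0, periods, 1000, &numRead, NULL);

    DAQmxClearTask(task);
    return 0;
}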

 

Problem 1: Now suppose I'm also using this same encoder signal as an external sample clock for an AI task that I want to sync with my period measurement task.  Since DAQmx is going to discard the counter sample that came from the 1st edge, my first 5 samples will correspond to edges 2-6.  Over on the AI task, my first 5 samples will correspond to edges 1-5.

   My efforts to sync my tasks are now thwarted because their data streams start out misaligned.  The problem and workaround I'm left with are at least as troublesome as the one that was "solved" by this feature.
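
To make the misalignment concrete, here's a rough sketch of the companion AI task in the same C API terms, clocked off the same encoder edges. "/Dev1/PFI0" and "Dev1/ai0" are placeholder names and error checking is again omitted.

/* AI task using the encoder signal as an external sample clock. */
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle aiTask = 0;
    float64    volts[1000];
    int32      numRead = 0;

    DAQmxCreateTask("", &aiTask);
    DAQmxCreateAIVoltageChan(aiTask, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    /* The encoder is the sample clock: AI sample k is taken on encoder
       edge k, starting from edge 1.  The rate is just an expected max. */
    DAQmxCfgSampClkTiming(aiTask, "/Dev1/PFI0", 1000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 1000);
    DAQmxStartTask(aiTask);
    DAQmxReadAnalogF64(aiTask, 1000, 10.0, DAQmx_Val_GroupByChannel,
                       volts, 1000, &numRead, NULL);
    /* Meanwhile the counter task's first returned sample came from edge 2,
       so AI sample k lines up with counter sample k-1.  That one-sample
       offset is the misalignment described above. */
    DAQmxClearTask(aiTask);
    return 0;
}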

 

Problem 2:  Suppose I had a system where my period measurement task also had an arm-start trigger, and I depended on a cumulative sum of periods to be my master time for the entire system.  In this case, the 1st sample is the time from the arm-start trigger to the 1st encoder edge, and it is *entirely* meaningful.  On newer hardware, DAQmx will discard it and I'll have *no way* to know my timing relative to this trigger. 

   Older boards (M-series, 660x counter/timers) could handle this situation just fine. On newer boards, I'm stuck with a much bigger problem than the one that the feature was meant to solve.

 

So can we please have a DAQmx property that allows us to turn this "feature" OFF?  I understand that it'd have to be ON by default so as not to break existing code.

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
4 Comments
Kevin_Price
Proven Zealot

After a follow-up discussion with an NI engineer, it appears there are workarounds to cover at least *many* applications, and possibly most or all.  They all derive from a low-level understanding of counter behavior as being driven by digital signal edges (and occasionally levels).  There can often be an equivalency between different counter measurement modes.

 

(Honestly, I was a little embarrassed to have this equivalency pointed out, because it's something I've advocated myself in the past and somehow managed not to consider for this specific scenario.  I often referred to it as thinking "inside-out" about counters.  I found several such threads: #1, #2, #3, #4.)

 

Anyway, the basic version of the workaround is to configure edge counting instead of period measurement.  An internal timebase (perhaps the maximum 100 MHz) would be the edge-counting signal that increments the count register, while the signal to be measured would be defined as the sample clock source.  The resulting measurement would be a cumulative # of timebase edges seen up until the Nth edge of the signal of interest.  A little math can turn this into cumulative timestamps or interval delta times.
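
A rough sketch of that basic workaround, again in C API terms. Terminal names like "/Dev1/100MHzTimebase" and "/Dev1/PFI0" are placeholders that depend on what routes a given device offers, and error checking is omitted.

/* "Inside-out" workaround: count internal timebase edges, and let the
   signal being measured latch the count as the sample clock. */
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    uInt32     counts[1000];
    int32      numRead = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateCICountEdgesChan(task, "Dev1/ctr0", "", DAQmx_Val_Rising,
                                0, DAQmx_Val_CountUp);
    /* The internal timebase increments the count register... */
    DAQmxSetCICountEdgesTerm(task, "", "/Dev1/100MHzTimebase");
    /* ...while the signal to be measured buffers ("samples") the count. */
    DAQmxCfgSampClkTiming(task, "/Dev1/PFI0", 1000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 1000);
    DAQmxStartTask(task);
    DAQmxReadCounterU32(task, 1000, 10.0, counts, 1000, &numRead, NULL);

    /* counts[k] is the cumulative # of 100 MHz ticks up to the (k+1)th
       measured edge: divide by 100e6 for cumulative timestamps, and take
       successive differences for interval times. */
    DAQmxClearTask(task);
    return 0;
}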

 

The more advanced workaround was to also configure the edge-counting task for hardware count reset.  This could behave more like period measurement in that the count would reset to 0 on each "sample", but there wasn't 100% confidence about whether the circuit timing would always reset the count *after* the sample was latched rather than before.
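
In code, that variant would add something like the following to the sketch above, before DAQmxStartTask().  I haven't verified these exact count-reset property functions or the latch-vs-reset ordering on real hardware, so treat this as a sketch to check against the DAQmx C reference.

/* Hardware count reset additions (unverified sketch): reset the count
   register on each edge of the measured signal so each buffered sample
   approximates one interval's worth of timebase ticks rather than a
   cumulative total. */
DAQmxSetCICountEdgesCountResetEnable(task, "", 1);
DAQmxSetCICountEdgesCountResetTerm(task, "", "/Dev1/PFI0");
DAQmxSetCICountEdgesCountResetActiveEdge(task, "", DAQmx_Val_Rising);
DAQmxSetCICountEdgesCountResetResetCount(task, "", 0);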

 

These options certainly help a lot, but there would still be no similar workaround for semi-period measurement on X-series and other DAQ-STC3-based devices.

 

 

-Kevin P

chrisjbillington
Member

This has been biting me. I maintain software that can be used with NI DAQmx devices generally (the same software dihm was talking about in the forum post that prompted you to post this), and so far we have hard-coded various capabilities of devices in order to work with multiple models. However, we're increasingly moving toward introspecting capabilities in order to support a wider range of devices without code duplication.

 

For my application, I don't need the initial incomplete sample. However, I need to know whether it is returned, so that I can have the code discard it without discarding the first legitimate sample. Simply being able to introspect whether the device has incomplete sample detection would be sufficient for this. But I don't see an NI-DAQmx API call to discover this. Even looking at the list of product categories returned by the C API, I'm not sure it's enough to narrow it down.  
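
The closest I've found is querying the product category and product type and keying off those, which is exactly the sort of per-model hard-coding I'm trying to get away from. A rough C sketch, where "Dev1" is a placeholder device name:

/* Stopgap: infer whether a device probably discards the first sample
   from its product category / product type, since no DAQmx property
   exposes incomplete sample detection directly. */
#include <NIDAQmx.h>
#include <stdio.h>

int main(void)
{
    int32 category = 0;
    char  productType[256] = "";

    DAQmxGetDevProductCategory("Dev1", &category);
    DAQmxGetDevProductType("Dev1", productType, sizeof(productType));

    /* e.g. X Series (DAQ-STC3) devices discard the incomplete first sample,
       M Series (DAQ-STC2) devices return it - but the category alone doesn't
       settle the question for cDAQ modules or the 661x timers, so a
       model-by-model table would still be needed. */
    printf("category=%d, product type=%s\n", (int)category, productType);
    return 0;
}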

 

So +1 for either being able to disable incomplete sample detection, or even just being able to discover whether it is present.

 

chrisjbillington
Member

 

 

Quoting myself from a discussion thread related to this:

 

 

As I've continued to work on making my code work on as wide a range of devices as possible, I've realised that being able to turn off incomplete sample detection is really what would be ideal for my use case. As it stands, the minimum pulse time detectable by a semiperiod measurement varies by device (depending on available timebases), so I now need to get that information and feed it back to the code that generates these pulses, to ensure they're not too short. The pulses are far apart, though, so if I were doing a full period measurement, I presume it wouldn't matter how long the high time of these pulses was, as the time between rising edges would be long enough for any device to deal with.

 

But I need to do something in software immediately after the very first pulse, and thus need to do a semiperiod measurement if I don't want the whole pulse to be discarded by incomplete sample detection - with incomplete sample detection I at least still get the falling edge of the pulse, which is good enough for me. But having to do a semiperiod measurement imposes device-specific limits on how short the pulse can be, which just makes my code more complex.



So being able to turn it off is really what would be optimal. I imagine this would be a fairly modest change to the DAQmx API, as I assume the discarding of samples occurs in software. Perhaps a gettable/settable channel property saying whether a given task and channel is configured with incomplete sample detection or not:

 

 

 

DAQmxGetCIIncompleteSampleDetection(TaskHandle taskHandle, const char channel[], bool32 *data)
DAQmxSetCIIncompleteSampleDetection(TaskHandle taskHandle, const char channel[], bool32 data)
DAQmxResetCIIncompleteSampleDetection(TaskHandle taskHandle, const char channel[])

 

 

The value would by default be False on devices that don't presently support incomplete sample detection, and True on those that do. One would get an error trying to set it to True on devices that don't support it - unless of course the 'support' can be added in software, in which case users could have incomplete sample detection on older devices as well if they want (but not by default, for backward compatibility). And importantly, one could set it to False on the newer devices.
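
Usage in my semiperiod case would then look something like this - hypothetical, of course, since the proposed property doesn't exist today; everything else is standard DAQmx, "Dev1/ctr0" is a placeholder, and error checking is omitted.

/* Hypothetical usage of the proposed property on a semiperiod task.
   DAQmxSetCIIncompleteSampleDetection does not currently exist. */
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateCISemiPeriodChan(task, "Dev1/ctr0", "", 0.000001, 10.0,
                                DAQmx_Val_Seconds, NULL);
    DAQmxCfgImplicitTiming(task, DAQmx_Val_ContSamps, 1000);
    /* Proposed: keep the partial first sample instead of discarding it,
       regardless of whether the device is DAQ-STC2 or DAQ-STC3 based. */
    DAQmxSetCIIncompleteSampleDetection(task, "", 0);
    DAQmxStartTask(task);
    /* ... read semiperiods, including the very first high/low time ... */
    DAQmxClearTask(task);
    return 0;
}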

Kevin_Price
Proven Zealot

I've given this more thought as a result of the referenced discussion thread.  I want to amend and clarify my previous comment related to "equivalency" or "inside-out" ways of thinking about counters.  It falls farther short of a universal workaround than I was thinking at the time I wrote the earlier comment.

 

1. Sometimes the problem is the one from the linked thread where the real issue is *knowledge* of whether this feature is active on a given device.  (Of course the ability to control it is better b/c it necessarily means knowledge is covered as well.)

 

2. Sometimes the attempt to use "equivalency" will lead to much less accurate or precise measurements due to quantization.

    Example: suppose you want to buffer period measurements of a slightly varying ~1 Hz pulse.  Most boards will use an internal timebase of at least 20 MHz as the counter "source" signal.  So increments happen at 20 MHz, count values get buffered at ~1 Hz.  The intervals are measured with a precision of 1 part in about 20 million.

   Let's see what happens when we approach it with "equivalency".  The ~1 Hz pulse causes increments, while some other pulse source causes buffering.  At least some boards will allow usage of an internal 100 kHz clock.  In this mode, every second there will be ~100000 samples buffered up.  Within those 100000 samples there will likely be just 1 count increment.   So you'll get (x) samples of count (N) and (100000-x) samples of count (N+1).  However, it's also possible that there will be either 0 or 2 count increments within that interval.  A difference of 1 count (over that particular interval of time) leads to a calculated period that's either infinite or 1/2 the true average.

    Now it's true that post-processing can be used to identify the indices where count increments occur, and the difference in indices starts to be more equivalent to the original period measurement method (a rough sketch of that post-processing follows after point 3 below).  But the cost is that under equivalency, better measurement resolution demands a higher sample rate, larger buffer, and more CPU for post-processing.

 

3. As briefly mentioned before, there is no version of "equivalency" that covers semi-period measurements.  (I expect this is probably also true of the new-ish "pulse" measurements though I haven't tried it to verify.)
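
For point 2, the post-processing I have in mind is roughly the following - plain C for illustration, assuming counts[] came from an edge-counting task sampled at fs Hz (100 kHz in the example above):

/* Sketch: find the sample indices where the count incremented and turn
   index differences back into period estimates, quantized to 1/fs. */
#include <stddef.h>

static size_t increments_to_periods(const unsigned int *counts, size_t n,
                                    double fs, double *periods, size_t maxOut)
{
    size_t nOut = 0;
    size_t lastIdx = 0;
    int    haveLast = 0;
    size_t i;

    for (i = 1; i < n; i++) {
        if (counts[i] != counts[i - 1]) {      /* an increment landed here */
            if (haveLast && nOut < maxOut)
                periods[nOut++] = (double)(i - lastIdx) / fs;
            lastIdx = i;
            haveLast = 1;
        }
    }
    return nOut;   /* number of complete periods recovered */
}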

 

 

-Kevin P

 
