
Two Edge Separation Timeout

Hey all,

 

I've got a program using two-edge separation as a timer for a relatively high-speed object (~100 ft/s), and it's working well, but sometimes the user trips the first sensor by accident, which throws off the next measurement since the system is then waiting on the second sensor. Are there any tricks for adding a timeout, so that if the program doesn't see the second edge within a certain time, it resets? I'm not seeing anything in the properties, but I'm hoping y'all have a workaround.

 

Thank you!

Message 1 of 10

I can't view your VI (I'm on LV 2016), but I would recommend some kind of state machine: the first edge stores the current time in a shift register, and when the second edge comes you subtract the two times. You could have a timeout, but you wouldn't strictly need one, because the first edge would always reset the stored time. For the timeout, you could use something that naturally has one, like an event structure or a queue.
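In text form, the idea looks something like this (a minimal Python sketch -- the edge_events queue is just a stand-in for whatever delivers the edge notifications in your program):

    import queue
    import time

    edge_events = queue.Queue()   # hypothetical source of "first"/"second" edge events

    WAIT_FOR_FIRST, WAIT_FOR_SECOND = 0, 1
    state = WAIT_FOR_FIRST
    t_first = None

    while True:
        try:
            # The queue read itself supplies the timeout, as suggested above.
            edge = edge_events.get(timeout=0.5)
        except queue.Empty:
            state = WAIT_FOR_FIRST          # no second edge in time -> reset
            continue
        if edge == "first":
            t_first = time.perf_counter()   # the "shift register"
            state = WAIT_FOR_SECOND
        elif edge == "second" and state == WAIT_FOR_SECOND:
            print(f"separation: {time.perf_counter() - t_first:.6f} s")
            state = WAIT_FOR_FIRST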

Message 2 of 10

This can't be done with the Two Edge Separation option of DAQmx though, correct? I would need to use a digital input and read the t0 of the waveform? My coworker said he tried this, and the time resolution wasn't high enough. We like the ticks used by the two edge separation, because we can get down to the microsecond.

Message 3 of 10

Oh I see... I didn't know you were looking for options for the DAQmx functions. If you can't find a timeout, maybe you can find a way to limit the number of samples.

Message 4 of 10

Yes, I'm using DAQmx. Sorry for not making that clear.

 

I'm already limiting it to just 2 samples, since my while loop is reading them much faster than the samples are being generated. However, with Two Edge Separation it appears a sample always spans from the first edge to the second edge, so if the first sensor gets tripped by accident, the hardware keeps timing that sample until the second sensor is tripped. It waits for the second sensor regardless of how long that takes, and that's what I'd like to find a way around, but I'm not seeing an obvious one.

I've saved my VI for LV16 and uploaded it in case you want to see what I've got.

Message 5 of 10

What device are you using?   What timing precision do you need for this measurement?

 

I'm away from LV and am also on LV 2016 like gregoryj.  There are a few ways to approach this, and they have different pros and cons.

 

 

1.  Native use of a counter and a "two edge separation" task is the most straightforward and will give you the best timing resolution (likely 10 or 12.5 nanosec).   The DAQmx Read function has a "timeout" input -- are you trying to use it but finding it doesn't work?  (There's a rough sketch of this approach after the list.)

    If somehow the timeout doesn't work for a task doing a single measurement, you could perhaps convert to a "continuous sampling" task, where a timeout will work for sure.

   An unnoticed false start would result in an erroneous measurement, but it's likely to be a very large and noticeable error, like at least an order or two of magnitude.

 

2. If your device supports hw-clocked DIO, you can capture both sensor signals in a DI task.  Post-processing would allow you to weed out accidental trips of the 1st sensor.

   Timing resolution will still be rather good (likely 1.0 or 0.1 microsec), but not as good as with a counter.

 

3. If your device supports analog input, you can capture with an AI task.  Most of the post-processing is just like it'd be for DI.

   Timing resolution will likely be worse, probably in the 2-20 microsec range.
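For illustration, here's roughly what approach #1 looks like in NI's nidaqmx Python API (a sketch only -- "Dev1/ctr0", the min/max bounds, and the 2.0 sec timeout are placeholder values, not taken from your setup):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType, Edge
    from nidaqmx.errors import DaqError

    with nidaqmx.Task() as task:
        # Two-edge separation: time from an edge on the 1st sensor's line
        # to an edge on the 2nd sensor's line.
        task.ci_channels.add_ci_two_edge_sep_chan(
            "Dev1/ctr0", min_val=0.0001, max_val=1.0,
            first_edge=Edge.RISING, second_edge=Edge.RISING)
        # Continuous sampling -- the mode where the Read timeout will work for sure.
        task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)
        task.start()
        while True:
            try:
                sep = task.read(timeout=2.0)   # seconds between the two edges
                print(f"separation: {sep} s")
            except DaqError:
                # No 2nd edge within the timeout: stop/restart so the counter
                # re-arms on the next 1st-sensor edge.
                task.stop()
                task.start()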

 

I guess the next most important thing to clarify is: does the app need to have enough smarts of its own that it can *automatically* reject any user-caused accidental false starts?   If so, you may need to settle for approach #2 or #3 above, depending on what you can do with counter timeouts in approach #1.

 

 

-Kevin P

 

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 6 of 10

I started my last reply earlier in the afternoon and some more conversation happened before I finished.  I've had a look at the code now too.

 

As to the DAQ hardware, once a "false start" initiates a two-edge separation measurement, there's not a method that will reliably guarantee that you can both reject the false start *and* catch the very next true start.  The only way to reject the false start and get the hardware to re-arm for the *next* edge of the 1st sensor is to stop and restart the task.  And that might make you *miss* a real edge.

 

So it's time to make a choice:

1. Be certain of catching *every* interval with worse timing resolution.  You'll need to implement your app with a hw-clocked DI task, assuming your devices support it.

 

2. Catch both real intervals and occasional "false start" incorrect intervals with better timing resolution.  This is pretty much the app you've already got.  You just need to notice and weed out the intervals that are clearly wrong due to user error.
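The weeding-out can be as simple as an out-of-range filter, since a false start produces an interval that's orders of magnitude too long for a ~100 ft/s object. A quick sketch in Python (the 5 millisec cutoff is a made-up number -- derive the real bound from your sensor spacing and speed):

    import numpy as np

    def keep_plausible(intervals_s, max_plausible_s=0.005):
        """Drop intervals far too long to be a real pass (i.e., false starts)."""
        intervals_s = np.asarray(intervals_s)
        return intervals_s[intervals_s <= max_plausible_s]

    # 3.7 sec is the kind of obviously-wrong interval a false start produces.
    print(keep_plausible([0.0012, 0.0011, 3.7, 0.0013]))   # -> [0.0012 0.0011 0.0013]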

 

 

-Kevin P

Message 7 of 10

When you create your DAQmx channel, you can specify a maximum and minimum value. Perhaps the Read would time out if it goes way beyond the maximum value, but maybe that's wishful thinking 🙂
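For reference, here's where those get set in the Python API (the counter name is a placeholder). As far as I know, DAQmx uses min/max mainly to pick a measurement timebase rather than to cut a long sample short, so whether the Read errors out would need testing:

    import nidaqmx

    with nidaqmx.Task() as task:
        # min_val/max_val: the expected measurement bounds, declared at
        # channel creation ("Dev1/ctr0" is a placeholder counter name).
        task.ci_channels.add_ci_two_edge_sep_chan(
            "Dev1/ctr0", min_val=0.0001, max_val=0.01)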

Message 8 of 10

Thanks Kevin.

 

#1 is what we're doing now. I'm using a low timeout on the DAQmx Read mostly to improve response time for the stop button; that timeout won't cut off a sample, though. I'm using continuous samples so that the hardware keeps measuring even if an edge pair falls between while-loop iterations. As you said, this produces a noticeably erroneous measurement if the first sensor is tripped by accident, but I'm trying not to rely on the technician to know what that wrong measurement means.

 

I may play around with #2 some more (we're using a 9437) and see if the timing resolution is good enough. The sensor resolution is, at best, 1000 Hz, so 1 microsecond should be plenty. I've also thought of a couple of physical solutions to prevent an accidental sensor trip.

 

I'm relatively new to DAQmx (and LabVIEW in general) and wanted to ask the experts here if there was a trick I wasn't aware of.

Message 9 of 10

FWIW, there's a somewhat more advanced approach (*IF* it's supported by your cDAQ system) that can pretty much give you both the "no missed intervals" of a normal DI task and the "fine timing resolution" of a counter task.

 

I can only give an outline for now:

 

You would need support for "digital change detection" as a hardware-clocking method.   You'd need to set up a counter task with special configuration to measure times when the "change detection event" gets asserted in hardware.  And you might even need to manually toggle a DI bit to "prime the pump" when you first start things up. 

   The end result is a pair of data sets: DI readings taken *only* when one of the lines makes a transition, showing the state of all DI lines at that moment, and counter readings representing the times when those transitions happened, with excellent timing resolution.
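A partial sketch of the DI half in the nidaqmx Python API (line names are placeholders, and the companion counter task that timestamps the change-detection event is device-specific, so it's omitted here):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    SENSOR_LINES = "cDAQ1Mod1/port0/line0:1"   # placeholder for the 2 sensor lines

    with nidaqmx.Task() as di_task:
        di_task.di_channels.add_di_chan(SENSOR_LINES)
        # Sample *only* when one of the lines changes state.
        di_task.timing.cfg_change_detection_timing(
            rising_edge_chan=SENSOR_LINES,
            falling_edge_chan=SENSOR_LINES,
            sample_mode=AcquisitionType.CONTINUOUS)
        di_task.start()
        # Each sample is the state of all lines at the moment a line changed.
        # (This read blocks until 10 transitions have occurred.)
        states = di_task.read(number_of_samples_per_channel=10)
        print(states)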

 

Best to try the simpler methods described earlier first though, and only come back to this more complicated method if needed.

 

 

-Kevin P

Message 10 of 10