
Measuring Time Between Changes in Digital Inputs

Hello,

 

I've seen a couple of similar posts (e.g., http://forums.ni.com/t5/LabVIEW/Measuring-time-between-digital-pulses/m-p/1056881/highlight/true#M46...), but they're from quite a few years ago, so I thought I'd be better off starting a new post. I apologize in advance if I'm duplicating anything.

 

I have an electric DC motor that, when commanded, rotates a fixed number of degrees between two limit switches. When one limit switch is reached, power to the motor is removed and a status signal goes high. When the motor is commanded to rotate back to the previous state, the status of the first limit switch goes low, and the status of the second goes high once the motor reaches that point. I need to measure and record the time between the fall of one status and the rise of the other. The status signals are 5 V signals wired to digital inputs on a PXIe-6366. The time will likely be on the order of 0.5 s, and I would like as much resolution as I can reasonably get.

 

As the block diagram image below shows, the digital inputs are constantly being monitored. When a change is detected, the "Get Date/Time In Seconds" VI is called to get the current time; the same happens when the state of the other status changes. When the status is false, the loop simply passes the time value around via the shift register. Subtracting those two times should give me the time it took the motor to rotate between the two states.
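In text form, the logic amounts to roughly the following software-timed polling loop (a minimal Python sketch of the diagram, not G; read_status_lines() is a hypothetical stand-in for an on-demand DAQmx digital read):

```python
import time

def read_status_lines():
    """Hypothetical stand-in for an on-demand DAQmx digital read of the
    two limit-switch status lines; returns (status1, status2) booleans."""
    raise NotImplementedError

t_fall = None
prev1 = prev2 = None
while True:
    s1, s2 = read_status_lines()
    if prev1 is not None:
        # Falling edge on the first status: motor has started moving.
        if prev1 and not s1:
            t_fall = time.perf_counter()
        # Rising edge on the second status: motor has arrived.
        if not prev2 and s2 and t_fall is not None:
            print(f"Travel time: {time.perf_counter() - t_fall:.4f} s")
            t_fall = None
    prev1, prev2 = s1, s2
```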

 

I've tried simply looking for a state change, as well as using the "Boolean Crossing PtByPt" VI, which I think should serve the same purpose. I never seem to get reasonable answers unless I incorporate a "Wait (ms)" VI in the while loop. Even then, I still get sporadic results: reasonable answers maybe 90% of the time, with the occasional curveball. Any help would be greatly appreciated, as would any advice on controlling loop timing.

 

Sorry the block diagram is a little messy.

 

Thanks!

 

Timing Between Changes in Digital Inputs.png

Message 1 of 5

By far the simplest way would be to use a counter on that card. You can create a "Two Edge Separation" task and DAQmx does all of the work for you.
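For reference, here is roughly what that looks like outside of G, using the nidaqmx Python package (a minimal sketch; the device name "PXI1Slot2" and the PFI0/PFI1 routing are assumptions, so check NI MAX for your actual names):

```python
import nidaqmx
from nidaqmx.constants import Edge, TimeUnits

with nidaqmx.Task() as task:
    # One counter input channel configured for two-edge separation:
    # the count starts on the falling edge of the first signal and
    # stops on the rising edge of the second.
    chan = task.ci_channels.add_ci_two_edge_sep_chan(
        "PXI1Slot2/ctr0",
        min_val=0.001,           # expected minimum separation, seconds
        max_val=2.0,             # expected maximum separation, seconds
        units=TimeUnits.SECONDS,
        first_edge=Edge.FALLING,
        second_edge=Edge.RISING,
    )
    # Route the two status lines to the counter's first/second edge inputs.
    chan.ci_two_edge_sep_first_term = "/PXI1Slot2/PFI0"
    chan.ci_two_edge_sep_second_term = "/PXI1Slot2/PFI1"

    task.start()
    # Blocks until both edges have occurred, then returns the separation
    # in seconds, timed by the board's internal timebase (100 MHz on an
    # X Series card like the 6366, i.e. 10 ns resolution).
    separation = task.read(timeout=10.0)
    print(f"Motor travel time: {separation:.6f} s")
```

The key point is that the edge-to-edge interval is measured entirely in hardware, so none of the Windows loop-timing issues in your original approach apply.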


Message 2 of 5

Thanks so much for the quick response!  I'll try that ASAP.

 

Out of curiosity, is there anything inherently wrong with the way I tried it? The logic seems fairly simple, so I'm perplexed as to why it doesn't seem to work. I understand that omitting the "Wait (ms)" VI lets the loop monopolize a CPU core, but that should just give me better timing resolution, correct?

Message 3 of 5

gfinn503 wrote: "...but that should just give me better timing resolution, correct?"

You cannot really count on that, since Windows and precise timing do not go together. Windows can take your process off the CPU for seconds at a time whenever it decides to (virus scan, clock update, network traffic, some other unknown process). I'm guessing that is the "curveball" you were referring to. If you truly need it as accurate as possible, use the counter.
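If you want to see that jitter for yourself, timestamp a nominally fixed-period loop and look at the spread (a standalone Python sketch, no hardware involved):

```python
import time

# Ask for a 1 ms period and record how long each iteration actually took.
# On a desktop OS the spread is typically several ms, with occasional
# outliers of tens of ms or more when the process gets preempted.
intervals = []
last = time.perf_counter()
for _ in range(1000):
    time.sleep(0.001)
    now = time.perf_counter()
    intervals.append(now - last)
    last = now

print(f"min {min(intervals) * 1e3:.3f} ms, "
      f"max {max(intervals) * 1e3:.3f} ms, "
      f"mean {sum(intervals) / len(intervals) * 1e3:.3f} ms")
```

Those occasional large outliers are exactly the kind of error a software-timed loop bakes into your measurement, regardless of how simple the logic is.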


Message 4 of 5

Got it, thanks again.  I'll give the counter a try.

Message 5 of 5