


Synchronizing Counter Reading with Digital output pulse

Solved!

Hey all,

I am developing a VI in which I have to synchronize the reading of a counter with a digital output pulse that is used for another purpose (I need to know the value of the counter after each falling edge of the digital output pulse). My DAQ device is a USB-6351. If anyone is familiar with how to get started in this situation, ANY help would be greatly appreciated. I am very new to LabVIEW and am currently spending all of my free time reading through manuals and help files.

 

Please let me know if you need any sort of additional information from me to understand what I am doing.

 

Message 1 of 5
Solution accepted by Dileep2628

1. Start from a shipping example.  Go to the menu "Help-->Find Examples...".   Then select "Hardware Input and Output-->DAQmx-->Counter Input."   Then pick either the Finite or Continuous flavor of "Counter -- Count Edges".  (Actually, you may as well try both and learn from their differences.)

 

2. The 'Input Terminal' is the signal whose edges will increment your counter's count value.

 

3. The 'Sample Clock Source' is the signal whose edge will sample and buffer the instantaneous count value.  This should be the digital output pulse you refer to.

 

4. On the block diagram, find the call to DAQmx Timing.  There's an unwired input called 'active edge'.  Right-click it and create a control.  Double-click the control to locate it on the front panel and place it near the other "Timing Settings".  Set it for falling-edge polarity.

 

5. Run it!
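(If it helps to see the same configuration in text form rather than as a LabVIEW diagram, here's a rough sketch of steps 2-4 using NI's nidaqmx Python package. The device and terminal names, Dev1, ctr0, PFI8, and PFI12, are placeholders I've assumed; substitute whatever your wiring actually uses.)

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

with nidaqmx.Task() as task:
    # Step 2: count edges of the signal wired to the counter's input terminal.
    ch = task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
    ch.ci_count_edges_term = "/Dev1/PFI8"  # signal being counted (assumed terminal)

    # Step 3: latch the instantaneous count into the buffer on each
    # falling edge of the digital output pulse (step 4's 'active edge').
    task.timing.cfg_samp_clk_timing(
        rate=1000.0,                  # upper-bound estimate of the pulse rate
        source="/Dev1/PFI12",         # the DO pulse line (assumed terminal)
        active_edge=Edge.FALLING,
        sample_mode=AcquisitionType.CONTINUOUS,
    )

    task.start()
    # One buffered count value per falling edge of the DO pulse.
    counts = task.read(number_of_samples_per_channel=10)
```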

 

Note: the 6351 is part of the very nice X-series family of multifunction boards.  They're really versatile; you can do a lot with them.

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 5

Thank you so much for the solution, Kevin; it worked 🙂

 

Message 3 of 5

Hey Kevin, can you tell me how to read the value of the counter with a time delay (for example, 50 ns) after each falling edge of the digital output pulse?

Message 4 of 5

Yes, there's a way, but are you sure it matters?  The counter circuitry is barely capable of registering any more counts within that extra 50 nanosec, probably only 1 if it's counting an external signal.  And if you're counting an internal timebase, you can just calculate the appropriate count offset for the amount of time delay.
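(To spell out the "just calculate the offset" option: when counting an internal timebase, the offset is simply delay × timebase rate. A quick sanity check, assuming the X-series 100 MHz timebase:)

```python
# Counts accumulated during a fixed delay when counting an internal timebase.
# X-series boards provide a 100 MHz timebase, so one tick is 10 ns.
TIMEBASE_HZ = 100_000_000

def count_offset(delay_s: float, timebase_hz: float = TIMEBASE_HZ) -> int:
    """Number of timebase ticks that elapse during delay_s."""
    return round(delay_s * timebase_hz)

# A 50 ns delay corresponds to only 5 extra ticks of the 100 MHz timebase,
# which you could simply add to each sampled count in software.
print(count_offset(50e-9))  # → 5
```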

 

But here's what you can do: configure another counter for retriggerable single pulse generation.  Set low time and initial delay to the desired delay.  Note: You'll need a minimum of 20 nanosec and can only achieve delay values at 10 nanosec increments.  High time can be similarly short if you like, but may not need to be.  You just need the pulse to complete before the next incoming falling edge you want to react to.

   So configure the trigger to be sensitive to the falling edge of the incoming digital pulse, and then use its output as the sample clock for your original counter task.  You'll also now change your original counter task to be sensitive to the *rising* edge of this new, delayed sample clock.
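(Sketched again in nidaqmx Python terms, for anyone following along in text: a second counter, ctr1, generates the retriggerable delayed pulse, and the original count-edges task is re-clocked by its rising edge. Device and terminal names are assumptions; check your device's routing.)

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

# Counter 1: retriggerable single pulse, delayed 50 ns from each
# falling edge of the DO pulse on PFI12 (assumed terminal).
delay_task = nidaqmx.Task()
delay_task.co_channels.add_co_pulse_chan_time(
    "Dev1/ctr1",
    initial_delay=50e-9,  # desired delay: 10 ns resolution, 20 ns minimum
    low_time=50e-9,       # low time = delay for re-triggered pulses
    high_time=50e-9,      # can be short; must end before the next trigger
)
delay_task.timing.cfg_implicit_timing(
    sample_mode=AcquisitionType.FINITE, samps_per_chan=1
)
delay_task.triggers.start_trigger.cfg_dig_edge_start_trig(
    "/Dev1/PFI12", trigger_edge=Edge.FALLING
)
delay_task.triggers.start_trigger.retriggerable = True

# Counter 0: the original count-edges task, now sampled on the *rising*
# edge of ctr1's delayed output.
count_task = nidaqmx.Task()
count_task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
count_task.timing.cfg_samp_clk_timing(
    rate=1000.0,
    source="/Dev1/Ctr1InternalOutput",
    active_edge=Edge.RISING,
    sample_mode=AcquisitionType.CONTINUOUS,
)
```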

 

But again, you should probably think carefully about whether there will be any real benefit from such a short delay.

 

 

-Kevin P

Message 5 of 5