
Multifunction DAQ


external clock and sample rate

Hello,

 

I need to generate a pattern with a counter: 15 fast low/high pulses, then a rest cycle lasting around 300 microseconds. I want to acquire the device's response on a digital input; the attached image shows the idea. The counter output sent to the device will also be used as the sample clock for the DI channel. How do I select the correct sample rate? When I use an external sample clock, the rate shouldn't matter and I should always read the same amount of data regardless of it. Instead, the higher the sample rate I specify, the more data I can read, which makes no sense to me...
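A toy model of what's going on (plain Python, no DAQmx calls; the 1 us pulse period is an assumed example): with an external sample clock, the hardware latches one DI sample per clock edge, so the amount of data is fixed by how many edges the counter produces. The rate passed to the timing function is only a nominal figure DAQmx uses for buffer sizing and read timeouts.

```python
# Toy model of external-clock DI timing (plain Python, no hardware).
# Assumptions: 15 pulses per burst and ~300 us rest, from the post;
# the 1 us pulse period is an illustrative guess.

def samples_per_burst(n_pulses: int) -> int:
    """One clock edge per pulse -> one DI sample per pulse, rate-independent."""
    return n_pulses

def burst_period_s(n_pulses: int, pulse_period_s: float, rest_s: float) -> float:
    """Time from the first pulse of one burst to the first pulse of the next."""
    return n_pulses * pulse_period_s + rest_s

# 15 pulses -> 15 samples per burst, whatever nominal rate you specify.
assert samples_per_burst(15) == 15

# 15 pulses at 1 us each plus a 300 us rest -> 315 us per burst; useful
# for choosing a read timeout, which is where the nominal rate matters.
period = burst_period_s(15, 1e-6, 300e-6)
```

If reads return more data at higher specified rates, that usually points at reading "all available samples" from a continuously filling buffer rather than a fixed 15 samples per burst.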

Another problem I might have: I will have to delay DI sampling by a few microseconds to await the device's response. Is there a way to do that without using another counter as the sampling source?
Message 1 of 2

1. For your DI sample clock, try using "Ctr0InternalOutput" instead of "Ctr0SampleClock".

 

2. Assuming your external device is looking for the leading edge of the pulses you generate, configure your DI task to be sensitive to the trailing edge.  Maybe you can make your pulses less square-shaped to allow more response time?
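The timing argument in point 2 can be sketched in a few lines (plain Python; the 50% and 80% duty cycles are assumed examples, not from the post). If the device reacts to the leading edge and the DI task samples on the trailing edge, the device gets the whole high time to respond; stretching the high time ("less square" pulses) widens that window without changing the pulse period:

```python
# Response window when sampling on the trailing edge (plain Python sketch).
# duty_high is the fraction of the period the pulse spends high (assumed values).

def response_window_s(period_s: float, duty_high: float) -> float:
    """Time between the leading (device-triggering) edge and the
    trailing (DI-sampling) edge of one pulse."""
    return period_s * duty_high

square = response_window_s(1e-6, 0.5)  # 50% duty: half the period to respond
skewed = response_window_s(1e-6, 0.8)  # 80% duty: most of the period to respond
assert skewed > square
```

This also answers the delay question without a second counter: sampling on the opposite edge builds the delay into the pulse shape itself.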

 

3. Your DI start trigger probably won't work, and you don't need it anyway.  As long as your DI task starts before your pulse generation task, it won't start sampling until the pulses show up.

 

4. I'd be wary of specifying the generic "OnboardClock" while defining pulses with Ticks.  If you want to use Ticks, be sure you also explicitly select a known timebase.  Or you can specify the pulses in terms of time and set units=seconds.  That will also make clear what your nominal sample rate is for your DI task.
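A small sketch of the arithmetic behind point 4 (plain Python; the 100 MHz timebase and the 0.8/0.2 us pulse are assumptions — use whatever timebase your device actually provides). Defining the pulse in seconds and converting explicitly removes any ambiguity about which timebase "OnboardClock" resolves to, and the pulse period directly gives the nominal DI sample rate:

```python
# Convert a time-defined pulse to ticks against an explicit timebase,
# and derive the nominal DI sample rate (plain Python; values are assumed).
TIMEBASE_HZ = 100e6  # e.g. a device's 100 MHz timebase (device-dependent)

def ticks(duration_s: float, timebase_hz: float = TIMEBASE_HZ) -> int:
    """Tick count for a duration against a known timebase."""
    return round(duration_s * timebase_hz)

high_s, low_s = 0.8e-6, 0.2e-6            # assumed 0.8/0.2 us pulse shape
nominal_rate_hz = 1.0 / (high_s + low_s)  # 1 us period -> 1 MHz nominal rate

assert ticks(high_s) == 80
assert ticks(low_s) == 20
```

The same numbers defined only as "80 ticks of OnboardClock" would mean something different on a device whose default timebase isn't 100 MHz, which is why pinning the timebase (or using seconds) matters.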

 

 

-Kevin P

Message 2 of 2