Counter/Timer

Pulse or two edge separation measurement with implicit timing in ticks: What do minVal and maxVal mean?


When I do a pulse measurement or a two-edge separation measurement with implicit timing, DAQmx chooses (I believe) a suitable internal timebase for the measurement. My X Series board (a 6320), for example, has an internal 100 MHz timebase. I think I can force a specific timebase by using the DAQmxSetCICtrTimebaseSrc property setter to set the counter timebase source to "100MHzTimebase". But the docs for DAQmxCreateCIPulseChanTicks and DAQmxCreateCITwoEdgeSepChan (the latter called with its units parameter set to DAQmx_Val_Ticks) make me pass a minVal and a maxVal, and these values are supposedly used internally to determine a suitable timebase for implicit timing.

So how should I choose minVal and maxVal? They obviously depend on the timebase, so this is kind of a chicken-and-egg situation. Should I simply pass "1" or "0.1" or even "0", given that I *want* the 100 MHz timebase to be used? Or can I simply call DAQmxSetCICtrTimebaseSrc after DAQmxCreateCIPulseChanTicks?
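For concreteness, here is a stripped-down sketch of the call sequence I have in mind (no error checking; "Dev1/ctr0" and the timebase string are just placeholders for my hardware, and the parameter order should be checked against NIDAQmx.h):

```c
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    DAQmxCreateTask("", &task);

    /* Units are ticks of a timebase that DAQmx supposedly picks based on
       minVal/maxVal -- so what should I pass here? */
    DAQmxCreateCIPulseChanTicks(task, "Dev1/ctr0", "",
                                "",    /* sourceTerminal: left at default      */
                                1.0,   /* minVal: 1? 0.1? 0?                   */
                                1e6);  /* maxVal: in ticks of ... which clock? */

    /* Or is it enough to simply force the timebase afterwards, like this? */
    DAQmxSetCICtrTimebaseSrc(task, "", "100MHzTimebase");

    DAQmxClearTask(task);
    return 0;
}
```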

Message 1 of 2
Solution
Accepted by CKO42

1. When using "Ticks" for units, minVal needs to be >= 2.  DAQmx doesn't support measuring an interval of 0 or 1 ticks.

   In general, the minVal and maxVal settings are mainly useful for people who are measuring in engineering units like seconds.  They allow DAQmx to do the dirty work of correlating the programmer's needed measurement range with the board's available timebases and making a sensible automatic selection.  For folks like you measuring in Ticks, DAQmx isn't gonna do any of the thinking for you anyway so you just have to give them plausible values.

 

2. Thus, maxVal merely needs to be a valid 32-bit integer.

 

3. Yes, you can explicitly configure the task to use the 100 MHz timebase after creating the channel, presumably with the property setter you mentioned, DAQmxSetCICtrTimebaseSrc.  (I do all my programming in LabVIEW and don't know the syntax of the text-code driver API.)

   I believe that if you don't explicitly choose a timebase for a task that uses Ticks as units, the board will fall back on its default timebase.  I know there's an API call in LabVIEW to query the timebase after creating a task; perhaps you have an equivalent available too?
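   Translating that into the text API (with the caveat above that I don't use it myself, so double-check the exact signatures against NIDAQmx.h; error handling is omitted, and "Dev1/ctr0", "PulseTicks", and "/Dev1/100MHzTimebase" are placeholder names for your hardware), something along these lines should do it:

```c
#include <NIDAQmx.h>
#include <stdio.h>

int main(void)
{
    TaskHandle task = 0;
    char       timebase[256] = {0};

    DAQmxCreateTask("", &task);

    /* minVal just has to be >= 2 ticks; maxVal just has to be a plausible
       value that fits the counter's 32-bit range.  Neither one drives
       timebase selection when the units are ticks. */
    DAQmxCreateCIPulseChanTicks(task, "Dev1/ctr0", "PulseTicks",
                                "",             /* sourceTerminal: default */
                                2.0,            /* minVal, in ticks        */
                                4294967295.0);  /* maxVal, in ticks        */

    /* Explicitly force the 100 MHz timebase after creating the channel ... */
    DAQmxSetCICtrTimebaseSrc(task, "PulseTicks", "/Dev1/100MHzTimebase");

    /* ... and read the property back to confirm what the task will really use. */
    DAQmxGetCICtrTimebaseSrc(task, "PulseTicks", timebase, sizeof(timebase));
    printf("Counter timebase source: %s\n", timebase);

    /* Implicit timing: the measured signal itself paces the samples. */
    DAQmxCfgImplicitTiming(task, DAQmx_Val_ContSamps, 1000);

    DAQmxStartTask(task);
    /* ... read samples with DAQmxReadCtrTicks as usual, then stop ... */
    DAQmxClearTask(task);
    return 0;
}
```

   The property read-back is the text-API equivalent of the LabVIEW query I mentioned, so you can confirm which timebase the task will actually use before you start it.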

 

 

-Kevin P

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 2 of 2