1. When using "Ticks" for units, minVal will need to be >= 2. DAQmx doesn't support measurements of 0 or 1 "Tick" worth of time.
In general, the minVal and maxVal settings are mainly useful for people measuring in engineering units like seconds. They let DAQmx do the dirty work of correlating the programmer's needed measurement range with the board's available timebases and making a sensible automatic selection. For folks like you measuring in Ticks, DAQmx isn't gonna do any of that thinking for you anyway, so you just have to give it plausible values.
2. Thus, maxVal merely needs to be a valid 32-bit integer.
3. Yes, you can explicitly configure the task to use the 100 MHz timebase after creating it, presumably with the function you mentioned, "DAQmxSetCICtrTimebaseSrc". (I do all my programming in LabVIEW and don't know the syntax of the text-based driver API.)
I believe that if you don't explicitly choose a timebase for a task using Ticks as units, the board will use its default timebase. I know there's an API function in LabVIEW to query the timebase after creating a task; perhaps you have one available too?
-Kevin P
CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).