Multifunction DAQ


PCIe-6320 Autostop a task

Help,

 

I have a PCIe-6320 and I am trying to create a task that will auto-stop after a specified time in microseconds.  I see the card has 10 ns resolution, but all the timing functions I see are about finding the time between analog signals.  Can anyone help me?

 

Claude

Message 1 of 7

More info is needed.

 

10 nsec resolution would only apply to time measurements with one of the counters.  The "regular" analog and digital subsystems have a coarser resolution -- look in the specs for the maximum sample rate.

 

You would generally use a Finite Sampling task to accomplish auto-stop.  Once you make the software call to start it, the signal and sampling behavior will be timed very repeatably to fractions of a microsecond.  However, the software call itself and the ability of software to notice the end of a task, or to make decisions about signals won't have repeatability anywhere close to microseconds.

 

Please describe what you need to do in more detail.

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 7

I am creating a task with a specific frequency and duty cycle, and I want 1 ms of timing between DAQmxStartTask() and DAQmxStopTask().  But Windows has ~1 ms resolution for its timers (we found the pulse was actually 1.3 ms).  So when we saw the 6320 had a clock with 10 ns resolution, we were hoping we could tell it to turn the task on for 1 ms and then turn it off.  Right now all the functions I'm seeing regarding timing are for finding the time between inputs, not outputs.

 

Does this help clarify my situation? 

 

Claude

Message 3 of 7

Will this be a pulse train that lasts for 1 msec?   You can do that by generating a finite pulse train with a counter output task.  If you configured for, say, 100 samples at 100 kHz, you'd generate exactly 1 msec worth of pulses, with 10 nanosec (less actually) timing precision.  If you are generating at, say, 123.456 kHz, you won't be able to generate for exactly 1 msec because you can't generate the non-integer 123.456 pulses that would require.  You'd instead need to generate an integer # of full pulses like 123 or 124 which would then take slightly less or more than 1 msec.  It'll still be extremely repeatable though.

 

What you *can't* expect is for anything about your software to have anything close to such timing resolution or repeatability.  The data and signals can be timed precisely.  Your algorithms, decision making, and control responses cannot.  They're subject to OS timing vagueness, commonly in the single digits of msec but occasionally much more.

 

 

-Kevin P

Message 4 of 7

It sounds like a pulse train is what I want.  Right now I have a fixed duty cycle and frequency (30% and 400 kHz).  I've tried using DAQmxCfgImplicitTiming() with DAQmx_Val_FiniteSamps, but it doesn't generate the number of samples (400) I specify.  I've also set DAQmxSetSampTimingType to DAQmx_Val_Implicit, but that doesn't seem to help.  When I call DAQmxStartTask(), DAQmxWaitUntilTaskDone() (5 second timeout), then DAQmxStopTask(), I get 2 pulses lasting 6 microseconds and a timeout.

 

Claude

Message 5 of 7

Can you post your code?  The config you describe sounds right, but I'm guessing something's a little wrong somewhere in the code.   Though I only do DAQ with LabVIEW, I can recognize a fair amount of the text language terminology.  I can't be the final word, but I'll at least have a look.

 

I can't think of a good reason why you'd get *some* pulses, but not the right amount, and then the task wouldn't know that it's done and have to wait for its timeout.   You don't have some kind of pause trigger set up on this task, do you?

 

 

 

-Kevin P

 

Message 6 of 7

Found my problem.  After calling DAQmxCfgImplicitTiming() I was calling DAQmxSetBufOutputBufSize(), which I didn't need to do because implicit timing sizes the buffer for me.  Not sure exactly what that was doing to the task, but I finally got an error code back from DAQmxStartTask() that told me I didn't need to be setting the buffer size.
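For anyone who lands on this thread later, here is a minimal sketch of the working sequence.  The device and counter names ("Dev1/ctr0") are placeholders for whatever your hardware shows in NI MAX; the key point is that DAQmxCfgImplicitTiming() handles the buffer itself, so there is no DAQmxSetBufOutputBufSize() call anywhere:

```c
#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    int32 err = 0;

    /* Counter-output channel: 400 kHz, 30% duty cycle, idle low */
    err = DAQmxCreateTask("", &task);
    if (!err) err = DAQmxCreateCOPulseChanFreq(task, "Dev1/ctr0", "",
                        DAQmx_Val_Hz, DAQmx_Val_Low, 0.0,
                        400000.0, 0.30);

    /* 400 pulses at 400 kHz -> exactly 1 ms of output.
       Implicit timing sizes the buffer itself; do NOT also call
       DAQmxSetBufOutputBufSize() on a finite counter task. */
    if (!err) err = DAQmxCfgImplicitTiming(task, DAQmx_Val_FiniteSamps, 400);

    if (!err) err = DAQmxStartTask(task);
    if (!err) err = DAQmxWaitUntilTaskDone(task, 5.0);  /* 5 s timeout */
    if (!err) err = DAQmxStopTask(task);

    if (err) {
        char msg[2048];
        DAQmxGetExtendedErrorInfo(msg, sizeof msg);
        fprintf(stderr, "DAQmx error %d: %s\n", (int)err, msg);
    }
    if (task) DAQmxClearTask(task);
    return err ? 1 : 0;
}
```

The hardware clocks the 1 ms of pulses with the counter's timing precision; only the moment the train *starts* is subject to Windows' millisecond-scale software jitter, which is exactly the split Kevin described above.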

 

Thank you for all your help.

 

Claude

Message 7 of 7