DAQmx PFIx triggered start: How to know the exact absolute trigger time?

Solved!

Hello,

 

I stream data out of a DAQmx card (PXIe-6363).

The generation starts when the card sees a digital rising edge on PFI0.

It works great (confirmed with scope).
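(For concreteness, here is roughly what my setup looks like, sketched in Python with the nidaqmx package rather than LabVIEW; the channel "Dev1/ao0", the rate, and the waveform below are placeholders, not my real values.)

    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType, Edge

    sample_rate = 10_000.0                                # placeholder rate
    data = np.sin(np.linspace(0, 2 * np.pi, 10_000))      # placeholder waveform

    ao_task = nidaqmx.Task()
    ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")   # placeholder channel
    ao_task.timing.cfg_samp_clk_timing(
        rate=sample_rate,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=len(data),
    )
    # Generation is armed by start() but only begins on a rising edge at PFI0.
    ao_task.triggers.start_trigger.cfg_dig_edge_start_trig(
        trigger_source="/Dev1/PFI0", trigger_edge=Edge.RISING
    )
    ao_task.write(data, auto_start=False)
    ao_task.start()    # arm the task; the hardware now waits for the trigger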

 

My question is: I need to know the exact time the generation started. Currently I record a timestamp when I start the task, but the task can wait a second or two before it sees a trigger. Is there a DAQmx property that records the trigger time? I run "DAQmx Wait Until Done.vi" at the end of the generation, so maybe I should record an end-of-generation timestamp and count backwards? Is there a better way?

 

Thank you in advance for your ideas.

Message 1 of 5
Solution
Accepted by topic author FrankBrown

I don't know of a property that does that offhand (there could be one; I've just never needed it), but you could set up an analog acquisition task that returns a waveform of data, which records the start time. (I assume you don't need nanosecond accuracy or anything, right?)

Message 2 of 5

I guess I answered my own question: I just tried counting backwards after "DAQmx Wait Until Done.vi" completes, and it works.
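(Continuing the Python sketch from my first post, the back-calculation looks like this; the note about software latency is my own caveat.)

    import time

    ao_task.wait_until_done(timeout=30.0)
    t_end = time.time()                        # host clock just after the last sample
    t_start = t_end - len(data) / sample_rate  # count backwards over the generation
    # Caveat: wait_until_done() returns with some software latency after the
    # final sample clock edge, so t_start inherits that (usually small) error.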

Message 3 of 5

@BertMcMahan: Brilliant! I hadn't thought of doing that: I could set up another task triggered on the internal DAQmx "StartTrigger" of my generation task, then just look at the start-time property.
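(Sketching that second task in the same Python terms; the terminal name "/Dev1/ao/StartTrigger" follows DAQmx's naming for exported signals, and the AI channel and rate are placeholders. In LabVIEW the waveform from DAQmx Read carries t0 directly; the Python API returns raw samples, so here t0 is counted back from the first read instead.)

    import time
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    ai_task = nidaqmx.Task()
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # placeholder channel
    ai_task.timing.cfg_samp_clk_timing(
        rate=1_000.0, sample_mode=AcquisitionType.FINITE, samps_per_chan=10
    )
    # Start this task off the generation task's internal start trigger, so
    # both tasks begin on the same PFI0 edge.
    ai_task.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/ao/StartTrigger")
    ai_task.start()                    # arm before arming/triggering the AO task
    ai_task.read(number_of_samples_per_channel=10, timeout=30.0)
    t0 = time.time() - 10 / 1_000.0    # rough trigger time, counted back from the read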

Message 4 of 5

[Edit: oops, I skimmed the original post too quickly and responded as though this were about an *acquisition* task. To amend my earlier response: suppose the acquisition I talk about is triggered by the generation task's signal "/Dev1/ao/StartTrigger" (or possibly "/Dev1/do/StartTrigger"). Then the initial t0 from *that* acquisition task should give you a good timestamp for the start of your generation task.]

 

-----------------Original reply below--------------------------------------------

 

Just expanding a little on the answer BertMcMahan already gave.

 

As I recall from either an article or a discussion here on ni.com many years ago, the t0 field for an AI task does not even get determined until you do your first read from the task. Only at that point does DAQmx query the real time-of-day clock, check the number of samples in its task buffer and the waveform's dt (= 1/sample_rate) value, and work backwards to its best estimate of t0 for the initial sample. Thereafter it remembers the initial t0, and for every subsequent chunk you read it calculates that chunk's t0 from the initial one along with dt and the index of the chunk's first sample.
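(A toy version of that bookkeeping in Python; the function and variable names are mine, not anything DAQmx actually exposes.)

    import time

    def estimate_initial_t0(samples_in_buffer: int, sample_rate: float) -> float:
        # At the first read, query the time-of-day clock and work backwards
        # over the samples already sitting in the task buffer.
        dt = 1.0 / sample_rate
        return time.time() - samples_in_buffer * dt

    def chunk_t0(initial_t0: float, first_sample_index: int, sample_rate: float) -> float:
        # Later chunks never re-query the clock: their t0 comes from the
        # initial t0 plus the chunk's starting sample index times dt.
        return initial_t0 + first_sample_index / sample_rate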

 

There's never a point where any kind of hardware latching happens to line up the time-of-day query with a hardware signal like a trigger or sample clock edge. It's a software query done under the hood by DAQmx, and you're unlikely to improve on it for any conventional AI task.

 

Exceptions are less conventional tasks, such as ones with an external sample clock, where DAQmx is stuck having to believe the sample rate you claim. Even in those kinds of cases, it's only *sometimes* worth the extra work it would take to get a better t0.

 

That's some extra background, but the main point is just what BertMcMahan already said: for your task with a start trigger and an internal sample clock, just believe the t0 you get from the waveform version of DAQmx Read.

-Kevin P

Message 5 of 5