07-20-2022 05:03 PM - edited 07-20-2022 05:06 PM
Hello,
I am streaming data out of a PXIe-6363 DAQmx card.
The generation starts when the card sees a digital rising edge on PFI0.
It works great (confirmed with scope).
My question is: I need to know the exact time the generation started. Currently I record a timestamp when I start the task, but the task could wait a second or two before it sees a trigger. Is there a DAQmx property that records the actual trigger time? I am running "DAQmx Wait Until Done.vi" at the end of the generation, so maybe I should record the end-of-generation timestamp and count backwards? Is there a better way?
Thank you in advance for your ideas.
07-20-2022 05:28 PM
I don't know of a property that does that off hand (there could be one, I've just never needed it) but you could set up an analog acquisition task to get a waveform of data, which will record the start time. (I assume you don't need nanosecond accuracy or anything, right?)
07-20-2022 05:37 PM
I guess I answered my own question. Just tried counting backwards after DAQmx Wait Until Done.vi completes and it works.
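[For anyone finding this later: the back-calculation is just the end timestamp minus the generation's known duration (samples / sample rate). A minimal sketch in Python; the 1 kS/s rate and 10,000-sample buffer are illustrative numbers, not values from this thread.]

```python
from datetime import datetime, timedelta

def back_calc_start(end_timestamp, n_samples, sample_rate):
    """Estimate when a finite generation started by subtracting
    its known duration from the timestamp taken when it finished."""
    duration = timedelta(seconds=n_samples / sample_rate)
    return end_timestamp - duration

# Illustrative values: 10,000 samples at 1 kS/s = 10 s of generation
end = datetime(2022, 7, 20, 17, 30, 0)
start = back_calc_start(end, 10000, 1000.0)
```

Note this only works for a finite generation whose duration is known exactly, and any software delay between the hardware finishing and "DAQmx Wait Until Done.vi" returning adds directly to the error.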
07-20-2022 05:40 PM
@BertMcMahan: Brilliant! I hadn't thought of doing that: I could set up another task that triggers on the internal DAQmx "StartTrigger" of my generation task, then just look at its start-time property.
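[A rough text-based sketch of that triggered-second-task idea, using the nidaqmx Python package. The device and channel names ("Dev1", "ai0", "/Dev1/ao/StartTrigger") and the rate/sample count are placeholders, not values from this thread; also note the Python API does not return a LabVIEW-style waveform t0, so this sketch back-calculates the timestamp after the read instead.]

```python
from datetime import datetime, timedelta

def timestamp_generation_start(rate=1000.0, n_samples=1000):
    """Sketch: a small AI task slaved to the AO task's start trigger.

    Requires the nidaqmx package and real hardware, so the import is
    deferred into the function body. All terminal names here are
    assumptions for illustration.
    """
    import nidaqmx  # hardware-only dependency, imported lazily

    with nidaqmx.Task() as ai:
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        ai.timing.cfg_samp_clk_timing(rate, samps_per_chan=n_samples)
        # Start this task when the generation task's internal
        # start trigger fires, i.e. at the true start of generation.
        ai.triggers.start_trigger.cfg_dig_edge_start_trig(
            "/Dev1/ao/StartTrigger")
        ai.start()
        ai.read(number_of_samples_per_channel=n_samples)
        # Back-calculate t0 from the moment the finite read completed
        return datetime.now() - timedelta(seconds=n_samples / rate)
```

In LabVIEW the waveform returned by DAQmx Read already carries t0, so there you would just read that field instead of back-calculating.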
07-20-2022 08:04 PM - edited 07-20-2022 08:09 PM
[Edit: oops, I skimmed the original post too quickly and responded as though this were about an *acquisition* task. To amend my earlier response: suppose the acquisition I talk about is triggered by the generation task's signal "/Dev1/ao/StartTrigger" (or possibly "/Dev1/do/StartTrigger"). Then the initial t0 from *that* acquisition task should give you a good timestamp for the start of your generation task.]
-----------------Original reply below--------------------------------------------
Just expanding a little bit on the answer BertMcMahan already gave.
As I recall from either an article or a discussion here on ni.com many years ago, the t0 field for an AI task does not even get determined until you do your first read from the task. Only at that point does DAQmx query the real time-of-day clock, check the # of samples in its task buffer and the waveform's dt (= 1/sample_rate), and work backwards to its best estimate of t0 for the initial sample. Thereafter it remembers that initial t0, and for every subsequent chunk you read it calculates that chunk's t0 from the initial one, dt, and the index of the chunk's first sample.
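[In other words, a sketch of the bookkeeping described above, with illustrative numbers of my own choosing:]

```python
from datetime import datetime, timedelta

sample_rate = 1000.0      # S/s, illustrative only
dt = 1.0 / sample_rate

# First read: DAQmx queries the time-of-day clock, then works backwards
# over the samples already sitting in the buffer to estimate t0 of
# sample 0.
read_time = datetime(2022, 7, 20, 17, 30, 0)
samples_in_buffer = 2500
initial_t0 = read_time - timedelta(seconds=samples_in_buffer * dt)

# Later chunks: t0 is derived from the remembered initial t0, dt, and
# the index of the chunk's first sample -- no new clock query happens.
def t0_of_chunk(first_sample_index):
    return initial_t0 + timedelta(seconds=first_sample_index * dt)
```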
There's never a point where any kind of hardware latching happens to line up the time-of-day query with a hardware signal like a trigger or sample clock edge. It's a software query done under the hood by DAQmx, and you're unlikely to improve on it for any conventional AI task.
The exceptions are less conventional tasks, such as ones with an external sample clock, where DAQmx is stuck believing whatever sample rate you claim. Even in those cases, it's only *sometimes* worth the extra work to get a better t0.
That's some extra background, but the main point is just what BertMcMahan already said: for your task with a start trigger and internal sample clock, just believe the t0 you get from the waveform version of DAQmx Read.
-Kevin P