Multifunction DAQ


DAQmx Partial output of task samples

Solved!

Hi All,

 

I have a problem that I think should be easy enough to solve, but I'm currently at a loss.

 

I have 3 outputs that need to be synchronized (2 analog outputs and 1 digital output). I've managed this by using an internal clock to trigger the execution of these tasks. The problem is that I am reading data from a USB device during the waveform execution, and the duration of this read is inconsistent: it varies from 0.5 ms to 2 ms, while my waveform of, say, 525 samples is clocked out of the DAQ card within 5 ms.

 

My idea now is to load the waveform into the DAQmx buffer but output only a portion of the samples at a time: pause, read from the USB device, then resume from that point in the waveform, output another portion, read again, and so on until the waveform has been output in full.

 

For example: the total waveform is 525 samples, and I want to read the USB device after every 25 samples have been output, giving 21 reads of the device.
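In Python-ish pseudocode (NumPy used purely for illustration; the real implementation is a LabVIEW VI, and the ramp waveform here is just a stand-in), the chunking scheme would look like:

```python
import numpy as np

TOTAL_SAMPLES = 525
CHUNK_SIZE = 25

# Stand-in for the real output waveform (e.g. one X-scan ramp);
# the real data would come from the scan-pattern generator.
waveform = np.linspace(-1.0, 1.0, TOTAL_SAMPLES)

# Split into equal chunks. Each chunk would be written to the DAQmx
# buffer and clocked out, followed by one camera read, before the next.
chunks = np.split(waveform, TOTAL_SAMPLES // CHUNK_SIZE)

print(len(chunks), len(chunks[0]))  # 21 chunks of 25 samples each
```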

 

I don't want to use the on-demand method, as I've already shown it is too slow for this application.

 

Could you please assist with an idea of how to implement this?

Message 1 of 9

What are your output and input devices?   If both are NI data acq devices, there's almost certain to be a way to generate and capture in sync and without interruption.

 

One method I commonly use is to share a sample clock among *all* the tasks, using physical wiring to carry the clock from one device to the other if necessary.

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 2 of 9

Hi Kevin,

 

I'm using a PXIe-6341 connected via MXI.

 

I am controlling 2 analog outputs simultaneously (combining them into one waveform and writing it to PXIe-6341/ao0:1) plus a digital output. These 3 outputs are triggered by an internal counter task.

 

The acquisition device is not an NI device.

 

I was considering using hardware-timed single point (HWTSP) for the analog outputs, but since I cannot put the digital output into HWTSP mode, I'm not sure how to harmonize everything and keep them in step.

 

 

Message 3 of 9

Does the non-NI acquisition device offer any digital signals to help with sync?  Trigger In/Out?  Clock In/Out?  Anything?

 

I think I'm still not clear about your desired outcome.  I had assumed you wanted hardware-level sync of your input samples and output samples.  After re-reading your original post, now I'm not so sure that's your goal after all.

 

What are you doing here?  You seem willing to generate your outputs in a piecewise fashion -- output for a while, pause, output some more, pause, etc.  Is that what you *want*?  Or is it just what you're willing to be stuck with if necessary?

 

What's the relationship between your outputs and your acquisition?  I would think sync would matter so you know how to correlate the measured inputs to your generated outputs.  But your plan wouldn't accomplish that.

 

 

-Kevin P

Message 4 of 9

Hi Kevin,

 

That is a very astute comment. Yes, these solutions that I am after are NOT ideal.

 

The system is an X-Y scanner with camera acquisition.

 

I am currently running the 3 outputs (X-scan, Y-scan & camera trigger) and expecting the dll read request from the camera to be completed within a fixed period, so that I can build dead time into the waveform allowing time for that read.

 

I had previously run the system with a 2-signal task for just the X-scanner and camera trigger, triggering the tasks with the internal counter and then reading the camera. Only once the camera read completed did I apply the appropriate voltage to step the Y-scanner and loop back to restart the X-scan & trigger task. In this manner everything is well synchronized, since I always know exactly where I am, but it is quite slow compared to running the Y-scanner on its own task (perhaps the way I do this is wrong). In my measurements, the DAQmx Write VI takes some 300 µs to execute, which adds up across the whole frame.
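To put that 300 µs in perspective, a rough back-of-envelope model (Python, purely illustrative; the rows-per-frame number is only an assumed example, not from my actual scan):

```python
# Cost of one software-timed DAQmx Write per Y step, accumulated
# over a full frame. 300 us is the measured per-Write execution
# time; the row count is an assumption for illustration only.
write_overhead_s = 300e-6
rows_per_frame = 500

total_overhead_s = write_overhead_s * rows_per_frame
print(f"{total_overhead_s * 1e3:.0f} ms of pure Write overhead per frame")
```

At 500 rows that is 150 ms of dead time per frame from the Write calls alone, which is why the software-timed approach felt slow.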

 

So that relationship between the X-scan analog output and the camera trigger is how I correlate the acquisition to the X-position. (Along with camera exposure time & the output rate of the analog waveform).

 

Am I thinking of the waveform outputs correctly, or is there another method that I should be pursuing?

 

Thanks for your time.

Message 5 of 9

There have been a lot of threads around here about XY raster scanning coupled with either imaging or photon counting.  I've participated in several though I haven't ever worked on such an app myself.

 

It seems to me that your camera image retrieval (and possibly the capture?) is the weak link here.  The solution others seem to gravitate toward is an imaging system with some frame-buffering capability (or perhaps a GigE streaming interface).  Such systems still end up working at speeds governed by their slowest device.

 

As you've already found, decent XY scanning speed requires hardware-clocked output.  Earlier in the thread you mentioned generating 500+ XY samples in 5 msec.  That's a ~105 kHz sample rate.  To keep up, you'd need an imaging system that can handle the image bandwidth implied by that sample rate for the duration of the XY scan field.  That strikes me as pretty unlikely.
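For concreteness, the arithmetic behind that estimate (numbers taken from the earlier posts):

```python
# 525 samples clocked out of the 6341 in 5 ms.
samples = 525
duration_s = 5e-3

rate_hz = samples / duration_s        # 105000.0 samples/s (~105 kS/s)
sample_period_us = 1e6 / rate_hz      # ~9.5 us per output sample
```

If the camera is triggered once per output sample, it would need to capture and deliver a frame roughly every 9.5 µs to keep pace.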

 

I'd say the first thing is to figure out your camera system's capabilities and limitations, particularly the time per capture & transfer.  From there, figure out whether your desired total scan time can be achieved with that frame rate and your desired XY resolution.  If not, you'll need to decide whether to allow longer scan times, reduce your XY resolution, or make changes to speed up your imaging system (such as buying a probably much more expensive setup that can accommodate higher speeds).

 

But the way it looks from here is that your requirements are incompatible with your equipment's capabilities.  Something's gotta give.

 

 

-Kevin P

Message 6 of 9

Hi Kevin,

 

One of the reasons we selected this device is that it has very high data rates (a minimum of 300 kfps for 8-bit data). There is a buffer that stores up to 256 frames. In the mode I described before, I read out the buffer at the end of each scanned row, in the hope (yes, this is where my mistake originates) that the read dll executes consistently within a specific time frame. This is how I had intended to capture the full X-Y scan field, until I identified the variation in read execution time.
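A quick back-of-envelope check on that buffer (Python, purely illustrative; this assumes the camera free-runs at the quoted minimum rate, whereas in my setup it is externally triggered and so runs slower):

```python
# How long the 256-frame buffer lasts if frames arrive at the
# quoted minimum 8-bit rate of 300 kfps.
buffer_frames = 256
frame_rate_fps = 300_000

fill_time_s = buffer_frames / frame_rate_fps
print(f"{fill_time_s * 1e3:.2f} ms until the buffer is full")
```

At full rate the buffer covers well under one worst-case 2 ms read, so the margin depends entirely on the actual trigger rate.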

 

This variation in read dll execution time seems to be a result of Windows scheduling inconsistencies. I've attached a screenshot of a histogram of the read execution time, measured over 10k reads.
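For anyone wanting to reproduce this kind of measurement, a minimal Python sketch of how the histogram was gathered (the actual camera read dll call is replaced here by a no-op placeholder):

```python
import time
from collections import Counter

def time_calls(fn, n=10_000, bin_ms=0.1):
    """Call fn n times and histogram the durations into bin_ms buckets."""
    hist = Counter()
    for _ in range(n):
        t0 = time.perf_counter()
        fn()
        dt_ms = (time.perf_counter() - t0) * 1e3
        hist[round(dt_ms / bin_ms) * bin_ms] += 1
    return hist

# `lambda: None` stands in for the camera's read dll call.
hist = time_calls(lambda: None, n=1_000)
```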

 

So the equipment is capable of delivering the scan waveforms and the digital outputs, and the camera is capable of delivering frames at a high rate... the problem is the read time. Using HWTSP is an ugly compromise I would prefer to avoid, but I'm at a loss for any other way to progress.

 

I have no experience working with real-time operating systems, but is there a way one could run this VI on an RT OS or RT device and achieve a more consistent read time?

 

Thanks for taking the time. I have nobody to bounce ideas off.

 

Message 7 of 9
Solution
Accepted by topic author ARL_

Is USB the only interface option to the camera?  Does the mfgr make any performance claims about sustained long-term frame rates?  Can you pay to have the buffer size expanded?  They're probably the best folks to talk with first -- it's their hardware and driver after all.  See what they recommend given your frame sizes, frame rate, and duration.

 

I'm pretty sure HWTSP can only slow you down.  You get errors if you don't keep up with the clock, and if you run the clock manually by creating individual counter pulses, you're still incurring software execution overhead.  You'd probably be better off with an on-demand AO task than HWTSP.

 

RT doesn't seem likely to be a solution because you'd need a camera driver that was compatible with the RT OS.

 

 

-Kevin P

 

 

Message 8 of 9

Hi Kevin,

 

Thanks again for your help.

 

I'm in constant discussion with the manufacturer, and there is a good relationship there, as it is a cutting-edge device.

 

I'll spend more time measuring all the outputs that I have to reinforce any requests that I make of them.

 

I'm going to mark your reply as the solution to this thread, even though it doesn't directly answer the thread subject. Hopefully future readers will see and accept that.

 

Message 9 of 9