Multifunction DAQ


Synchronize Analog Output Sample Clock Without TTL Pulse Count

Hi All,

The title is deliberately similar to this example program. The fundamental difference is that, instead of the DAQ hardware, an external module counts the TTL pulses and outputs the histogram (amplitude vs time). I am using the two AOs of the USB-6211 to generate the 2D raster pattern for the two galvo mirrors. The counting device connects to the PC over USB and comes with LabVIEW drivers that allow changing its acquisition time (on the order of 100 ms). Once the signal is acquired, LabVIEW would then perform some signal processing and plot the max amplitude (a single value) on the spectrogram.


My question is: how can I sync the 2D raster pattern to the acquisition time of the device, so that for each pixel of the spectrogram the histogram is correctly measured and post-processed?
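For concreteness, here is a minimal sketch (Python/numpy, since a LabVIEW VI can't be shown as text) of the kind of raster pattern meant here; the grid size and voltage ranges are assumptions, not values from the post:

```python
# Hypothetical sketch: build the two AO waveforms for an X/Y galvo raster.
# Grid size and voltage ranges below are placeholders, not from the post.
import numpy as np

def raster_pattern(nx, ny, x_range=(-5.0, 5.0), y_range=(-5.0, 5.0)):
    """Return (x, y) AO sample arrays, one sample per spectrogram pixel.

    x sweeps fast (one step per pixel), y steps once per line,
    so sample k addresses pixel (k % nx, k // nx).
    """
    x_line = np.linspace(x_range[0], x_range[1], nx)
    y_line = np.linspace(y_range[0], y_range[1], ny)
    x = np.tile(x_line, ny)      # fast axis: repeats every line
    y = np.repeat(y_line, nx)    # slow axis: constant within a line
    return x, y

x, y = raster_pattern(4, 3)  # 12 samples, one per pixel
```

The sync question is then: how to guarantee the external device finishes one ~100 ms histogram per sample before the AO pair advances.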

Starting from the example above, I would replace the counter in the while loop with the VIs for the counting device. One option would be to adjust the "Update rate" to match the acquisition time of the device, plus some extra margin that I could find by trial and error. However, I suspect this method gives no certainty that each pixel has been acquired correctly before the raster pattern moves to the next AO value.


All suggestions are welcome.

Message 1 of 4

Not enough info.  We have no idea what your USB device is, how it works, what it supports, etc., etc., etc.


*IF* it supports some kind of external sync, your best bet is probably to use DAQmx Export Signal to route the AO sample clock used for your galvos out to an available PFI pin.  Then physically wire to your other device where it acts as a "pixel clock", pulsing once per AO sample.
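In nidaqmx-Python terms (the forum thread is about LabVIEW, so this is just a textual sketch of the same routing), that might look roughly like the following; the device name "Dev1" and terminal "/Dev1/PFI4" are placeholders to check in NI MAX:

```python
# Hypothetical sketch: AO task for the galvos with its sample clock
# exported to a PFI pin as a per-sample "pixel clock".
# "Dev1" and "/Dev1/PFI4" are placeholder names -- verify in NI MAX.

def start_galvo_task(x_samples, y_samples, rate_hz):
    # Deferred import so the sketch can be read without NI drivers installed.
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    task = nidaqmx.Task()
    task.ao_channels.add_ao_voltage_chan("Dev1/ao0")  # galvo X
    task.ao_channels.add_ao_voltage_chan("Dev1/ao1")  # galvo Y
    task.timing.cfg_samp_clk_timing(
        rate_hz,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=len(x_samples),
    )
    # Route the AO sample clock out to a PFI terminal; physically wire
    # that pin to the external counter so it sees one pulse per AO sample.
    task.export_signals.samp_clk_output_term = "/Dev1/PFI4"
    task.write([list(x_samples), list(y_samples)])
    task.start()
    return task
```

On the USB-6211, PFI 4-7 are the output-capable PFI lines, which is why PFI4 is used in the sketch.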


If your device doesn't support such hardware-level sync, you can't count on getting lucky with your trial and error approach.  They'd be operating off of different timebases, which would differ from one another a little bit nominally and then also drift relative to one another with temperature, etc.


Hard to say more without knowing more about your other device.



-Kevin P

Message 2 of 4

Kevin P, thank you for your reply. I agree that hardware synchronisation is preferable to a software one, but perhaps I didn't explain myself properly. It would be fine if, for a given AO value, multiple histograms are acquired (and then post-processed), as long as the amplitude/time value is plotted on the spectrogram for that pixel before moving to the next one in the sequence. The logical flow should be: "go to the next AO value of the raster pattern if and only if a value is effectively plotted (i.e. not NaN)".


In other words, the update rate should behave like the "Latch When Pressed" mechanical action in the basic VI attached. Of course, the correct acquisition of a new value would replace the action of pressing a button on the front panel to go to the next AO value.


Would it be possible to achieve the logical flow above within LabVIEW, perhaps by enclosing the "Update rate" parameter in some structure (such as a sequence or event structure) that controls its value?
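The "latch" flow described above can be sketched in pseudocode form (Python here, since the actual implementation would be a LabVIEW state machine); `write_ao` and `read_amplitude` are stand-ins for a single-point DAQmx AO write and the counting device's driver call, neither of which is specified in the thread:

```python
# Sketch of the software-timed handshake: advance to the next AO value
# only once the external device returns a usable (non-NaN) amplitude.
# write_ao / read_amplitude are placeholders for the real driver calls.
import math

def scan(pixels, write_ao, read_amplitude, max_retries=10):
    """Software-timed raster: one AO point per pixel, retried until valid."""
    spectrogram = []
    for x, y in pixels:
        write_ao(x, y)                # move the galvos to this pixel
        value = float("nan")
        for _ in range(max_retries):  # re-acquire until non-NaN
            value = read_amplitude()
            if not math.isnan(value):
                break
        spectrogram.append(value)     # plot/record, only then advance
    return spectrogram
```

In LabVIEW terms this would be a state machine (or simple while loop with a shift register) rather than wiring "Update rate" directly: acquire, test for NaN, and only on success step the AO to the next raster value.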




Message 3 of 4

Still not enough info.  From the NI hardware side, you can definitely generate an AO pattern with known hardware timing, but I have no way of knowing how to correlate this to your external device.   If you have AO sample rates approaching 100 Hz or more, you'll need to be able to rely on hardware sync.   If you can only do software sync, I'd venture you need to be down in the 10 Hz or less kind of realm.
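A back-of-envelope check of those numbers, under assumed values (the ~100 ms figure is from the original post; the overhead and grid size are guesses):

```python
# Rough timing budget for software-synced scanning (assumed values).
acq_time_s = 0.100           # external device integration time (from the post)
software_overhead_s = 0.020  # per-pixel USB/driver latency (a guess)
pixels = 100 * 100           # assumed 100 x 100 raster

pixel_rate_hz = 1.0 / (acq_time_s + software_overhead_s)
scan_time_s = pixels * (acq_time_s + software_overhead_s)
# pixel_rate_hz is about 8.3 Hz -- inside the "10 Hz or less" software realm;
# scan_time_s is 1200 s, i.e. 20 minutes per full frame.
```

The ~100 ms acquisition per pixel already caps the pixel rate near 10 Hz, so with these assumptions software sync is at least plausible, though the frame time is long.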


But that's just a wild guess -- it'll depend on the external device in question, the interface, the latency, etc.



-Kevin P


Message 4 of 4