PCI-5124, PCI-6040, need to measure time between 2 pulses, delay, output pulse with microsecond accuracy

Hi all,
I'm running LabVIEW 8.0 and have a PCI-5124 and a PCI-6040E to use. My application is to trigger on 2 pulses (trigger#1 and trigger#2) with the PCI-5124 (the pulses are about 200 ns wide; the 6040E does not trigger reliably on them). They will come at random times, but within about 500 us of each other. The time between the incoming trigger pulses must then be measured to within about 1 us. I will be triggering a laser to fire with a digital pulse output (presumably from the 6040E) some time after trigger#2, based on the measured interval. Basically the flow goes as follows:
1) Receive trigger#1 (on the PCI-5124)
2) Receive trigger#2 (on the PCI-5124) around 500 us later.
3) Calculate the time between trigger#1 and trigger#2, and use it to calculate a delay time (DELAY).
4) Put out a digital pulse at time DELAY after trigger#2; DELAY will be about 1.5-3 ms and must be accurate to within +/- a microsecond or so.

So far my main trouble is getting everything synchronized. The triggers come, and I catch them by using NI-SCOPE functions to collect two very short records (1 point each). Then I fetch the two 1-point waveforms and use their absolute timestamps to calculate DELAY. I have a task in which a 6040E counter counts edges of a 1 MHz square wave as a "clock", and I try to output a digital pulse on the 6040E when the clock reaches the value of (trigger#2 time + DELAY). One big problem is that, with the way I check this "clock", the output pulse has a jitter of +/-50 us with respect to trigger#2, which is way too much. The fetch function itself also seems to take about 0.8 ms even if I only fetch 1 point.
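In text form, the counter-clock/polling part of my approach looks roughly like this (a hedged Python/nidaqmx sketch of the idea rather than my actual VI; the device name, PFI terminal, and counter values are just placeholders):

```python
import nidaqmx
from nidaqmx.constants import Edge, LineGrouping

DEV = "Dev1"  # placeholder device name for the 6040E

# Counter "clock": count rising edges of the external 1 MHz square wave.
clk = nidaqmx.Task()
ci = clk.ci_channels.add_ci_count_edges_chan(f"{DEV}/ctr0", edge=Edge.RISING)
ci.ci_count_edges_term = f"/{DEV}/PFI8"   # terminal the 1 MHz signal is wired to (placeholder)
clk.start()

# Software-timed digital line used to fire the laser.
do = nidaqmx.Task()
do.do_channels.add_do_chan(f"{DEV}/port0/line0",
                           line_grouping=LineGrouping.CHAN_FOR_ALL_LINES)
do.start()

trig2_count = 1_000_000    # placeholder: counter value at trigger#2
delay_counts = 2_000       # placeholder: DELAY in 1 MHz ticks (2 ms)
fire_count = trig2_count + delay_counts

# Poll the counter until it reaches (trigger#2 time + DELAY), then write the line.
# Both the loop iteration time and the on-demand write are at the mercy of the OS,
# which is where the +/-50 us of jitter shows up.
while clk.read() < fire_count:
    pass
do.write(True)
do.write(False)

do.close()
clk.close()
```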

So I guess my most pressing questions are:
1) What should I use as a "master clock" for this application?
2) How do I make sure that once I calculate the appropriate time to put out the digital pulse (trigger#2 time + DELAY), the digital output actually comes at exactly that time (+/- 1 us)?
3) Is there a faster way to calculate the time between the 2 trigger pulses than my "fetch and check timestamps" approach?



Help with any of these questions would be greatly appreciated; I'm new to LabVIEW and I've been spending a lot of time trying to figure out how to solve a problem that sounded fairly simple in words. Attached is my latest crack at it, but I would rather start over with a more efficient approach than force this one to work, if that's easier. Thanks for taking a look!

Message 1 of 5
There are two main parts to this issue:

1. Using NI-Scope to timestamp the pulses
2. Using NI-DAQmx to output a digital pulse

I would approach the issue in this way... Send your trigger signal into Ch0. Set up Ch1 to fetch off of the trigger on Ch0. You can set this up to only fetch 1 point. Ch1 is a dummy channel. Its purpose is just to return you a timestamp whenever Ch0 sees a trigger. Once you have received those two triggers you will be able to use the two timestamps to calculate your delay.

In the meantime, you should have set up your digital output. You can place your DAQmx Write inside some logic that will execute once the expected timestamp occurs.
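Roughly, in text form, the timestamping piece could look something like this (a hedged Python/niscope sketch rather than a VI; the resource name, trigger level, and sample rate are placeholders, and parameter names are approximate):

```python
import niscope

# Placeholder resource name -- use the 5124's name from MAX.
with niscope.Session("PXI1Slot3") as scope:
    scope.configure_vertical(range=1.0, coupling=niscope.VerticalCoupling.DC)
    # Two 1-point records: one per incoming trigger pulse.
    scope.configure_horizontal_timing(min_sample_rate=200e6, min_num_pts=1,
                                      ref_position=0.0, num_records=2,
                                      enforce_realtime=True)
    # Edge-trigger each record off the pulses coming in on channel 0.
    scope.configure_trigger_edge("0", level=0.5,
                                 trigger_coupling=niscope.TriggerCoupling.DC,
                                 slope=niscope.TriggerSlope.POSITIVE)
    with scope.initiate():
        # Fetch 1 point per record from the dummy channel (channel 1); each
        # waveform's absolute timestamp marks when its trigger arrived.
        wfms = scope.channels[1].fetch(num_samples=1, num_records=2)
    t1 = wfms[0].absolute_initial_x
    t2 = wfms[1].absolute_initial_x
    measured = t2 - t1        # time between trigger#1 and trigger#2
    # ...compute DELAY from 'measured' however your application requires,
    # then issue the DAQmx digital write once trigger#2 time + DELAY arrives.
```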

This is just a sketch of how I would approach your application. Hopefully this is helpful!
Garrett H
National Instruments
Message 2 of 5
Thanks for the reply! I think that by using a 2-record acquisition and the multi-fetch function, I can get my needed delay time soon enough to be useful. It's similar to your suggestion, but I'm trying to save channel 1 on the PCI-5124 for collecting other data. My problem is this: I have the counter running and I calculate the delay, and the result says something like "the YAG laser must be fired (requires a digital signal out, from the 6040E) when a 1 MHz counter reaches 1,684,432". So far I've been using a loop with the read counter VI to constantly check whether the count has reached the calculated fire time yet; when it does, a case structure puts out a pulse as a trigger. However, the counter is faster than the loop, so it overshoots the correct firing time by a random amount in the range of 1-12 microseconds. In addition, the Dig Boolean 1 Line 1 Pt write seems to take 280 +/- 20 us to put out an edge once it's been called. This jitter, from when the counter reaches the desired value to when a digital signal actually appears, isn't cutting it; our application needs the pulse to come out within maybe 4 us or better of the calculated firing time.
Is there a better way to sync a digital output with the counter on the 6040E?
Is there something that could make the digital I/O write to its line faster? Is it busy setting itself up to fire? I thought that after "create task" it was pretty much configured to go immediately when needed.
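(To separate the write latency from the polling-loop overshoot, the single-point write can be timed on its own; here's a rough Python/nidaqmx sketch of that measurement, with a placeholder device name.)

```python
import time
import nidaqmx
from nidaqmx.constants import LineGrouping

DEV = "Dev1"  # placeholder device name

with nidaqmx.Task() as do:
    do.do_channels.add_do_chan(f"{DEV}/port0/line0",
                               line_grouping=LineGrouping.CHAN_FOR_ALL_LINES)
    do.start()
    # Time just the software-timed single-point write, separate from any
    # polling-loop overshoot, to see where the ~280 us is going.
    t0 = time.perf_counter()
    do.write(True)
    t1 = time.perf_counter()
    do.write(False)
    print(f"single-point digital write took {(t1 - t0) * 1e6:.0f} us")
```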

-Jason V
Message 3 of 5
Jason,

I don't think you are going to be able to get the timing precision you require with your current hardware. This application demands highly deterministic timing and would be better accomplished with a Real-Time or FPGA solution. The main issue you are experiencing now is that you have to go to software to calculate the delay between the two triggers. This puts your entire application at the mercy of your CPU and moves the potential jitter from nanoseconds to milliseconds. An FPGA would allow you to keep this calculation in hardware and trigger with extremely high precision. You might take a look at the R-Series cards:

http://sine.ni.com/nips/cds/view/p/lang/en/nid/202005

Sorry. I know this probably wasn't the response you were hoping for!
Garrett H
National Instruments
Message 4 of 5
Thanks for the hardware tip, although we do need the high-speed digitizing power of the PCI-5124. I managed to partially get around the timing problem by using both counters on the 6040E. The first one puts out a roughly 3 ms "waste time" pulse that triggers off the first data trigger, which gives the software time to do its calculation and send the result to another pulse task as the pulse time. This second pulse triggers on the falling edge of the "waste time" pulse, cutting the jitter way down compared to software alone. Unfortunately, if the waste-time pulse is much shorter than 3 ms, the second counter doesn't get its pulse-time input fast enough and never goes off. Right now we're trying to shorten this 3 ms dead time, because the longer it is, the more data we'll miss.
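In text form, the two-counter cascade looks roughly like this (a hedged Python/nidaqmx sketch of the idea rather than the actual VIs; the device name, PFI terminal, pulse widths, and the computed delay value are placeholders or assumptions):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge, Level

DEV = "Dev1"      # placeholder device name for the 6040E
WASTE = 3e-3      # ~3 ms "waste time" pulse

# Counter 0: the waste-time pulse, armed to start on the first data trigger
# (assumed here to be wired to PFI0).
waste = nidaqmx.Task()
waste.co_channels.add_co_pulse_chan_time(f"{DEV}/ctr0",
                                         high_time=WASTE, low_time=1e-6,
                                         idle_state=Level.LOW)
waste.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE,
                                 samps_per_chan=1)
waste.triggers.start_trigger.cfg_dig_edge_start_trig(f"/{DEV}/PFI0",
                                                      trigger_edge=Edge.RISING)
waste.start()     # armed, waiting for the data trigger

# ...the software computes DELAY from the NI-SCOPE timestamps during the waste pulse...
delay_s = 2.0e-3  # placeholder for the computed delay

# Counter 1: the laser-firing pulse. The computed delay goes in as its initial
# delay, and it starts on the FALLING edge of the waste-time pulse, so the
# software jitter is hidden inside the 3 ms window. If this task isn't armed
# before that falling edge arrives, it never fires -- the race described above.
fire = nidaqmx.Task()
fire.co_channels.add_co_pulse_chan_time(f"{DEV}/ctr1",
                                        initial_delay=delay_s,
                                        high_time=10e-6, low_time=1e-6,
                                        idle_state=Level.LOW)
fire.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE,
                                samps_per_chan=1)
fire.triggers.start_trigger.cfg_dig_edge_start_trig(f"/{DEV}/Ctr0InternalOutput",
                                                     trigger_edge=Edge.FALLING)
fire.start()

waste.wait_until_done(timeout=10.0)
fire.wait_until_done(timeout=10.0)
waste.close()
fire.close()
```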
Thanks,
Jason
Message 5 of 5