Measuring the time between two events

Hi, 

I have been tasked with creating a program that measures the time between when a user sends a signal to a device and when an analog input voltage reaches 50% of the maximum voltage that input can read. Basically: the user sends a signal to the device, a timer starts, and when the analog input reaches 50% of its maximum, the timer stops.

The time will be under 100 ms.

I have read through many forum threads, but I am having trouble finding a good place to start. Is there a good way to do this in LabVIEW?

 

Thanks for your help!

Message 1 of 2

"Is there a good way to do this in LabVIEW?"

 

Yes, and no. If you are using a desktop operating system (Windows, Macintosh, or Linux), you will probably have enough timing jitter to introduce errors in your measurements that may be larger than you can tolerate.

 

However, there are some alternatives which might work for you.

1. Use a real-time operating system.

2. Use a multichannel data acquisition device. Record the analog input voltage on one channel and the trigger signal on another channel, then determine the elapsed time by analyzing the two channels of data. Using hardware-timed analog acquisition means both signals are measured with the timing accuracy and resolution of the timebase on the data acquisition device, which is almost certainly better than software timing. For example, a 1 kHz sampling rate gives 1 ms timing resolution.

 

You can probably implement suggestion 2 with the device you have now.
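In case it helps to see the idea in code, here is a rough sketch of suggestion 2 using the NI-DAQmx Python API; the same approach maps directly onto the DAQmx VIs in LabVIEW (DAQmx Create Virtual Channel, DAQmx Timing, DAQmx Read). The device name "Dev1", the channel wiring (trigger on ai0, signal on ai1), the 2.5 V trigger threshold, and the 100 kHz sample rate are all assumptions you would replace with your own values. It also assumes your trigger is available as a voltage you can wire to the second input.

```python
# Sketch of suggestion 2: acquire the trigger and the analog input on two
# hardware-timed channels sharing one sample clock, then find the delay
# in post-processing. Channel names and rates below are assumptions.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 100_000   # samples/s -> 10 us timing resolution
N = 20_000       # 0.2 s capture window, enough for a <100 ms event

with nidaqmx.Task() as task:
    # Both channels share one sample clock, so their timing is locked to
    # the DAQ device's hardware timebase, not to the OS scheduler.
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # trigger signal
    task.ai_channels.add_ai_voltage_chan("Dev1/ai1")   # analog input
    task.timing.cfg_samp_clk_timing(RATE,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=N)
    data = np.array(task.read(number_of_samples_per_channel=N))

trigger, signal = data[0], data[1]

# First sample where the trigger goes high (2.5 V assumed threshold for
# a 5 V logic trigger). Note argmax returns 0 if no sample crosses, so
# real code should check that the threshold was actually reached.
t_start = np.argmax(trigger > 2.5)

# First sample where the analog input crosses 50% of the maximum value
# it reaches during the capture.
t_end = np.argmax(signal > 0.5 * signal.max())

elapsed_ms = (t_end - t_start) * 1000.0 / RATE
print(f"Elapsed time: {elapsed_ms:.2f} ms")
```

At 100 kHz this gives 10 µs resolution, comfortably finer than the sub-100 ms interval you are measuring. In practice you would start the acquisition before the user sends the signal so the trigger edge falls inside the capture window.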

 

Lynn

Message 2 of 2