I’m using an NI USB-6211. As shown in the attached Delay.vi, I need to measure the delay between two different types of signals. The first is a digital step that controls the closing of a contact. The second is an analog signal, a 1 kHz tone at around -6 dBm, which is generated only after the contact closes.
For the digital signal I found a solution: a While Loop continuously polls the value until it reads true or false. But I have problems with the analog one. Since the first estimate of the delay is greater than the period of the sinusoid, a correlation-based method is difficult to apply. So I created a dBm subVI that computes the dBm value of a subset of the signal samples and then detects when the acquired signal exceeds a selected threshold (I know I could probably obtain the same result with the Basic Level Trigger Detection VI, but I was not sure how to use it properly in my VI). I noticed that while the delay of the digital signal is about the same as the one measured with the oscilloscope, the analog one always shows the same delay, which changes only with the acquisition frequency and the number of samples of the generated tone. I don’t understand whether this issue is related to the size of the buffers in which the DAQmx VIs store the data, or to how those buffers are managed. So I’m asking for help not only in understanding buffer management and how I can control it, but also for an example using the Basic Level Trigger Detection VI, or for help improving my VI for measuring the delay.
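For reference, here is roughly the logic of my dBm subVI and threshold search, sketched in Python rather than G (the window length, the 50 Ω load assumed for the dBm conversion, and the -12 dBm threshold are illustrative choices, not the exact values in my VI):

```python
# Sketch of the dBm-threshold delay detection (assumed: 50 ohm load,
# 64-sample window, -12 dBm threshold -- adjust to your setup).
import math

def dbm(samples, r_load=50.0):
    """Return the power of a sample window in dBm, assuming r_load ohms."""
    if not samples:
        return float("-inf")
    vrms_sq = sum(v * v for v in samples) / len(samples)
    power_w = vrms_sq / r_load
    return 10.0 * math.log10(power_w / 1e-3) if power_w > 0 else float("-inf")

def delay_to_threshold(signal, fs, window=64, threshold_dbm=-12.0):
    """First time (in s) at which a sliding window exceeds threshold_dbm.

    The result is derived from the sample index, so its resolution is
    limited to roughly window/fs, never better than 1/fs.
    """
    for start in range(0, len(signal) - window + 1):
        if dbm(signal[start:start + window]) > threshold_dbm:
            return start / fs
    return None
```

The key point is that the delay comes from a sample index divided by the sample rate, not from a software timestamp taken when a buffer is read.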
Thank you for your kind attention!
You are "in luck" -- with the USB-6211, you can do everything you want to do directly with a simple While Loop (no Timed Loop required, no special "wait for TTL signal" to start, etc.).
This is a good time to suggest you do a Web Search for "Learn 10 Functions in NI-DAQmx and Handle 80 Percent of your Data Acquisition Applications" (or a title very similar to that). I also recommend that you look at the Examples that ship with LabVIEW, under "Hardware Input and Output", "DAQmx", "Analog Input". You'll find examples that show that you can use a Digital Input line as a "PFI" line (consult your 6211 Manual) that can serve as a "Start Trigger" for Analog Input. You can (and should) also use the built-in clock in the 6211 to time your Analog Acquisition -- the PC clock, in particular, should not be used as Windows is not a particularly "friendly" Real-Time (deterministic) system.
You should be able to write a VI with a few DAQmx functions that uses one DIO line (configured as a PFI line) to start your A/D acquisition, and sets the clock rate and number of samples to acquire. Say you choose 1 kHz and 1000 points at a time: the While Loop "waits a second" for the data to be acquired, then delivers 1000 points for you to use, and then (if it's the only thing in the While Loop) keeps acquiring the next set of 1000 samples, never missing one, as DAQmx has an internal buffer to hold them.
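This buffering behavior may also explain the constant analog delay in the original question: if the delay is measured by timestamping when each DAQmx Read completes (instead of counting sample indices from the trigger), the Read blocks until the whole buffer is filled, so the result quantizes to whole buffers. A minimal numeric sketch of that effect (the rates and buffer sizes below are illustrative assumptions, not the poster's actual settings):

```python
# Why a software timestamp taken when a buffered Read returns can only
# resolve delays to whole buffers: the Read blocks until the buffer is
# full, so the soonest the tone is "seen" is the end of that buffer.
def buffer_quantized_delay(true_delay_s, rate_hz, samples_per_read):
    """Delay as seen by a loop that timestamps each completed Read."""
    buffer_s = samples_per_read / rate_hz
    # The tone becomes visible only when the buffer containing it completes.
    buffers_elapsed = int(true_delay_s / buffer_s) + 1
    return buffers_elapsed * buffer_s

# Whatever the true delay is (3 ms vs 7 ms here), the reported value is the
# same whole-buffer figure, and it changes only with rate and sample count:
print(buffer_quantized_delay(0.003, 1000, 1000))   # one full 1 s buffer
print(buffer_quantized_delay(0.007, 1000, 1000))   # same reported value
print(buffer_quantized_delay(0.003, 10000, 1000))  # now 0.1 s buffers
```

With a hardware start trigger on the PFI line, the first sample in the buffer is aligned to the trigger, so the delay falls out of the sample index at the full 1/rate resolution instead.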
Best of all, you can open MAX, configure your A/D hardware using the Test Panels, and try it out. DAQmx is really wonderful and easy to use, once you get the idea. I found "Learn 10 Functions" a real boon to understanding DAQmx when I first read it (more than a decade ago, I think).
Bob gives the right hint 🙂
Besides that, you might use a sample rate of maybe 10 kHz ... if your signal is a 1 kHz sine 😉
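A quick back-of-the-envelope check of what the sample rate buys you: once the acquisition is hardware-triggered, the delay to any feature in the record is (sample index) / (sample rate), so the rate directly sets the time resolution (the rates below are just examples):

```python
# Delay resolution of an index-based measurement at a few sample rates.
for rate_hz in (1_000, 10_000, 100_000):
    print(f"{rate_hz:>7} S/s -> delay resolution {1e3 / rate_hz:.2f} ms")
```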