

High Sample Rate (~1MS/s) Timing Considerations

Solved!

Hi all,

My experience with LabVIEW is pretty limited; so far I've mostly been editing and debugging code that has been handed down to me. I've been using USB DAQ cards to sample at around 10 samples/second and haven't cared too much about precise timing. I'm now looking at a new application for NI software and hardware and have identified the USB-6343 as the card to use (500 kS/s input, 900 kS/s output). The plan is to generate an accurate output signal (accurate in timing and amplitude) that is slightly dependent on the input signal. I do not plan to log data to a file at this rate, just input, process, and output.

 

Is it relatively straightforward to sample and output analog signals at these high sample rates? Currently I just use timed loops with millisecond precision, but this would require microsecond-precision timing. Is that something that is pretty standard to achieve, or are these sampling rates achieved in a completely different way? Can anyone provide links to point me in the right direction?

 

I'm sorry if this question has already been asked, but hopefully it's generic enough not to be a burden. This is a project I'm taking up in my spare time, so I'm just beginning to plan things out, which includes figuring out how difficult the software side is going to be.

 

Thanks.

 

-A

Message 1 of 4

Let me understand:

- You're sampling some signal at 500 kS/s,

- you're performing some calculations on it,

- and within 1 microsecond of getting an input sample you want to put out an output sample based on those calculations?

It is possible to achieve, but you need an FPGA for this.

Message 2 of 4

Sorry, I wasn't clear. Right now I think I can accept a delay between the input and output. I agree that if I needed rapid processing, I would need an FPGA setup.

 

Let's assume I can accept ~10-100 ms of latency between the input and output signals. I'm sampling and outputting what are essentially 22 kHz sine waves (though in practice the sine waves will be significantly modified).

 

I've used the millisecond clock in LabVIEW before to accurately timestamp measurements, but in those cases I didn't care about errors in the time measurement. Would something like JJ Control's Micro Clock http://sine.ni.com/nips/cds/view/p/lang/en/nid/211067 be a good place to start?

 

I'll have to figure out how to handle the processing stage (probably handling the input data in batches so the output is produced in chunks, though the output will still require high time resolution), but to start I want to be able to read analog values at 500 kS/s with roughly microsecond-accurate timestamps and also output an analog signal at 900 kS/s.

 

(At some point I'll likely bring in a more skilled LabVIEW developer, but as I said, I'm working on this in my free time for now to get it off the ground.)

 

Thanks.

Message 3 of 4
Solution
Accepted by MDI-AJT

The DAQ acquisition clocks will give you the best timing you can get. If you retrieve the data as a waveform, t0 represents the start time and dt the time interval between samples. If you read continuously, all the data samples are timed relative to t0. The accuracy of dt is equal to the accuracy of the timebase oscillator on the DAQ device, typically +/- 100 ppm.
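
 

If it helps to see the idea outside of the LabVIEW diagram, here is a minimal sketch of a hardware-timed, continuous read using the NI-DAQmx Python API. The device name "Dev1", channel "ai0", and chunk size are assumptions for illustration; in LabVIEW, the DAQmx Read (Waveform) instance hands you t0 and dt directly instead of the manual bookkeeping below.

```python
# Sketch: hardware-timed continuous AI at 500 kS/s (NI-DAQmx Python API).
# "Dev1/ai0" and the chunk size are placeholders - adjust for your hardware.
import datetime
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 500_000   # requested sample clock rate (S/s)
CHUNK = 50_000   # samples read per loop iteration (100 ms of data)

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    # Hardware (sample clock) timing: the device timebase, not the OS, paces the samples.
    ai_task.timing.cfg_samp_clk_timing(rate=RATE,
                                       sample_mode=AcquisitionType.CONTINUOUS,
                                       samps_per_chan=CHUNK)
    dt = 1.0 / ai_task.timing.samp_clk_rate   # actual (coerced) sample interval
    t0 = datetime.datetime.now()              # host timestamp of the start (approximate)

    total = 0
    for _ in range(10):                       # read ten chunks, then stop
        data = ai_task.read(number_of_samples_per_channel=CHUNK)
        # Sample n is timed as t0 + n*dt: the spacing dt is as accurate as the
        # device timebase (typically +/- 100 ppm), while t0 is only as good as
        # the host clock that stamped it.
        first_sample_time = t0 + datetime.timedelta(seconds=total * dt)
        total += len(data)
```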

 

I do not recall the clock frequency on the USB-6343, but many devices use 80 MHz. You may not be able to get exactly 900 kHz from an 80 MHz source because the ratio is not an integer: 80 MHz / 900 kHz = 88.888...
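
 

As a rough illustration (again in the NI-DAQmx Python API, with a placeholder device name): if the timebase were 80 MHz, the nearest integer divisors would give 80 MHz / 88 ≈ 909.09 kHz and 80 MHz / 89 ≈ 898.88 kHz, so exactly 900 kHz would not be available. Rather than assuming, you can ask the driver what rate it actually programmed after requesting 900 kS/s.

```python
# Sketch: request 900 kS/s on an AO channel and read back the coerced rate.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ao_task:
    ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")   # "Dev1" is a placeholder
    ao_task.timing.cfg_samp_clk_timing(rate=900_000,
                                       sample_mode=AcquisitionType.CONTINUOUS)
    actual = ao_task.timing.samp_clk_rate   # rate the hardware will really use
    print(f"Requested 900000 S/s, got {actual:.3f} S/s")
```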

 

Using the millisecond clock is called software timing; it is neither fast enough nor sufficiently free of jitter to do anything like what you have in mind.

 

Lynn

Message 4 of 4