Tim,
I'm glad to see that old posting of mine (...appropriate buffer size...) got some use! It sounds like your app is tougher than mine, though, because you need to change the waveform frequency on the fly.
As far as I know, the buffer size cannot be manipulated while a task is running. So changing the output rate does sound like the way to go. The advantage is that the waveform frequency changes instantly. Unfortunately, the frequency precision is limited (as stated earlier in the thread) by the 20 MHz timebase.
Consider the example given -- an 800 kHz update rate so that 480 sample points produce exactly 9 cycles of a 15.000 kHz waveform. Well, that 800 kHz update rate is created by dividing the 20 MHz clock down by 25. The next closest achievable discrete output rates are made by dividing by either 24 or 26, i.e., 833.33 kHz or 769.23 kHz. That in turn would change your waveform from 15.000 kHz to 15.625 kHz or 14.423 kHz. Not very precise.
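Just to make the divide-down arithmetic concrete, here's a quick sanity check in plain Python (no DAQ hardware involved -- the timebase, divisors, and 480-samples-per-9-cycles figures are all from the example above):

```python
# 20 MHz onboard timebase; achievable sample clocks are integer divisions of it.
TIMEBASE_HZ = 20_000_000
SAMPLES_PER_BUFFER = 480   # one pass through the buffer...
CYCLES_PER_BUFFER = 9      # ...produces this many waveform cycles

def waveform_freq(divisor):
    """Waveform frequency produced at the sample clock 20 MHz / divisor."""
    update_rate = TIMEBASE_HZ / divisor
    return update_rate * CYCLES_PER_BUFFER / SAMPLES_PER_BUFFER

print(waveform_freq(25))   # the target: 15000.0 Hz
print(waveform_freq(24))   # next divisor up: 15625 Hz
print(waveform_freq(26))   # next divisor down: ~14423 Hz
```

The jump from one achievable frequency to the next is over 500 Hz in this range, which is the imprecision I'm talking about.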
I'm inclined to think that you may need to perform a non-regenerating output and then continually write to the output buffer on the fly. For example, I'd start by creating a larger-than-strictly-necessary output buffer. Then I like to imagine the buffer as though divided into thirds. I start by writing enough data to fill 2/3 of the buffer. Thereafter, each time the actual output reaches a 1/3-buffer-size point, I write an additional 1/3 buffer of data.
Now let's further suppose that the entire buffer represents about 3 seconds worth of output or 2.4 Mega-samples. (This is a tradeoff between how fast the actual waveform output freq responds to requested changes, the amount of CPU consumed monitoring and feeding the output buffer, and the memory and CPU required to generate chunks of waveform data). So each write to the buffer should represent about 1 second. Let's also suppose that every write to the buffer must start from a positive 0 crossing and contain an integer number of waveform cycles. At the moment I'm imagining a pure sine wave for simplicity.
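A minimal sketch of the kind of chunk generator I have in mind, in plain Python for illustration (in LabVIEW you'd use the waveform generation primitives instead; the function name here is just made up):

```python
import math

def sine_chunk(cycles, samples, amplitude=1.0):
    """Generate `cycles` full sine cycles across `samples` points,
    starting at a positive-going zero crossing, so that back-to-back
    chunks concatenate seamlessly with no phase discontinuity."""
    return [amplitude * math.sin(2 * math.pi * cycles * i / samples)
            for i in range(samples)]

# 9 cycles in 480 samples == 15.000 kHz at an 800 kHz update rate
chunk = sine_chunk(9, 480)
print(chunk[0])        # 0.0 -- starts exactly at the zero crossing
print(chunk[1] > 0)    # True -- heading positive
```

Because each chunk begins at a positive zero crossing and contains a whole number of cycles, you can append any chunk after any other without a glitch at the seam.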
So if we start with the 800 kHz update rate and 15.000 kHz waveform, then we want to deal with chunks of about 800 kilo-samples. But since we also need some multiple of 480 samples, let's pick the multiple 1667. This makes 800.160 kilo-samples or 15003 cycles of the waveform. Write these twice for a total of 1600.320 kilo-samples in the buffer, then start the task.
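The chunk-sizing arithmetic above, spelled out (numbers straight from the example):

```python
# We want roughly 800 kilo-samples (~1 s at 800 kHz) per chunk, while
# keeping a whole number of 480-sample waveform periods per chunk.
TARGET_SAMPLES = 800_000
SAMPLES_PER_PERIOD = 480
CYCLES_PER_PERIOD = 9

multiple = round(TARGET_SAMPLES / SAMPLES_PER_PERIOD)   # 1667
chunk_samples = multiple * SAMPLES_PER_PERIOD           # 800160
chunk_cycles = multiple * CYCLES_PER_PERIOD             # 15003

print(multiple, chunk_samples, chunk_cycles)
```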
Now we poll the output properties until about 800 kilo-samples (1/3 the buffer) have been generated at the output. Then we write an additional 1/3-buffer chunk (presently 800.160 kilo-samples). Etc. Note that you'll need to be a little fancier than this to make sure you're always filling in the 1/3 of the buffer that starts about 1/3 buffer size ahead of the current output position. With the numbers given, the 160-sample-per-chunk discrepancy would eventually cause a problem.
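Here's a toy simulation of that "thirds" scheme, with no DAQ calls -- the output counter just advances at the nominal rate, standing in for polling the task's total-samples-generated property. It also shows the 160-sample drift staying harmless over a minute or so of simulated output:

```python
BUFFER = 2_400_000           # ~3 s of output at 800 kHz
THIRD = BUFFER // 3          # refill granularity, in samples
CHUNK = 800_160              # one chunk = 1667 x 480 samples

total_written = 2 * CHUNK    # pre-fill ~2/3 of the buffer, then start
total_generated = 0
next_refill = THIRD          # top up when output passes each 1/3 point

while total_generated < 20 * BUFFER:      # ~60 s of simulated output
    total_generated += 10_000             # pretend 10k samples played out
    if total_generated >= next_refill:
        total_written += CHUNK            # append another chunk
        next_refill += THIRD
    margin = total_written - total_generated
    assert margin > 0, "underflow: output caught up with the writes"
    # The 160-sample excess per chunk slowly grows the margin; over this
    # run it stays well inside one buffer, but unchecked it would
    # eventually collide with the output position.
    assert margin <= BUFFER

print("margin stayed between 0 and one buffer for", total_generated, "samples")
```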
At some point, the user or program decides that a new waveform frequency of 13.710 kHz is needed. Still using the 800 kHz update rate, I come up with a ratio of ~128 waveform cycles per 7469 samples. (The actual waveform freq will then be 13.710001 kHz). I'll again need a multiple that comes near to 800 kilo-samples, and I come up with 107. So I generate an array of 107*7469 = 799.183 kilo-samples representing 13696 cycles of the waveform, and write it to the output buffer. I keep writing this chunk approx once per second until another waveform freq change is needed.
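For finding that cycles-per-samples ratio, one way (a sketch -- I'm assuming Python's `fractions` module as a stand-in for whatever rational-approximation step your code uses) is to take the best small-denominator fraction approximating waveform freq / update rate:

```python
from fractions import Fraction

UPDATE_RATE = 800_000
TARGET_HZ = 13_710

# Cap the denominator so one period stays a manageable number of samples.
ratio = Fraction(TARGET_HZ, UPDATE_RATE).limit_denominator(8_000)
cycles, samples = ratio.numerator, ratio.denominator   # 128 cycles / 7469 samples

actual_hz = UPDATE_RATE * cycles / samples             # ~13710.0013 Hz
multiple = round(800_000 / samples)                    # 107 periods per chunk
chunk_samples = multiple * samples                     # 799183
chunk_cycles = multiple * cycles                       # 13696

print(cycles, samples, actual_hz, multiple)
```

The denominator cap is the knob that trades frequency precision against period length: a bigger cap gets you closer to the requested frequency but makes each seamless chunk longer.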
That's the basic idea. With this approach, the data written to the buffer runs 1 to 2 seconds ahead of the actual DAC output. So if you request a change in waveform freq, there'll be a delay before you see it at the output.
In summary: output rate change gives you instant response but fairly poor frequency precision. Buffer writing on the fly gives you excellent frequency precision, but a significantly delayed response plus a lot of memory, CPU, and bookkeeping to deal with.
-Kevin P.
CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).