Finite output buffer size question

I am doing finite, simultaneous analog and digital pattern generation, using the AO sample clock as the source for both tasks. It works beautifully, except that it insists on always generating 1000 samples. If I try to use more than 1000 samples, it truncates my analog and digital data down to 1000. If I specify fewer than 1000 samples, it repeats the outputs in a loop until 1000 samples have been generated! I have an error cluster running through the whole VI chain, but I never get an error.
 
I'm using LV8.5 and a PCI-6211
 
This is what I'm doing:
 
(1) Use DAQmx Timing (Sample Clock).vi to define the sample clock rate for AO0.
(2) Use DAQmx Timing (Sample Clock).vi again, with [Device]/ao/SampleClock wired to the source input, to make the AO sample clock the source for the digital output generation.
(3) Use DAQmx Configure Output Buffer.vi to set the buffer size for the digital generation.
(4) Use DAQmx Write (Digital 1D U8 1Chan NSamp).vi to fill the digital buffer.
(5) Use DAQmx Configure Output Buffer.vi to set the buffer size for the analog generation.
(6) Use DAQmx Write (Analog 1D DBL 1Chan NSamp).vi to fill the analog buffer.
(7) Use DAQmx Start Task.vi to start the digital generation (it doesn't start until the sample clock starts).
(8) Use DAQmx Start Task.vi to start the analog generation (now the sample clock starts).
(9) Wait for the end of the generation using DAQmx Wait Until Done.vi.
(10) Stop both tasks.
(11) Clear both tasks.
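 
For reference, here is roughly the same sequence expressed with the NI-DAQmx C API; the C calls correspond closely to the DAQmx VIs listed above. The device name (Dev1), channels (ao0, port0), clock rate, 5000-sample length, and data patterns are placeholders for illustration, and error checking is omitted for brevity.
 
/* Rough C-API equivalent of the step list above (placeholder names, no error checking). */
#include <NIDAQmx.h>

#define N_SAMPS 5000      /* desired finite generation length (placeholder) */
#define RATE    10000.0   /* sample clock rate in Hz (placeholder) */

int main(void)
{
    TaskHandle ao = 0, dig = 0;
    float64 aoData[N_SAMPS];
    uInt8   doData[N_SAMPS];
    int32   written;
    int     i;

    for (i = 0; i < N_SAMPS; i++) {
        aoData[i] = (i % 100) / 100.0;   /* placeholder analog ramp */
        doData[i] = (uInt8)(i & 0xFF);   /* placeholder digital pattern */
    }

    DAQmxCreateTask("", &ao);
    DAQmxCreateAOVoltageChan(ao, "Dev1/ao0", "", -10.0, 10.0, DAQmx_Val_Volts, NULL);

    DAQmxCreateTask("", &dig);
    DAQmxCreateDOChan(dig, "Dev1/port0", "", DAQmx_Val_ChanForAllLines);

    /* Step 1: AO sample clock.  The last argument (samples per channel)
       sets the finite generation length. */
    DAQmxCfgSampClkTiming(ao, "", RATE, DAQmx_Val_Rising, DAQmx_Val_FiniteSamps, N_SAMPS);

    /* Step 2: clock the digital task from the AO sample clock. */
    DAQmxCfgSampClkTiming(dig, "/Dev1/ao/SampleClock", RATE, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, N_SAMPS);

    /* Steps 3 and 5: explicit output buffer sizes. */
    DAQmxCfgOutputBuffer(dig, N_SAMPS);
    DAQmxCfgOutputBuffer(ao, N_SAMPS);

    /* Steps 4 and 6: fill both buffers without auto-starting. */
    DAQmxWriteDigitalU8(dig, N_SAMPS, 0, 10.0, DAQmx_Val_GroupByChannel, doData, &written, NULL);
    DAQmxWriteAnalogF64(ao, N_SAMPS, 0, 10.0, DAQmx_Val_GroupByChannel, aoData, &written, NULL);

    /* Steps 7 and 8: start the digital task first (it waits on the AO clock), then the analog task. */
    DAQmxStartTask(dig);
    DAQmxStartTask(ao);

    /* Step 9: wait for the finite generation to finish. */
    DAQmxWaitUntilTaskDone(ao, 30.0);

    /* Steps 10 and 11: stop and clear both tasks. */
    DAQmxStopTask(ao);
    DAQmxStopTask(dig);
    DAQmxClearTask(ao);
    DAQmxClearTask(dig);
    return 0;
}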
 
DAQmx Configure Output Buffer.vi is supposed to override the automatic output buffer allocation, and I'm assuming the 1000 samples comes from that automatic allocation, so what else do I need to do?
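 
For what it's worth, the buffer size and the samples-per-channel quantity are separate task attributes and can be read back independently. A small C-API sketch of that check (reusing the placeholder ao task handle from the listing above):
 
#include <stdio.h>
#include <NIDAQmx.h>

/* Reads back the two attributes in question on one task: the output buffer
   size (what DAQmx Configure Output Buffer sets) and the samples-per-channel
   quantity (what DAQmx Timing sets).  "ao" is a placeholder task handle. */
static void print_output_config(TaskHandle ao)
{
    uInt32 bufSize = 0;
    uInt64 sampsPerChan = 0;

    DAQmxGetBufOutputBufSize(ao, &bufSize);
    DAQmxGetSampQuantSampPerChan(ao, &sampsPerChan);

    printf("output buffer size : %u samples per channel\n", (unsigned)bufSize);
    printf("samples per channel: %llu\n", (unsigned long long)sampsPerChan);
}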
 
Message 1 of 2

I solved the problem. I needed to use the DAQmx Timing property node to change the samples per channel value (SampQuant.SampPerChan). The 1000-sample limit was coming from steps (1) and (2), where I was wiring 1000 into samples per channel as a starting value.
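 
In C-API terms, the fix corresponds to setting the same attribute the DAQmx Timing property node exposes as SampQuant.SampPerChan on each task, after (or instead of wiring it into) the timing calls. A minimal sketch, reusing the placeholder task handles and sample count from the earlier listing:
 
#include <NIDAQmx.h>

/* Sets the finite generation length on both tasks; equivalent to the
   SampQuant.SampPerChan entry of the DAQmx Timing property node.
   "ao", "dig", and nSamps are placeholders from the earlier sketch. */
static void set_finite_length(TaskHandle ao, TaskHandle dig, uInt64 nSamps)
{
    DAQmxSetSampQuantSampPerChan(ao,  nSamps);
    DAQmxSetSampQuantSampPerChan(dig, nSamps);
}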

 

Message 2 of 2