12-14-2006 12:13 PM
I’m having a problem doing some continuous measurements, and I was hoping someone could point me in the right direction.
The basic setup is a PXI-4461 card, with an AO channel going through a DUT and into an AI channel. What I need to do is take a series of measurements, altering either the AO signal amplitude or a feature on the DUT between each measurement. In some cases, I need to examine the data from the previous read in order to set up the amplitude or DUT correctly for the next read.

The AI/AO block size is set to the number of samples of data needed plus a number of ‘write ahead’ samples. The first time I read, I just get the samples of data needed. The ‘write ahead’ portion is there so that the AO device doesn’t run out of samples to generate before I can write more samples (as there is some processing to do between writes). Each subsequent read gets a whole block, but it is ‘write ahead’ samples behind the writing task, so I discard the ‘write ahead’ portion of the waveform.
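To make the read pattern concrete, here is a minimal pure-Python sketch of the scheme described above (my paraphrase, not NI/DAQmx code; `split_reads` is a hypothetical name, and the sample stream stands in for what the AI task would deliver):

```python
def split_reads(samples, needed, write_ahead):
    """Model the poster's scheme: the first read returns only `needed`
    samples; each later read returns a full block of
    needed + write_ahead samples, and the leading `write_ahead` samples
    (taken while the AO/DUT settings were being changed) are discarded."""
    out = [samples[:needed]]        # first read: just the data needed
    pos = needed
    block = needed + write_ahead
    while pos + block <= len(samples):
        raw = samples[pos:pos + block]
        out.append(raw[write_ahead:])   # keep only the new `needed` samples
        pos += block
    return out
```

Every kept read ends up exactly `needed` samples long, which is what makes the discarded ‘write ahead’ slice act as slack between the writing and reading tasks.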
The problem is, if ‘write ahead’ is too short, I get an underflow error. If ‘write ahead’ is too long, I get an overwrite error. There isn’t much wiggle room between the two with the measurement settings I need (1 s duration, ~40 kHz sampling rate), and I’m worried that, depending on how much processing the computer is doing at the time, I will get one of these errors.
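The squeeze between the two errors is just arithmetic, which this small sketch makes explicit (illustrative reasoning, not an NI-documented formula; `write_ahead_bounds` and its parameters are names I made up):

```python
import math

def write_ahead_bounds(sample_rate_hz, processing_time_s,
                       buffer_samples, measurement_samples):
    """Rough bounds on the write-ahead size.
    Too small: the AO task underflows while the host is still
    computing the next block.
    Too large: the AI buffer wraps and overwrites unread data."""
    # Underflow bound: write-ahead must keep AO generating at least as
    # long as the host needs to prepare the next write.
    min_samples = math.ceil(processing_time_s * sample_rate_hz)
    # Overwrite bound: one full block (measurement + write-ahead) must
    # still fit in the AI buffer before the read catches up.
    max_samples = buffer_samples - measurement_samples
    return min_samples, max_samples
```

For example, at 40 kHz with 0.25 s of between-read processing and a 100 000-sample buffer holding 40 000-sample measurements, the write-ahead would have to sit between 10 000 and 60 000 samples; a slower host or a smaller buffer narrows that window, which matches the wiggle-room worry above.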
What is the best solution? Increasing the read buffer size (and if so, what is the recommended size)? Using multiple finite data acquisitions? Taking shorter measurements and appending the data of consecutive reads? The reason I didn't start with multiple finite data acquisitions is that I thought starting and stopping the tasks all the time would be inefficient, but perhaps it doesn't add that much overhead?
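For the last option mentioned, appending shorter consecutive reads into one measurement could look roughly like this (a hardware-free sketch; `read_chunk` is a stand-in callable for a driver read, not a real DAQmx API):

```python
def acquire_in_chunks(read_chunk, total_samples, chunk_samples):
    """Sketch of the 'shorter reads appended' option: call a
    user-supplied read function repeatedly and concatenate the
    results into one measurement array."""
    data = []
    remaining = total_samples
    while remaining > 0:
        n = min(chunk_samples, remaining)   # last chunk may be short
        data.extend(read_chunk(n))
        remaining -= n
    return data
```

The appeal of this variant is that each read only has to keep up with a chunk's worth of samples, at the cost of more read calls per measurement.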
Thanks for any help.
12-14-2006 01:00 PM
12-15-2006 09:30 AM
12-18-2006 11:59 AM