I use a modified version of the ContGenVoltageWfm_IntClk sample (Measurement Studio 8 and Visual Studio 2005) on an NI PCI-6731 card. Our goal is to output a signal continuously with the highest possible timing precision. At the example's default output rate of 1 kHz (samples per second), the precision is good. However, if we use a higher output rate, e.g. 22 kHz, the real output frequency appears to be higher: 22.022 kHz, and there is a glitch. Is there an explanation for this?
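For what it's worth, one factor I have been looking at: hardware sample clocks on these boards are typically derived by dividing a master timebase by an integer, so not every requested rate is achievable exactly, and the driver coerces the rate to the nearest divisor. A rough sketch of that arithmetic (assuming a 20 MHz master timebase and a simple integer divider — the actual timebase and divider behavior of the PCI-6731 would need to be confirmed in the device documentation):

```python
# Sketch: which sample rates an integer divider of an assumed 20 MHz
# master timebase can actually produce near a requested 22 kHz.
# Assumption (hypothetical, to be confirmed): the AO sample clock is
# generated as timebase / n for some integer divisor n.

TIMEBASE = 20_000_000  # Hz, assumed master timebase

def nearest_achievable(requested_hz):
    """Return candidate achievable rates, closest to the request first."""
    n = round(TIMEBASE / requested_hz)  # nearest integer divisor
    candidates = [TIMEBASE / n, TIMEBASE / (n - 1), TIMEBASE / (n + 1)]
    return sorted(candidates, key=lambda r: abs(r - requested_hz))

for rate in nearest_achievable(22_000):
    print(f"{rate:.1f} Hz")
```

Under that assumption the nearest achievable rates to 22 kHz would be 20 MHz / 909 ≈ 22002.2 Hz and 20 MHz / 908 ≈ 22026.4 Hz, i.e. a requested 22 kHz cannot be hit exactly — though neither value matches the 22.022 kHz we measure, so this may not be the whole story.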