I've got two NI PCI-6541 boards driven by LabVIEW code that is equivalent to the following pseudo-code.
for i = 0 to 1
{
    Execute these VIs:
        niHSDIO Init Generation Session.vi
        niHSDIO Assign Dynamic Channels.vi
        niHSDIO Configure Sample Clock.vi
        niHSDIO Export Signal.vi
        niHSDIO Configure Generation Mode.vi
        niHSDIO Export Signal.vi
        niHSDIO Configure Generation Repeat.vi
    Set these properties:
        ExportedSampClk.Mode
        SampClk.Impedance
        RefClk.Impedance
        SampClk.Rate
    Execute these VIs:
        niHSDIO Write Named Waveform (1D U32).vi
        niHSDIO Write Script.vi
        niHSDIO Get Session Reference.vi
}
Execute these VIs:
    niTClk Configure For Homogeneous Triggers.vi
    niTClk Synchronize.vi
    niTClk Initiate.vi
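
In case the call order is clearer in text form, here is roughly the same sequence sketched with the NI-HSDIO / NI-TClk C API. This is only an illustration, not my actual code: the device names, channel list, clock rate, export terminal, waveform data, and script text are placeholders, the property-node settings and the second Export Signal call are only noted in comments, and error checking is omitted.

/* Sketch of the sequence above using the NI-HSDIO / NI-TClk C API
   (placeholder names and values; no error checking). */
#include "niHSDIO.h"
#include "niTClk.h"

#define NUM_BOARDS 2

int main(void)
{
    ViSession vi[NUM_BOARDS];
    ViRsrc    resource[NUM_BOARDS] = { "Dev1", "Dev2" };  /* placeholder device names */
    ViUInt32  data[1024] = { 0 };                         /* placeholder waveform data */
    ViInt32   i;

    for (i = 0; i < NUM_BOARDS; i++)
    {
        /* Open a dynamic-generation session and configure clocking and generation. */
        niHSDIO_InitGenerationSession(resource[i], VI_TRUE, VI_TRUE, "", &vi[i]);
        niHSDIO_AssignDynamicChannels(vi[i], "0-19");
        niHSDIO_ConfigureSampleClock(vi[i], NIHSDIO_VAL_ON_BOARD_CLOCK_STR, 50.0e6);
        niHSDIO_ExportSignal(vi[i], NIHSDIO_VAL_SAMPLE_CLOCK, "",
                             NIHSDIO_VAL_DDC_CLK_OUT_STR);  /* placeholder terminal */
        niHSDIO_ConfigureGenerationMode(vi[i], NIHSDIO_VAL_SCRIPTED);
        /* second niHSDIO_ExportSignal call goes here */
        niHSDIO_ConfigureGenerationRepeat(vi[i], NIHSDIO_VAL_FINITE, 1);

        /* ExportedSampClk.Mode, SampClk.Impedance, RefClk.Impedance, and
           SampClk.Rate are set via property nodes (attributes) at this point. */

        /* Download the waveform and the script that generates it. */
        niHSDIO_WriteNamedWaveformU32(vi[i], "wfm0", 1024, data);
        niHSDIO_WriteScript(vi[i],
            "script myScript\n"
            "   generate wfm0\n"
            "end script");
    }

    /* TClk synchronization across both boards; the C API takes the niHSDIO
       sessions directly, so there is no separate Get Session Reference step. */
    niTClk_ConfigureForHomogeneousTriggers(NUM_BOARDS, vi);
    niTClk_Synchronize(NUM_BOARDS, vi, 0.0);
    niTClk_Initiate(NUM_BOARDS, vi);

    /* ... generation runs ... */

    for (i = 0; i < NUM_BOARDS; i++)
        niHSDIO_close(vi[i]);

    return 0;
}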
********************************
For some reason, as soon as the first VI (niHSDIO Init Generation Session.vi) runs, all of my DIO boards' outputs go high, and they don't take on their correct values until the last VI (niTClk Initiate.vi) executes.
How can I keep the outputs from going high during this configuration?