I'm trying to create a LabVIEW program in which my DAQ outputs 6 utility bits and then acquires one analog signal (AI0) while those bits are set. I read the input data from a CSV file: a matrix with 6 columns (one per bit), where each row is a separate input pattern. For each input row, I want to record the resulting analog output.
I'm new to LabVIEW and having difficulty with timing. I can send in my data fine; I just don't quite understand how to set up my VI so that, only after the input bits are set, the output is read and written to file on each iteration.
I have attached what I've mocked up so far. I apologize that it's quite messy; I was trying different methods (DAQ Assistant as well as the lower-level DAQmx VIs).
How do I set up the timing so that the analog read happens only after the bits have been written to the DAQ?
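To make the intended sequence concrete, here is a rough text-based sketch of the per-row logic I'm after. This is not LabVIEW code, just an illustration of the ordering; `write_bits` and `read_ai0` are hypothetical stand-ins for the actual DAQmx digital-write and analog-read calls, and `settle_s` is an assumed settling delay between setting the bits and reading AI0:

```python
import csv
import io
import time

def write_bits(bits):
    """Hypothetical stand-in for the DAQmx digital write of 6 utility bits."""
    pass

def read_ai0():
    """Hypothetical stand-in for the DAQmx analog read on channel AI0."""
    return 0.0

def run_sequence(csv_text, settle_s=0.0):
    """For each CSV row: set the bits, wait to settle, then read AI0."""
    results = []
    for row in csv.reader(io.StringIO(csv_text)):
        bits = [int(v) for v in row]        # the 6 bits for this input row
        write_bits(bits)                    # set the digital outputs first
        time.sleep(settle_s)                # allow the circuit to settle
        results.append((bits, read_ai0()))  # only then acquire the analog value
    return results

rows = "0,0,0,0,0,1\n1,0,1,0,1,0\n"
print(run_sequence(rows))
```

In LabVIEW terms, the equivalent ordering would come from data-flow dependency (e.g. wiring the error cluster from the digital write into the analog read inside the loop), but the sketch above is the sequence I'm trying to achieve per iteration.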