I am having an issue with my X Series DAQ (PXIe-6366) and LabVIEW 2009 SP1. I am writing my analog input measurements to a binary file, which I then process with an existing C++ algorithm, and I suspect the file does not contain the binary values I think it does. When I probe the DAQmx Read output, the waveform is consistent with what I see in MAX, but when my C++ code reads the binary file, the results are flawed. I know the algorithm itself works, since it has been in use for years. So I suspect that either my header file/scaling coefficients are wrong, or that the binary representation of the values in the file is not what my C++ code expects.
If anyone has any advice on what I can do, it would be much appreciated. I don't have access to the VIs until Monday. Also, I am relatively new to LabVIEW, so simple terminology would be very nice.