I'm working on a program in VC or VB to read the binary files generated by a high-speed data logger. I'd like to know how the int16 samples are scaled into real values:
divide by the A/D full-scale count, multiply by the input range, multiply by the EU conversion factor, add the offset
Is that it? Should the input-range settings be taken from the group/channel settings recorded in the file header?
One more question: with multichannel inputs, is the data arranged in rows or in columns, i.e., interleaved sample by sample or stored in blocks per channel?
Thanks