I wrote a program that calculates displacement from acceleration signals. I have a USB DAQ that I used to measure two acceleration signals; I subtract the two signals and use SVL_integration.vi to convert the resulting acceleration signal to displacement. I tested the program with a known 1 g sine-wave signal, and the displacement I got was correct.
The problem I am having is that when I instead read the two acceleration signals from a text file and run the same program, the displacement it gives me is too small by roughly a factor of 100. I am not able to figure out what is going on. I tried converting the time-domain signal to a waveform before doing the calculation, but the result still doesn't seem correct. Any insight into what is going wrong is really appreciated.
Thank you for providing a screenshot of your block diagram. It sounds like you are attempting to determine the displacement between two objects by finding their relative acceleration along a known axis and integrating it over a given period of time.
I think there are a few things that could be going on here. If you're able to provide more information on the results you expect versus what you are currently seeing, a "bigger picture" description of what you are attempting to do, and (if possible) a sample data set and the VI(s) you are using, it would help narrow things down a lot and make it easier for someone to comment on what you're seeing or how to get the results you're looking for.
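One thing worth checking in the meantime: a factor-of-100 error is exactly what you'd see if the waveform built from the text file carries the wrong dt (for example, a default dt of 1 s, or a dt that is off by a factor of 10 from the real sample interval). Double integration scales the result by dt twice, so a 10x error in dt becomes a 100x error in displacement. The sketch below (plain Python/NumPy rather than LabVIEW, with made-up sample-rate numbers, just to illustrate the scaling) double-integrates a 1 g sine by the trapezoidal rule with the correct dt and with a dt that is 10x too small:

```python
import numpy as np

def integrate_twice(acc, dt):
    """Double-integrate acceleration (m/s^2) to displacement (m)
    using the cumulative trapezoidal rule, starting from rest."""
    vel = np.concatenate(([0.0], np.cumsum((acc[:-1] + acc[1:]) / 2.0) * dt))
    disp = np.concatenate(([0.0], np.cumsum((vel[:-1] + vel[1:]) / 2.0) * dt))
    return disp

fs = 1000.0                                # assumed true sample rate, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)
acc = 9.81 * np.sin(2 * np.pi * 5 * t)     # 1 g sine at 5 Hz

d_true = integrate_twice(acc, 1.0 / fs)          # correct dt
d_wrong = integrate_twice(acc, 1.0 / (10 * fs))  # dt off by a factor of 10

# Displacement scales with dt^2, so a 10x dt error gives a 100x amplitude error
ratio = np.max(np.abs(d_true)) / np.max(np.abs(d_wrong))
print(ratio)  # → 100.0
```

So a quick test would be to probe the waveform right before SVL_integration.vi in both the DAQ path and the file-read path and compare the dt attributes: if the file-read path shows a different dt than the DAQ path, that alone would explain the factor of 100.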