I am currently using the Control & Simulation Loop to simulate the behaviour of what is essentially a mass-spring-damper system. As part of the process, the change in time (dt) is used to integrate an arbitrary value. I am using the built-in Memory function to store the previous time, so that I can calculate the time change (dt).
The simulation is rather complex, and because of the accuracy required, not all of the ODE solvers can handle it. Currently I am using the Adams-Moulton method, which works fine for the simulation itself. However, it cannot detect the change in time: dt is constantly zero. The problem went away when I switched to another ODE solver, but then the simulation results were rather messed up (even after I tuned the step sizes and tolerances). So I am quite confident that Adams-Moulton is one of the best-suited ODE solvers for the problem at hand.
Is there another way to store the previous time and use it to calculate the time difference, other than the Memory function? Has anyone experienced such problems before?
I have done a lot of debugging with probes, and I am quite sure the problem lies between the ODE solver and the Memory function. See the picture below, which shows in basic terms how the change in time is calculated.
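For readers without the picture: the wiring amounts to the pattern below, written here in Python as a stand-in for the block diagram (the class and function names are illustrative, not LabVIEW identifiers).

```python
class Memory:
    """Mimics the LabVIEW Memory function: returns the value
    stored on the previous call, then stores the new input."""
    def __init__(self, initial=0.0):
        self._stored = initial

    def __call__(self, value):
        previous = self._stored
        self._stored = value
        return previous

# dt = current simulation time minus the previously stored time.
time_memory = Memory(initial=0.0)

def compute_dt(t_now):
    t_prev = time_memory(t_now)
    return t_now - t_prev
```

Note that if the solver evaluates the loop repeatedly at the same simulation time (for example when a step is rejected and retried), this pattern returns dt = 0, which matches the symptom described above.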
I am rather new to LabVIEW, so if there is something else I have missed, I will be glad to hear it.
PS: I have tuned the minimum step size and the relative and absolute tolerances for Adams-Moulton so that the behaviour of the system is simulated correctly.
Solved! Go to Solution.
I am trying to recreate the problem. What simulation parameters and timing parameters are you using in your control & simulation loop? Can you upload your VI?
I am sorry, I cannot upload the VI; some of the content is confidential. I have attached a larger picture of the section where the change in volume (dV) and the change in time (dt) are calculated, and I have marked the two Memory functions used. Hopefully this helps.
- ODE solver: Adams-Moulton
- Relative tolerance: 1e-8
- Absolute tolerance: 1e-7
- Minimum step size: 0.0005
- Maximum step size: 0.01
- Initial step size: 0.01
- Auto discrete time: On
- Decimation: 0
- Sync loop to timing source: Off
It turns out that the ODE solver struggled because of two Lookup Table 2D functions. These were set to interpolation/extrapolation, which caused a discontinuity problem; the ODE solver could not solve properly, and hence the Memory functions did not work properly either.
By enlarging the lookup array manually, I could use the "nearest" method instead, with results as good as interpolation.
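For anyone hitting the same issue, here is a rough textual sketch (Python, with made-up table values; this is not the LabVIEW implementation) of the difference between the two lookup modes. On a sufficiently dense table, nearest-neighbour lookup tracks the interpolated curve closely while never extrapolating beyond the table:

```python
import bisect

def lookup_nearest(xs, ys, x):
    """Nearest-neighbour lookup: piecewise-constant output.
    xs must be sorted ascending."""
    i = bisect.bisect_left(xs, x)
    if i == 0:
        return ys[0]
    if i == len(xs):
        return ys[-1]
    # Pick whichever grid point is closer to x.
    return ys[i] if (xs[i] - x) < (x - xs[i - 1]) else ys[i - 1]

def lookup_interp(xs, ys, x):
    """Linear interpolation, clamped at the table edges
    (i.e. no extrapolation outside the table range)."""
    i = bisect.bisect_left(xs, x)
    if i == 0:
        return ys[0]
    if i == len(xs):
        return ys[-1]
    w = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + w * (ys[i] - ys[i - 1])

# Example table (illustrative): densifying xs/ys shrinks the steps
# of the nearest-neighbour output toward the interpolated curve.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [0.0, 0.25, 1.0, 2.25, 4.0]
```

The trade-off is that "nearest" produces a staircase function, so the table must be dense enough that the step height stays below the solver's tolerance.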