
Slow simulation time.

Solved!

I was wondering if there is a way to improve the simulation performance. I have made some modifications, but the model execution is still slow.

I have a similar model in Simulink that runs almost instantly, but in LabVIEW it takes quite some time.

 

 

PS: LabVIEW 2015.

Message 1 of 4

What toolkit are you using?  I recognize none of the functions in your Block Diagrams ...  [Oh, isn't there a Simulation toolkit, which I've never used?]

 

I've simulated simple systems (usually hardware, such as a robotic arm, for which I'm trying to develop code) and found it fairly straightforward.  I decide on a delta-t, write down how my simulated variables will "advance" with time, then decide if I want the code to run "as fast as possible" (in which case I put it into a For or While Loop) or in "real time" (in which case I put a "Wait Until Next ms Multiple" function inside the loop).  Your model may be more complex, but it would surprise me if it were so much slower than Simulink ...
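
A rough text-based sketch of that pattern (LabVIEW code is graphical, so this Python analogue only illustrates the loop structure; the falling-object model and all numbers here are made up):

    import time

    # Made-up example: a falling object advanced with a fixed time step.
    dt = 0.01              # chosen delta-t, in seconds
    t, position, velocity = 0.0, 100.0, 0.0
    real_time = False      # True plays the role of "Wait Until Next ms Multiple"

    while t < 5.0:
        # write down how the simulated variables "advance" with time
        velocity += -9.81 * dt
        position += velocity * dt
        t += dt
        if real_time:
            time.sleep(dt)  # pace the loop to wall-clock time

    # with real_time = False the loop runs "as fast as possible"
    print(f"final position: {position:.2f} m at t = {t:.2f} s")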

 

Bob Schor

Message 2 of 4
Solution
Accepted by topic author uChiron

I looked into your simulation and tried different changes to speed up the process. I tried different solvers and values, but apparently not much can be sped up, because:

 

a) your signal has 'bursts' (oscillations around a value) that are very high frequency;

b) you are simulating for a large final time (86400 s);

c) you are collecting the data at every continuous timestep.

 

Here are some suggestions to try to make it faster (other than using a faster machine...):

a) Try increasing the Maximum Step Size of the simulation to 10000. This will allow the solver to take bigger steps when it can (there are some linear parts in the system).

b) Try reducing the final time. Right now, you have ~15 oscillations that seem to be identical. If you only allow about 4 (a final time around 25000 s), the run will be faster.

c) Try making the 'collector' discrete and set a rate that captures the information you need. The fewer points to collect, the faster it will run.

d) Try a stiff solver like Rosenbrock or BDF. Those seemed to be the best options here. (A rough text-based analogue of these settings is sketched below.)
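
Since I can't paste the model itself here, a hypothetical SciPy sketch shows how settings (a), (c), and (d) map onto a text-based solver; the stiff Van der Pol oscillator merely stands in for the real system:

    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical stand-in for the satellite model: a stiff Van der Pol
    # oscillator. Only the solver settings mirror the suggestions above.
    def vdp(t, y, mu=1000.0):
        return [y[1], mu * (1.0 - y[0] ** 2) * y[1] - y[0]]

    t_final = 86400.0                          # (b) reduce to ~25000 to go faster
    sol = solve_ivp(
        vdp, (0.0, t_final), [2.0, 0.0],
        method="BDF",                          # (d) stiff solver
        max_step=10000.0,                      # (a) larger maximum step size
        t_eval=np.arange(0.0, t_final, 5.0),   # (c) collect output only every 5 s
    )
    print(sol.t.size, "samples collected, success =", sol.success)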

 

With those changes, I was able to simulate 25000 s in 4.1 s, and the whole 86400 s in 13.4 s. I am using a Dell machine (Intel Xeon CPU E5-1620 @ 3.70 GHz, with 32 GB of memory).

 

Barp - Control, Simulation, RTT and HIL - National Instruments
Message 3 of 4

The attached model is just a "trimmed down" version (the original model is much more complex).

 

a) your signal has 'bursts' (oscillations around a value) that are very high frequency;

b) you are simulating for a large final time (86400 s);

c) you are collecting the data at every continuous timestep.

 

A) The bursts model the battery EOC. That frequency is desired.

B) Yep. The model is simulating ~15 satellite orbits (nearly one day).

C) My boss told me that I could collect data every 5 seconds, so that's acceptable.

 

a) Try increasing the Maximum Step Size of the simulation to 10000. This will allow the solver to take bigger steps when it can (there are some linear parts in the system).

b) Try reducing the final time. Right now, you have ~15 oscillations that seem to be identical. If you only allow about 4 (a final time around 25000 s), the run will be faster.

c) Try making the 'collector' discrete and set a rate that captures the information you need. The fewer points to collect, the faster it will run.

d) Try a stiff solver like Rosenbrock or BDF. Those seemed to be the best options here.

 

A & B) Some signals will change over time due the satellite operations, so i can't do this.

C) Yes. i will do it.

D) Already using.

 

 

Thanks.

Message 4 of 4