I am using the NI GMath library and I am getting "error -2341: Simulation Parameters: Simulation time" in ODE VI.vi. The VI works fine under Mac OS X on a MacBook Pro, but gives the above error on a MacBook Air. Both systems run the same OS, OS X El Capitan 10.11.3.
I also get the same error when I run the VI under Windows 7, installed as a dual-boot OS on the MacBook Pro.
Could a hardware issue be the cause? The same code works on the MacBook Pro under Mac OS (though not under Windows 7) but not on the MacBook Air.
Any help is appreciated.
There are not a lot of Mac users on this forum, so it may take longer to get a reply to such a specific question.
One thing that would help is if you posted your VI with all of the input values saved as defaults. That makes it much easier for us to try to reproduce your problem.
What version of LV are you using? As you are probably aware, El Capitan is not officially supported by LabVIEW. I have one computer with El Capitan and two with Yosemite, so I can check your code if you post it.
Thank you for your reply.
The simplified code is attached.
Please note that I have double-checked that the attached code works perfectly fine under Mac OS X El Capitan 10.11.3 on the MacBook Pro. However, the same code produces the error "-2341 - Simulation Parameters: Simulation Times" on the following two systems:
1. Windows 7 running on the MacBook Pro as a dual-boot operating system
2. Mac OS X El Capitan 10.11.3 on the MacBook Air
I am using LabVIEW 2014.
Hope to hear from you soon.
I have it running on my Yosemite computer. I will try the El Capitan machine a little later.
Some things I noticed:
1. Your Simulation Parameters cluster uses a different cluster order than the typedef cluster on the ODE Solver. While this may not be an issue, it would probably be better to use the typedef.
2. When I set the Simulation Parameters values to the default values for ODE Solver, I get a different result. The defaults produce three rows on the outputs but your values only have two. The second and third row values are very similar to each other.
3. Unless you are inserting data into the middle of an array, it is almost always better to use Build Array than Insert Into Array. In this case I think it also makes it very clear what is happening when you use Build Array. (To make the output array 1D you need to right-click on Build Array and select Concatenate Inputs.)
4. Why do you need the for loop? Nothing changes from one iteration to the next. Even if you plan to change something in a more complicated VI, the Build Array(s) and ODE Substitute Parameters.vi should be outside the loop because none of their inputs change during operation of the loop.
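Since LabVIEW is graphical I can't paste block-diagram code here, but points 3 and 4 can be roughly sketched in a text language (Python, with made-up names and values) — concatenating parts into one 1D array, and moving loop-invariant work outside the loop:

```python
# Rough analogue of Build Array with "Concatenate Inputs" selected:
# flatten 1D arrays and scalars into a single 1D array.
def build_array(*parts):
    out = []
    for p in parts:
        out.extend(p if isinstance(p, list) else [p])
    return out

# Loop-invariant work: nothing wired into these calls changes per
# iteration, so do it once, before the loop (like moving Build Array
# and ODE Substitute Parameters.vi outside the For Loop).
params = build_array([1.0, 2.0], 3.0)  # made-up parameter values

results = []
for i in range(5):
    # only genuinely iteration-dependent work belongs inside the loop
    results.append(params)
```

This is only an analogy for the dataflow, not actual G code.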
Thanks for your reply.
1. Yes, it is not the source of the problem.
2. It is because I have set Time Step = Initial Time Step = 0.1. The default value for Time Step is 0.1 and for Initial Time Step it is 0.01, so with the default settings the results in each iteration are calculated for t = 0, 0.01, and 0.1.
3. You are right. I will do that.
4. I have reduced the complexity of this code to keep it very simple. In the original version, the values are looped back through shift registers. The "Mod Value" is actually looped back in each iteration, therefore the Build Array (or Insert Into Array) function has to be placed inside the loop.
But none of these is the source of the problem I am having. I am just wondering why this works on one machine and not on the other. In fact, it works fine under Mac OS on the MacBook Pro but not under Windows on the same machine.
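To spell out point 2: assuming the solver records t = 0, then t = Initial Time Step, then multiples of Time Step up to the final time (my reading of the behavior, not verified against the GMath internals), a rough Python sketch reproduces the two different row counts:

```python
# Hypothetical reconstruction of the output time points per iteration.
def output_times(start, end, step, initial_step):
    pts = [start, start + initial_step]
    t = start + step
    while t <= end + 1e-12:            # small tolerance for float drift
        if abs(t - pts[-1]) > 1e-12:   # skip duplicate when initial_step == step
            pts.append(t)
        t += step
    return pts

# Defaults (Time Step 0.1, Initial Time Step 0.01): three rows
print(output_times(0.0, 0.1, 0.1, 0.01))   # [0.0, 0.01, 0.1]

# My settings (Time Step = Initial Time Step = 0.1): two rows
print(output_times(0.0, 0.1, 0.1, 0.1))    # [0.0, 0.1]
```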
I would be grateful if you could try this code on the Windows platform as well.
Thanks for your time. I really appreciate that.
I think that the cluster order may be part of the problem. The complete error message says: "The initial time of the simulation cannot be greater than or equal to the final time." The values in the Simulation Parameters cluster on the front panel do not violate this condition.
However, I created an indicator by popping up on the simulation parameters input terminal of the connector pane of ODE Solver.vi. When I wire that to show what the Solver VI sees, I get Final Time << Initial Time.
Try replacing your Simulation Parameters control with one created from the ODE Solver input.
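To make the failure mode concrete: clusters in LabVIEW are matched by element order, not by label, so a front-panel control whose elements are ordered differently from the solver's typedef will deliver values into the wrong fields. A hypothetical sketch (field names and values made up):

```python
# Suppose the user's front-panel cluster orders its elements
# (Final Time, Initial Time, Time Step), while the solver's typedef
# expects (Initial Time, Final Time, Time Step). Wiring matches by
# POSITION, so the values land in the wrong fields.
control_values = [10.0, 0.0, 0.1]   # entered as Final=10, Initial=0, Step=0.1

typedef_order = ["initial_time", "final_time", "time_step"]
seen = dict(zip(typedef_order, control_values))

# The solver now believes initial_time = 10.0 and final_time = 0.0,
# so its sanity check fires even though the entered values were valid:
error_2341 = seen["initial_time"] >= seen["final_time"]
print(error_2341)   # True -> "-2341: The initial time of the simulation
                    #          cannot be greater than or equal to the final time."
```

Replacing the control with one created directly from the solver's typedef guarantees the element order matches.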