03-11-2011 02:11 PM
(First of all, sorry for my English, but I hope you can figure out what my problem is.)
We have a simple system consisting of LabVIEW and MATLAB/Simulink running on a single PC. We measure data via LabVIEW, and in Simulink we have designed a simple regulator. LabVIEW communicates with Simulink via SIT 5.0.
Our problem is that LabVIEW (or SIT) is sending about 50k samples per 10 seconds to Simulink, but we need it to be about 1k samples per 10 seconds. No matter what value we set as the "fixed step size" in the Simulink simulation parameters, the sample rate seems to stay the same: with the fixed step size anywhere from 0.1 to 0.001, still about 50k samples per 10 seconds are sent to Simulink.
So my question is: what is the sample rate / sampling time of the communication between LabVIEW and Simulink via SIT, and where can this sampling time/rate be set? For our needs, a sampling rate of about 100 samples per second would be best. We are running a simple "simulation environment". Decreasing or increasing the sampling time at which the data are measured is not an option, because we have different types of sensors.
In summary, we need to manually set the sampling time at which SIT runs, or somehow drastically decrease the number of samples sent to Simulink via SIT.
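If SIT's own rate cannot be changed, one workaround on the LabVIEW side would be to decimate the stream before it reaches the model, keeping only every Nth sample. A minimal sketch of the idea in Python (the function name and rates are illustrative, not part of SIT or LabVIEW):

```python
def decimate(samples, measured_rate_hz, target_rate_hz):
    """Keep every Nth sample so roughly target_rate_hz samples/s remain."""
    factor = max(1, round(measured_rate_hz / target_rate_hz))
    return samples[::factor]

# 50k samples per 10 s (5 kHz) reduced to ~1k per 10 s (100 Hz)
raw = list(range(50000))
reduced = decimate(raw, measured_rate_hz=5000, target_rate_hz=100)
print(len(reduced))  # 1000
```

Note that plain decimation can alias high-frequency content; if the sensors produce fast transients, averaging each block of N samples instead may be safer.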
Thanks for responses.
Best regards, Mark.
04-06-2011 05:51 PM
Sampling time can be controlled simply through loop speeds. Could you explain a little further how you would like to control the sampling speed?
04-07-2011 07:40 AM
Thanks for your reply. For now, our solution to the problem described above is to build a DLL from the Simulink model and set the fixed step size in the Simulink configuration parameters to the same value as the sampling period at which we measure data. For example, if we measure pressure with a 0.1 s sampling period, setting the fixed step size to 0.1 helps: the number of samples sent from LabVIEW then matches the number of samples that arrive in Simulink.
But we have noticed a difference in results when evaluating polynomials in the simulation environment (the model running in Simulink and controlled by LabVIEW via SIT) versus evaluating them via the previously built DLL. The simulation environment is more accurate. For now we are going to try different solver methods.
The main problem is that we are using a PID regulator designed in Simulink, and when it runs in the simulation environment, the integral part of the PID produces incorrect values that are higher than they should be.
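For what it's worth, this symptom is consistent with a sample-time mismatch: if the model is stepped more often than its configured fixed step assumes, a discrete integrator accumulates extra terms and comes out too large. A rough illustration in Python (a forward-Euler integral term, not the actual Simulink PID block; the gains and rates are made up):

```python
def integral_term(errors, ki, dt):
    """Discrete (forward-Euler) integral part of a PID: accumulate ki*e*dt."""
    acc = 0.0
    for e in errors:
        acc += ki * e * dt
    return acc

# A constant error of 1.0 over 10 s of real time:
# stepped at the intended 10 Hz (100 steps, dt = 0.1) ...
intended = integral_term([1.0] * 100, ki=2.0, dt=0.1)
# ... vs. the model being stepped 50x too often with the same assumed dt
oversampled = integral_term([1.0] * 5000, ki=2.0, dt=0.1)
print(intended, oversampled)  # ~20 vs ~1000, i.e. the integral is 50x too large
```

This would explain why matching the fixed step size to the actual data period fixed the DLL case.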
It seems that setting the correct loop speed will be the best solution for us. Can you describe it in a little more detail? Any other information about it would be very helpful.
04-08-2011 06:03 PM
Would you mind describing how you are validating that you receive 50k worth of data? Also, would you mind posting your code so that I can take a look?
07-18-2012 06:46 PM - edited 07-18-2012 06:46 PM
I think I'm going to need some more information in order to understand the issue you're seeing.
What is your setup like? Are you running in the simulation environment and interfacing with your .mdl file? Or is there hardware involved, and has your model been built into a .dll file?
What do you mean by "the sample time remains the same"? Is this the sampling time of the data you are bringing into LabVIEW, or how often you send data to the Simulink model, or how often you read data from the Simulink model?
What loop time are you changing? Changing how often we send data to and read data from the model will change how often the model executes.
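In other words, the loop period sets the model's effective execution rate (rate in Hz = 1000 / period in ms). A rough Python sketch of the idea behind a LabVIEW while loop paced by a Wait (ms) primitive (the structure is illustrative only, not LabVIEW code):

```python
import time

def run_timed_loop(period_s, duration_s, step):
    """Call step() once per period; the loop period sets the model's
    effective execution rate, like a Wait (ms) inside a LabVIEW loop."""
    n = 0
    deadline = time.monotonic()
    end = deadline + duration_s
    while deadline < end:
        step()  # stand-in for writing inputs / reading outputs of the model
        n += 1
        deadline += period_s
        time.sleep(max(0.0, deadline - time.monotonic()))
    return n

# A 10 ms loop period over 0.5 s steps the model ~50 times (100 Hz)
count = run_timed_loop(0.010, 0.5, step=lambda: None)
print(count)
```

Slowing the loop (a larger wait) directly reduces how many samples per second reach the model.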
07-19-2012 05:08 AM
For now I am just interfacing my .mdl file from Simulink to LabVIEW via SIT. My Simulink model's sample time is 0.00001 s, but the converted model in LabVIEW only shows a graph window sampling 500 samples per second. I have tried changing everything I could in the block diagram with no effect; even if I change the sample time in Simulink, LabVIEW gives me the same output.
I also have further issues with my DLL file for a Real-Time application. Kindly see the link below.
07-20-2012 06:51 PM
Hmmm... maybe post a screenshot of the loop (set to a higher speed) and the graph window only sampling 500 samples/second. I'm really not sure at this point why increasing the loop speed isn't affecting your program. How are you controlling the loop speed in LabVIEW? It sounds like something is keeping it from increasing.
I noticed from your other post that you switched from VeriStand to SIT. Speed parameters are much easier to set in VeriStand, but I'll see if I can experiment with SIT some more and maybe reproduce the problem you are seeing.