Error when running Real Time simulation

I have a third-party computer with an NI PCI-6229 data acquisition card in it.  I am trying to use this computer as a real-time target.  I am creating simulations in Simulink and converting them with the Simulation Interface Toolkit (SIT).  I configure them using the SIT Connection Manager.  I can successfully deploy simulations to the target when no hardware is mapped.  When I try to map analog inputs and outputs to the card in the machine, I get the following error:

 

Error -209802 occurred at Driver VI >> DAQmx Wait For Next Sample Clock.vi:1

Possible reason(s):

Measurements: DAQmx Wait for Next Sample Clock detected one or more missed sample clocks since the last call to Wait
for Next Sample Clock which indicates that your program is not keeping up with the sample clock.


To remove this error, slow down the sample clock, or else change your application so that it can keep up with the
sample clock. Alternatively, consider setting the Convert Errors to Warnings property to true and then handling the
warning case appropriately.

Task Name: SIT_PCI6229_ao0-1

 

 

Has anyone seen this error, or does anyone know how to make the sample clock and the simulation clock run at the same rate?
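For what it's worth, the error text can be reasoned about arithmetically: the sample clock ticks once per clock period, so any loop iteration that runs longer than one period "misses" ticks. A minimal Python sketch of that bookkeeping (illustrative only, not NI's implementation; the function name is made up):

```python
import math

def missed_clocks(period_s, iteration_times_s):
    """Count sample-clock ticks skipped by a loop whose iterations take
    the given times. An iteration that fits inside one period misses
    nothing; one that spans k periods misses k - 1 ticks."""
    missed = 0
    for t in iteration_times_s:
        missed += max(0, math.ceil(t / period_s) - 1)
    return missed
```

For example, an iteration that spans two and a half clock periods misses two ticks; the accumulated count of such misses is what error -209802 is reporting.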

Message 1 of 7

Hi Eric,

 

How exactly are you mapping your analog inputs and outputs to the card? What are your results when you set the Convert Errors to Warnings property to true?

Will
CLA, CLED, CTD, CPI
LabVIEW Champion
Choose Movement Consulting
choose-mc.com
Message 2 of 7

I am mapping them using the SIT Connection Manager, which generates the code automatically.  I am not sure how to convert the errors to warnings.  I know that DAQmx Wait For Next Sample Clock.vi is in the driver VI that the SIT Connection Manager generates when it builds everything, but I am not sure how to modify this auto-generated code, or whether it is wise to do so.

Also, it seems that when I increase the step size in the Simulink model and rebuild, the simulation runs on the real-time target with the correct inputs and outputs.  I have to increase the step size to about 0.002 to get it to work.  I'm pretty sure the card can handle a faster sampling rate than that.  Is there some clock speed or sampling rate setting in the driver VI, or somewhere else, that I am missing?
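As a back-of-the-envelope check (a hypothetical sketch, not SIT's API; both function names are made up), the step size fixes the sample clock rate, and the model's worst-case step execution time has to fit inside one clock period with some headroom for DAQ I/O:

```python
def sample_rate_hz(step_size_s):
    """With SIT, the hardware sample clock follows the model's fixed
    step size: one sample per model step."""
    return 1.0 / step_size_s

def keeps_up(step_size_s, worst_case_exec_s, margin=0.8):
    """True if the slowest model step fits inside one clock period,
    leaving (1 - margin) of the period free for DAQ I/O and jitter.
    The 0.8 margin is an arbitrary illustrative choice."""
    return worst_case_exec_s <= margin * step_size_s
```

Under this view, a 0.002 s step size corresponds to a 500 Hz sample clock, so the card (which can sample far faster) is not the limit; the per-step model execution time is.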

 

Thanks,

 

-Eric

Message 3 of 7
It seems the problem is not with the card itself, but that the communication with the card cannot complete in time. Reducing the complexity of your model could help, by freeing up time for communication with the card.
Will
CLA, CLED, CTD, CPI
LabVIEW Champion
Choose Movement Consulting
choose-mc.com
Message 4 of 7

Hi,

 

Were you able to solve the problem? I know this discussion is a little old, but I have the same problem and I do not know how to get above 1 kHz. My hardware should be capable of it, since I use a new desktop computer and an NI X Series I/O card, and the MATLAB DLL is a simple model.

 

Greets,

 

Jan

Message 5 of 7

The problem ended up being that my hardware was not fast enough to handle the complexity of the simulation at the speed I wanted.  I solved it by increasing the step size in the model to a point the hardware could handle.  It wasn't an ideal solution, but my only other option was to buy new hardware.

 

-Eric

Message 6 of 7

Hi,

 

I managed to solve the problem of not being able to increase the sample rate of my desktop RT system. I swapped the Asus motherboard and i5 CPU for an older Asus motherboard with a Core 2 Duo CPU. I can't put my finger on exactly why, but the problem is solved. A simple *.dll for testing the sample rate ran at 1 kHz on the i5 system and at 30 kHz on the Core 2 Duo system. I think the bottleneck was somewhere on the newer motherboard, which was compatible with the NI hardware and software but simply could not run faster than 1 kHz.
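The same kind of loop-rate comparison can be roughed out in ordinary Python (a crude analogue of the *.dll test, with nothing RT-specific about it; the function name is made up):

```python
import time

def achievable_loop_rate_hz(work, trial_s=0.2):
    """Run `work` in a tight loop for roughly `trial_s` seconds and
    report iterations per second. Rates are only comparable between
    machines when the same `work` callable is used on both."""
    deadline = time.perf_counter() + trial_s
    iterations = 0
    while time.perf_counter() < deadline:
        work()
        iterations += 1
    return iterations / trial_s
```

Running this with the same `work` on two machines mirrors the 1 kHz vs. 30 kHz comparison above: same workload, very different achievable loop rates.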

 

I hope this hint helps someone with similar problems. The hardware definitely should have been able to manage the calculations: the computing power of the CPU was sufficient, but the state-of-the-art system seemed just not as "compatible" as the older configuration.

 

Greets,

 

Jan

Message 7 of 7