Non-deterministic / strange behavior in a Real-Time system

Hi All,

 

I am developing a Real-Time system that collects data from FPGAs and transfers it to another PC.

It is a large system with a total of 30 channels using 4 FPGAs.

I have 4 VIs for the FPGAs, 4 VIs for the RT targets to read data from the FPGAs, and a main program that takes the data from the RT VIs (stored in global variables) and sends it to the other system.

 

4 FPGAs: work fine and stable, nothing to complain about. They work every single time.

4 RTs: each RT has 10 channels running at the same time, as in the picture below. Each channel involves sequence structures, a while loop, a case structure, and a timed structure. Important note: if the condition is True, the loop runs at 10 Hz with very low CPU usage. But if the condition is not met, the False case has to run at 1000 Hz, and CPU usage becomes almost 100% on all of my CPU cores.
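Since the actual code is graphical LabVIEW, here is a rough text-language analogy (a Python sketch with hypothetical names) of why the two branches behave so differently: a 10 Hz timed loop mostly sleeps, while a 1000 Hz fallback loop leaves the CPU almost no idle time per period.

```python
import time

def run_channel(condition_met, duration_s=0.2):
    """Hypothetical analogy of one channel's loop: 10 Hz if the
    condition passes, 1000 Hz polling in the False case."""
    period = 1 / 10 if condition_met else 1 / 1000
    iterations = 0
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        # ... check FPGA data here ...
        time.sleep(period)  # at 1 ms periods, loop overhead dominates
        iterations += 1
    return iterations
```

In the same wall-clock time, the False branch iterates roughly 100x more often, which is where the CPU load comes from.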

 

THE WEIRD THING IS: if I change any small thing that has nothing to do with the program's logic, such as renaming an indicator, deleting an indicator, or even moving an indicator around on the front panel, the result is that all channels work, or none of them works, or two-thirds of them work. A tiny tweak results in a huge change in the program's behavior, and I have no idea why.

 

QUESTIONS: Have you seen this before?

What kind of problem is it?

Any suggestions on how to fix or improve it?

 

 

Thank you very much in advance.

 

Mindy
Message 1 of 16

1. How are you getting the data from the FPGAs?

2. What exactly do you mean by "doesn't work"?

 

Just as an observation, you should be reading the data from the FPGAs using DMA FIFOs and then sending the data to the main RT VI using queues. These are streaming types of data transfer, which it sounds like you want; otherwise there is no point in having a set sample rate. Sending to the other systems should then probably use a Network Stream.
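The DMA FIFO → queue → Network Stream chain described above is a producer/consumer pattern. Since the real code is LabVIEW, here is a minimal Python analogy (all names hypothetical) of why a queue beats a global variable here: every sample is buffered and delivered in order instead of being overwritten.

```python
import queue
import threading

def producer(q, n_samples):
    # Stands in for the RT loop reading a DMA FIFO from the FPGA.
    for i in range(n_samples):
        q.put(i)       # buffered, never overwritten like a global variable
    q.put(None)        # sentinel: tell the consumer to stop

def consumer(q, out):
    # Stands in for the main VI forwarding data (e.g. over a Network Stream).
    while True:
        sample = q.get()
        if sample is None:
            break
        out.append(sample)

q = queue.Queue()
received = []
t = threading.Thread(target=producer, args=(q, 5))
t.start()
consumer(q, received)
t.join()
# received now holds every sample, in order
```

With a global variable, a slow consumer silently loses samples; with the queue, the consumer sees all of them, which is what "streaming" means here.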


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 2 of 16

Hi crossrulz,

 

1. I'm getting data using DMA FIFOs.

2. On the front panel of the RT VI, I have an indicator that shows what state the case structure is in and whether the data is correct (code words match).

    "It doesn't work" means some of the channels are stuck in a wrong state and never move on from the default state, even though there is no condition in the default state.

 

When I change even a very small thing that is totally unrelated to the system (say, an indicator's name), some channels get stuck in the wrong state; if I change the front-panel indicator name back, it works again.

 

The problem is that I cannot get all of the channels to work at the same time, and the CPU usage is very high.

Could it be that because the CPU usage is so high, some functions of the RT are not working correctly, causing some channels to get stuck in the default case?

 

Thanks,

 

Mindy

 

Message 3 of 16

We are going to need to see some real code in order to help you diagnose this.


Message 4 of 16

We have no idea which RT system you're using. Most are single-core; some are dual-core. You could have set up your own PC, and then we know even less. If you run a single VI, what do you see for CPU usage? Would we expect two threads running on the same core to cause issues at that percentage?

 

 

Message 5 of 16

Hi Natasftw,

Thanks for your response.

Here is some information about the system I'm working on:

 

Using an NI PXIe-8135 controller; the FPGAs are FlexRIO, PXIe-7962R and 7961R, with an NI 6584 adapter module (we call it the gold-finger adapter).

The controller has 4 CPU cores.

When I run only 1 FPGA and its RT VI at a time, if it works, the CPU usage is 20% in total.

If it does not work, it gets stuck in a state-machine case that runs in a faster loop, so the CPU usage for 1 FPGA alone is ~240% out of the 400% total for the 4 CPU cores.

 

New update: it seems the problem occurs because I didn't close the FPGA properly. If I power off and reconnect the chassis every time before running the program, more channels work, and at some point all channels work. If no channels work, I just restart the chassis again.

 

Do you suggest a good way to shut the program down after use, to make sure that multiple things are not left running at the same time?

I just use the Close FPGA VI Reference function.
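The underlying idea, guaranteeing that a hardware reference is released even when a run stops early or errors out, has a direct analogy in text languages. A minimal Python sketch (the class and names are hypothetical, standing in for an open FPGA reference):

```python
class FpgaSession:
    """Hypothetical stand-in for an open FPGA reference."""
    def __init__(self):
        self.open = True

    def close(self):
        self.open = False

def run_acquisition(session, fail=False):
    try:
        if fail:
            raise RuntimeError("acquisition aborted")
        # ... read channels here ...
    finally:
        # Runs on normal stop AND on error/abort, so the hardware
        # reference is always released, unlike killing the program.
        session.close()

s = FpgaSession()
try:
    run_acquisition(s, fail=True)
except RuntimeError:
    pass
# s.open is now False even though the run failed
```

In LabVIEW terms, the equivalent is making sure the close path executes on every exit of the program, not only on the happy path.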

 

Thanks,

 

Mindy

 

Message 6 of 16

Hi crossrulz,

I apologize, but I'm not allowed to share the real code of this project.

I can only explain it the best I can; I hope you understand.

 

Mindy

Message 7 of 16

From what I have gathered, you are reading these channels on the FPGA, putting the data into a DMA FIFO, and then reading the DMA FIFO in LabVIEW RT. Is this correct? Can you clarify where exactly the breakdown is happening? Are the channels no longer putting data into the DMA FIFOs? Or is the RT VI never getting to the part of your code where it reads from the DMA FIFO? Can you also clarify what state you expect the channels to be in, and what you mean by the channels being stuck in the default state?

 

Are you changing things on your front panel while the RT program is running? Do you always see a CPU spike every time the channels don't work? Is the Real-Time VI doing anything other than taking data out of the DMA FIFO and sending it to a PC? What are the other resources on the RT machines doing when the channels "don't work"? Does anything seem abnormal besides the CPU spikes?

 

 

A Johnson
Applications Engineer
National Instruments
Message 8 of 16

Mindy,

 

I don't have the same hardware that you use (I use a PXI chassis, somewhat older technology), but when my host (PC) code decides to stop the program, it tells the RT code to run a system call that reboots the PXI, effectively restarting it. This is the sub-VI that the RT system executes:

Restart Remote Self.png

You can find Restart.vi on the Real-Time palette under RT Utilities » System Configuration » Software (it is pretty well hidden...).

 

Bob Schor

Message 9 of 16

Hi All,

Thanks for your replies.

I have some quick updates. It seems the RT resources are not being closed properly. Maybe when I hit the Abort button on the VI, it does not close out the open RT resources, leading to non-deterministic RT behavior.

Therefore, if I reboot the chassis every time before I run the system, it works fine => it has nothing to do with the algorithm's code.

 

I think the problem lies in the fact that I have 4 VIs, A, B, C, and D, that are called inside the Main VI. The way I do it is that I just drag A, B, C, and D inside Main, and they automatically run when execution gets there. When I press the Stop button in either A, B, C, D, or Main, all 5 VIs stop running.

 

Is it correct to use A, B, C, and D inside Main (A, B, C, and D are NOT subVIs), as in the attached image?

 

Do you recommend any way to close the VIs AND subVIs effectively?

Or maybe I should open another thread.
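One common alternative to aborting, a single shared stop signal that every parallel loop checks so that each one can run its own cleanup before exiting, can be sketched in Python (all names hypothetical, as an analogy for the parallel VIs A-D):

```python
import threading

stop = threading.Event()   # plays the role of a shared stop notifier
results = []

def worker(name):
    # Stands in for one of the parallel VIs A, B, C, D.
    while not stop.is_set():
        stop.wait(0.01)    # ... per-channel work would go here ...
    # Cleanup (closing references, etc.) runs before the loop exits.
    results.append(f"{name} shut down cleanly")

threads = [threading.Thread(target=worker, args=(n,)) for n in "ABCD"]
for t in threads:
    t.start()
stop.set()                 # one signal stops all loops cooperatively
for t in threads:
    t.join()
# results has one clean-shutdown entry per worker
```

The point is that every loop exits through its own cleanup code, instead of being killed mid-iteration the way an abort does.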

 

Thank you for your help,

 

Mindy

Message 10 of 16