
LabVIEW Embedded - Performance Testing - Different Platforms

Hi all,

 

I've done some performance testing of LabVIEW on various microcontroller development boards (LabVIEW Embedded for ARM) as well as on a cRIO 9122 Real-time Controller (LabVIEW Real-time) and a Dell Optiplex 790 (LabVIEW desktop). You may find the results interesting. The full report is attached and the final page of the report is reproduced below.
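In case you don't dig into the attached report straight away: the gist of an "effective MIPS" figure is to time a loop whose per-iteration instruction cost is known and compare the achieved rate with the processor's rated MIPS. The report's benchmarks are LabVIEW VIs; the following is only a hypothetical C sketch of that style of measurement (get_tick_us(), the iteration count and the assumed instructions per iteration are all placeholders, not the actual benchmark):

#include <stdint.h>
#include <stdio.h>

/* Hypothetical sketch of an "effective MIPS" style measurement; the real  */
/* benchmark VIs are in the attached report. get_tick_us() is a placeholder */
/* for a microsecond timer on the target.                                   */
extern uint32_t get_tick_us(void);

#define INSTR_PER_ITER 10u        /* assumed instruction cost of the loop body */
#define ITERATIONS     1000000u

int main(void)
{
    volatile uint32_t x = 0;      /* volatile so the loop is not optimised away */
    uint32_t start = get_tick_us();

    for (uint32_t i = 0; i < ITERATIONS; i++)
        x += i;                   /* roughly INSTR_PER_ITER instructions per pass */

    uint32_t elapsed_us = get_tick_us() - start;

    /* effective MIPS = instructions executed / elapsed microseconds */
    double effective_mips = (double)(ITERATIONS * INSTR_PER_ITER) / (double)elapsed_us;
    printf("Effective MIPS: %.1f\n", effective_mips);
    return 0;
}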

 

Test Summary

             µC MIPS   Single Loop      Single Loop   Dual Loop        Dual Loop
                       Effective MIPS   Efficiency    Effective MIPS   Efficiency
MCB2300           65             31.8          49%              4.1           6%
LM3S8962          60             50.0          83%              9.5          16%
LPC1788          120             80.9          56%             12.0           8%
cRIO 9122        760            152.4          20%            223.0          29%
Optiplex 790    6114           5533.7          91%           5655.0          92%

 

Analysis

 

For microcontrollers, single-loop programming can retain almost 100% of the processing power. Such programming requires that all I/O is non-blocking and that interrupts are used. Multiple-loop programming is not recommended, except for simple applications running at loop rates below 200 Hz, since the vast majority of the processing power is consumed by LabVIEW/OS overhead.
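To make "single loop with non-blocking I/O and interrupts" concrete for anyone coming from the desktop, here is a minimal bare-metal sketch of the structure in C (the targets are of course programmed in LabVIEW; the flag and variable names below are invented for the illustration and are not part of any NI or vendor API):

#include <stdbool.h>
#include <stdint.h>

/* Placeholder flags/values that interrupt handlers would set. */
volatile bool     uart_rx_ready = false;
volatile uint8_t  uart_rx_byte;
volatile bool     adc_sample_ready = false;
volatile uint16_t adc_sample;

int main(void)
{
    for (;;)                          /* single "super loop": no OS scheduler,    */
    {                                 /* so no per-iteration context switches     */
        if (uart_rx_ready)            /* non-blocking check; flag set by the UART */
        {                             /* receive interrupt handler                */
            uart_rx_ready = false;
            /* ... handle uart_rx_byte ... */
        }

        if (adc_sample_ready)         /* likewise set by an ADC end-of-conversion */
        {                             /* interrupt                                */
            adc_sample_ready = false;
            /* ... process adc_sample ... */
        }

        /* ... other non-blocking work; nothing in this loop ever waits */
    }
}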

 

For cRIO, there is much more processing power available; however, approximately 70 to 80% of it is lost to LabVIEW/OS overhead. The end result is that what can be achieved is limited.

 

For the Desktop, we get the best of both worlds: extraordinary processing power and high efficiency.

 

Speculation on why LabVIEW Embedded for ARM and LabVIEW Real-time performance is so poor puts the blame on excessive context switching. Each context switch typically takes 150 to 200 machine cycles, and these switches appear to be inserted for every loop iteration. This means that tight loops (fast, with not much computation) consume enormous amounts of processing power. If this is the case, an option to force a context switch only every Nth loop iteration would be useful.
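To illustrate the suggestion: if each iteration does, say, 50 cycles of useful work and a 150 to 200 cycle context switch is added on every iteration, the overhead dominates; paying that switch once every N iterations shrinks it to a couple of cycles per iteration. A rough C-style sketch of the idea follows (do_iteration_work(), os_yield() and the divisor are placeholders, not an existing LabVIEW or RTOS API):

#include <stdint.h>

extern void do_iteration_work(void);  /* placeholder: the loop body's computation */
extern void os_yield(void);           /* placeholder: whatever yield/context-switch
                                         call the underlying RTOS provides        */

#define YIELD_EVERY_N 100u            /* hypothetical user-chosen divisor          */

void tight_loop_task(void)
{
    uint32_t i = 0;
    for (;;)
    {
        do_iteration_work();

        if (++i >= YIELD_EVERY_N)     /* the 150-200 cycle context switch is paid  */
        {                             /* once per N iterations rather than on      */
            i = 0;                    /* every pass through the loop               */
            os_yield();
        }
    }
}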

 

Conclusion

                              LabVIEW Embedded       LabVIEW Real-time   LabVIEW Desktop
                              for ARM                for cRIO/sbRIO      for Windows
Development Environment Cost  High                   Reasonable          Reasonable
Execution Platform Cost       Very low               Very High / High    Low
Processing Power              Low (current Tier 1)   Medium              Enormous
LabVIEW/OS efficiency         Low                    Low                 High
OEM friendly                  Yes+                   No                  Yes

 

LabVIEW Desktop has many attractive features. This explains why LabVIEW Desktop is so successful and accounts for the vast majority of National Instruments’ software sales (and consequently results in the vast majority of hardware sales). It is National Instruments’ flagship product and the precursor to the other LabVIEW offerings. The execution platform is powerful, available in various form factors from various sources, and competitively priced.

 

LabVIEW Real-time on a cRIO/sbRIO is a lot less attractive. To make this platform attractive, the execution platform cost needs to be vastly decreased while increasing the raw processing power. It would also be beneficial to examine why the LabVIEW/OS overhead is so high. A single plug-in board no larger than 75 x 50 mm (3” x 2”) with a single-unit price under $180 would certainly make the sbRIO a viable execution platform. The peripheral connectors would not be part of the board and would be accessible via a connector. A developer motherboard could house the various connectors, but these are not needed once the board is incorporated into the final product. The recently released Xilinx Zynq would be a great chip to use ($15 in volume, 2 x ARM Cortex-A9 at 800 MHz (4,000 MIPS), FPGA fabric and lots more).

 

LabVIEW Embedded for ARM is very OEM friendly, with development boards that are open source and have circuit diagrams available. To make this platform attractive, new, more capable Tier 1 boards will need to be introduced, mainly to counter the large LabVIEW/OS overhead. As before, these target boards would come from microcontroller manufacturers, thereby making them inexpensive and open source. It would also be beneficial to examine why the LabVIEW/OS overhead is so high. What is required now is another Tier 1 board (e.g. the DK-LM3S9D96 (ARM Cortex-M3, 80 MHz/96 MIPS)). Further Tier 1 boards should be targeted every two years (e.g. the BeagleBoard-xM (ARM Cortex-A8, 1000 MHz/2000 MIPS)) to keep LabVIEW Embedded for ARM relevant.

Message 1 of 7
(11,011 Views)

Very interesting post. I had been wondering about this subject as well, because I'd been looking at the ARM toolkit. Questions to applications engineers about performance, code size, etc. were mostly met with what I call "non-answer-answers": "It depends on your application." It sure does!

Message 2 of 7
(10,948 Views)

I've got to say though, it would really be good if NI could further develop the ARM embedded toolkit.

In the industry I'm in, and probably many others, control algorithm development and testing occurs in LabVIEW. If you have a good LabVIEW developer or team, you'll end up with fairly solid, stable and tested code. But what happens now, once the concept is validated, is that all of this is thrown away and the C programmers create the embedded code that will go into the real product.

The development cycle starts from scratch. 

 

It would be amazing if you could strip down that code and deploy it onto ARM and expect it not to be too inefficient. Development costs and time to market would go way down. BUT, especially in the industry I presently work in, the final product's COST is extremely important. (These being consumer products: cheaper micro, cheaper product.)

These concerns weigh HEAVILY. I didn't get a warm fuzzy about the ARM toolkit for my application. I'm sure it's got its niches, but just imagine what could happen if some more work went into it to make it truly appealing to a wider market...

 

 

 

Message 3 of 7
(10,944 Views)

Perhaps the best way to go is the way LabVIEW Interface for Arduino does it: use LabVIEW on the desktop, where processing resources are plentiful, and C for the microcontroller, where resource utilisation is more important than programming effort.
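To sketch that split (this is not LIFA's actual protocol; the command bytes and the serial/ADC/pin helpers are invented for the illustration), the microcontroller side can be a small C command loop that a desktop LabVIEW program drives over a serial port, for example via VISA:

#include <stdint.h>

/* Placeholder target-side drivers; replace with the board's own UART/ADC/GPIO code. */
extern int      serial_read_byte(void);       /* blocks until a byte arrives */
extern void     serial_write_byte(uint8_t b);
extern uint16_t read_adc(uint8_t channel);
extern void     set_pin(uint8_t pin, uint8_t level);

int main(void)
{
    for (;;)
    {
        int cmd = serial_read_byte();          /* one command byte from the desktop      */

        switch (cmd)
        {
        case 'A': {                            /* 'A' <channel> -> reply with 2 bytes    */
            uint16_t v = read_adc((uint8_t)serial_read_byte());
            serial_write_byte((uint8_t)(v >> 8));
            serial_write_byte((uint8_t)(v & 0xFF));
            break;
        }
        case 'D': {                            /* 'D' <pin> <level> -> set a digital pin */
            uint8_t pin   = (uint8_t)serial_read_byte();
            uint8_t level = (uint8_t)serial_read_byte();
            set_pin(pin, level);
            break;
        }
        default:                               /* ignore unknown commands                */
            break;
        }
    }
}

The desktop VI then just writes the command byte(s) and reads the reply, keeping all the heavy processing on the PC.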

Message 4 of 7
(10,930 Views)

I have played a little bit with LabVIEW for Arduino. I'm not sure it would be a good solution for a final product. LabVIEW Interface for Arduino (LIFA) is not an official NI product, so we should only expect community support. And since Arduino hardware itself is an open-source platform, I would be concerned about its software life cycle and support.

 

On the functional side, I think LIFA makes an Arduino board behave like a DAQ device in LabVIEW. If that is your intention, it would perhaps be much easier to simply get one of the USB DAQ devices from NI. LabVIEW Embedded for ARM definitely provides more microcontroller functionality than LIFA. If you need more processing power, you can always offload some of it to a desktop.

Message 5 of 7
(10,912 Views)

We have to be careful to identify what is being discussed here, which is: for the embedded environment, what is a good programming language? (Not what is a good DAQ.) The Arduino example shows that C is a good candidate.

 

In summary, what is a good programming language for embedded applications? We need to take into consideration ease of programming, processing power efficiency, community support, range of targets, open software, open networking, ruggedness, reliability, form factor, price, power consumption, etc.

 

At the moment, for me, LabVIEW Desktop looks like a good fit for desktop computer programming and C looks like a good fit for microcontroller programming. This could easily change as the years unfold. 

Message 6 of 7
(10,905 Views)

Correction: The cRIO controller that was benchmarked is the cRIO-9012.

Message 7 of 7
(9,136 Views)