LabVIEW FPGA implementation of optimal control algorithms


Hello all!

 

I ordered a cRIO-9054 to use as an in-vehicle controller by embedding my control algorithms on it. I am kind of a newbie in LabVIEW and have been using MATLAB/Simulink for a really long time. However, after designing my control algorithm in MATLAB, I successfully designed it from scratch in LabVIEW (without using any of the converters); it includes quadratic programming. The algorithm behaves the same in both environments, but I am not sure whether the Quadratic Programming VI that I use in the LabVIEW model can be implemented on the FPGA of the cRIO. So, my questions are as follows.

 

1)  Is it possible to use the Quadratic Programming VI in LabVIEW FPGA and embed the algorithm into the FPGA of the cRIO hardware?

2)  Is there any other way to achieve my goal? (Maybe with the MathScript Module, but as far as I know it cannot be used in LabVIEW FPGA.)

3)  If nothing else works, is it a good idea to run the rest of my control algorithm on the FPGA and run the optimization solver part on the processor of the cRIO?
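
For context, the general kind of problem such a QP solver handles can be sketched as below. The matrices, numbers, and the projected-gradient method are only an illustration, not my actual controller or whatever the Quadratic Programming VI does internally:

```python
import numpy as np

# Minimal box-constrained QP:  minimize 0.5*x'Hx + f'x  subject to  lb <= x <= ub
# H, f, lb, ub are made-up placeholder values; the solver is a plain projected
# gradient method, chosen only because it is short, not because the VI uses it.
def solve_qp(H, f, lb, ub, iters=200):
    alpha = 1.0 / np.linalg.eigvalsh(H).max()   # step size for a symmetric positive-definite H
    x = np.clip(np.zeros_like(f), lb, ub)
    for _ in range(iters):
        grad = H @ x + f                        # gradient of the quadratic cost
        x = np.clip(x - alpha * grad, lb, ub)   # gradient step + projection onto the box
    return x

H  = np.array([[4.0, 1.0], [1.0, 2.0]])
f  = np.array([-1.0, -1.0])
lb = np.array([0.0, 0.0])
ub = np.array([1.0, 1.0])
print(solve_qp(H, f, lb, ub))
```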

 

Thanks a lot in advance for all your responses.

Cheers,

Sertan 

Message 1 of 6

Hi Sertan,

 

welcome to the LabVIEW forum!

 


@sir_b wrote:

1)  Is it possible to use the Quadratic Programming VI in LabVIEW FPGA and embed the algorithm into the FPGA of the cRIO hardware?

2)  Is there any other way to achieve my goal? (Maybe with the MathScript Module, but as far as I know it cannot be used in LabVIEW FPGA.)

3)  If nothing else works, is it a good idea to run the rest of my control algorithm on the FPGA and run the optimization solver part on the processor of the cRIO?


I would start by implementing the quadratic solver in the RT part of your target.

Is there any requirement to put the solver into the FPGA?
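
Conceptually that split looks something like this (just a Python-flavoured sketch of the loop structure, with placeholder function names; on the cRIO it would be a LabVIEW RT loop exchanging data with the FPGA VI, e.g. through DMA FIFOs or front-panel controls/indicators):

```python
import time

# Sketch only: the three callbacks are placeholders for whatever mechanism you use
# to exchange data with the FPGA VI (DMA FIFO, controls/indicators, ...).
def rt_control_loop(read_state_from_fpga, solve_qp, write_output_to_fpga, period_s=0.01):
    while True:
        t0 = time.monotonic()
        x = read_state_from_fpga()      # FPGA acquires the I/O at its own fast rate
        u = solve_qp(x)                 # the quadratic solver runs on the RT processor
        write_output_to_fpga(u)         # FPGA applies the new output until the next update
        time.sleep(max(0.0, period_s - (time.monotonic() - t0)))
```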

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 6

Hi GerdW,

 

Thanks for the response! 

 

I assume it would be better and faster to do all of the calculations in one place.

 

Communication between the RT part and the FPGA will possibly cause a delay (even if it happens at a very low rate), which will affect the calculation speed. Also, I might need the remaining memory on the RT side in the near future for other applications when I add new modules to the device. Hence, implementing the optimization solver in the FPGA seems to have nice advantages.
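
A rough way to check whether that delay actually matters would be a simple budget like the one below (all numbers are placeholder guesses, not measured cRIO-9054 values):

```python
# Placeholder numbers only, to illustrate the check, not measurements:
control_period_s    = 1e-2   # assumed 100 Hz control loop
rt_fpga_roundtrip_s = 1e-4   # assumed RT<->FPGA transfer latency per cycle
qp_solve_time_s     = 2e-3   # assumed QP solve time on the RT processor

used = rt_fpga_roundtrip_s + qp_solve_time_s
print(f"{used / control_period_s:.0%} of the control period used")
# If this stays well below 100 %, the RT<->FPGA delay is not the bottleneck.
```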

 

However, none of the above is a requirement for now. Hence, your advice is definitely a great starting point. Thanks a lot for that!

Message 3 of 6
Solution
Accepted by topic author sir_b

Well, there are different forms of fast.

 

Performance-wise you may be correct that doing everything in the FPGA is the best way.

 

But there are other constraints you have to consider!

 

Developing in FPGA takes much more effort, and so does debugging. Every little bug you find means you have to fix it, recompile, wait (and wait some more) for the compilation to finish, then deploy everything and start debugging again. Debugging is also not done directly on the bare hardware but on its representation in your diagram, and usually can't be done in real time.

 

And complex algorithms gobble FPGA slices like a hungry bodybuilder eats hamburgers. There are always too few of them! 😁

 

So my approach is to do the really time-critical things, and those that are easily implemented in FPGA, on the FPGA, and the rest in RT. When there is still FPGA space available and time left in the project (I never seem to have time left at the end 😁), I consider moving more of the processing into the FPGA. But start small and modest and move up from there. Don't start with the attitude that everything should be done in the FPGA, especially if it involves any floating-point or complex mathematical algorithms!
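
To give a feel for the rework involved, here is a tiny fixed-point sketch (an arbitrary Q15 scaling, purely for illustration; LabVIEW FPGA's fixed-point data type does this bookkeeping for you, but the range and quantization trade-offs remain yours to manage):

```python
# Illustrative Q15 fixed-point arithmetic (15 fractional bits), showing what a single
# floating-point multiply turns into when you avoid floating point on the FPGA.
SCALE = 1 << 15

def to_q15(x: float) -> int:
    return int(round(x * SCALE))

def q15_mul(a: int, b: int) -> int:
    return (a * b) >> 15            # multiply, then renormalize back to Q15

# floating point:        y = 0.75 * x
# fixed-point version:   y_q = q15_mul(to_q15(0.75), x_q)
x_q = to_q15(0.4)
y_q = q15_mul(to_q15(0.75), x_q)
print(y_q / SCALE)                  # ~0.3, plus a small quantization error
```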

Rolf Kalbermatter
My Blog
Message 4 of 6

Hi Sertan,

 


@sir_b wrote:

I assume it would be better and faster to do all of the calculations in one place.


Well, it can be faster in the FPGA, but it's not "better" by default.

You need to define the metrics of "better" for this special use case…

 


@sir_b wrote:

Communication between the RT part and the FPGA will possibly cause a delay (even if it happens at a very low rate), which will affect the calculation speed. Also, I might need the remaining memory on the RT side in the near future for other applications when I add new modules to the device. Hence, implementing the optimization solver in the FPGA seems to have nice advantages.


  • Yes, there will be some delay between FPGA and RT, but that's why I asked for "requirements".
  • In my use cases I only had to watch out for "remaining RT memory" usage with those very old 907x devices, when they only had 64MB of RAM. All "modern" cRIOs come with enough memory. (Again: for my use cases!)
  • Adding modules to your cRIO does not affect "RT memory"…
  • Which "advantages" to you expect when implementing those solvers in the FPGA? In my use cases I rather need the FPGA to handle all the modules/IO and leave the algorithms/logic to the RT part…
Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 5 of 6

Hi Rolf,

 

Thank you for your explanatory response about how things should be done at the beginning stage of development! I will take all of your advice about FPGA development, start modest, and then move up 🙂

 

Best,

Sertan

Message 6 of 6