GPU Computing


Least-squares fitting using GPU

Hi,

I am starting a project which involves least-squares fitting using LabVIEW 2016.

It is important that I use GPU-enabled operations, since there are several million points that need to be fitted simultaneously.

(I record around 40 images of roughly 2000 x 2000 pixels each, and the sequence of images then needs to be fitted on a pixel-by-pixel basis.)

My first idea was to use the LVCUDA toolkit, since the fitting is basically a straightforward (but tedious) sequence of basic operations.

However, after looking through the documentation, I do not think I can compute Hadamard products (which I need in order to calculate x^2, x^3, etc.).
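For illustration, what I have in mind on the CUDA side is an elementwise kernel along these lines (my own untested sketch, not taken from any toolkit):

// elementwise_powers.cu -- untested sketch of the per-pixel operations I need:
// the Hadamard product of an array with itself, used to build x^2 and x^3.
#include <cuda_runtime.h>

__global__ void elementwisePowers(const float* x, float* x2, float* x3, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
    {
        float v = x[i];
        x2[i] = v * v;      // Hadamard product x .* x
        x3[i] = v * v * v;  // x .* x .* x
    }
}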

So, I believe I need to go down the route suggested by Andrey back in 2009

https://decibel.ni.com/content/blogs/AndreyDmitriev/2009/04/09/using-nvidia-gpu-from-labview-with-cu...

So far I can successfully query the GPU card, but I do not have access to CVI, so I am trying to proceed using Visual Studio.

Could anyone please give some advice on how to go about this?

As far as I can see, I need to write the program in C++ and build it as a DLL.
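My rough picture of the DLL side is something like the sketch below. The function name and structure are just my guesses, it is untested, and it assumes the kernel from my earlier sketch is defined in the same .cu file:

// fitdll.cu -- hypothetical sketch of a DLL entry point callable from LabVIEW.
// Build with, e.g.:  nvcc --shared -o fitdll.dll fitdll.cu
#include <cuda_runtime.h>

// Kernel from the earlier sketch, defined later in this same file.
__global__ void elementwisePowers(const float* x, float* x2, float* x3, int n);

extern "C" __declspec(dllexport)
int ComputePowers(const float* hostX, float* hostX2, float* hostX3, int n)
{
    float *dX = nullptr, *dX2 = nullptr, *dX3 = nullptr;
    size_t bytes = (size_t)n * sizeof(float);

    // Copy the pixel data to the card.
    cudaMalloc(&dX,  bytes);
    cudaMalloc(&dX2, bytes);
    cudaMalloc(&dX3, bytes);
    cudaMemcpy(dX, hostX, bytes, cudaMemcpyHostToDevice);

    // Launch one thread per element.
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    elementwisePowers<<<blocks, threads>>>(dX, dX2, dX3, n);

    // Copy the results back into the LabVIEW-allocated arrays.
    cudaMemcpy(hostX2, dX2, bytes, cudaMemcpyDeviceToHost);
    cudaMemcpy(hostX3, dX3, bytes, cudaMemcpyDeviceToHost);

    cudaFree(dX); cudaFree(dX2); cudaFree(dX3);
    return (int)cudaGetLastError();   // 0 on success
}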

Thank you in advance!

Carl

Message 1 of 2

During my PhD we followed Andrey Dmitriev's method and published our code here: https://engineering.ucsb.edu/~saleh/

If you have specific questions or get stuck, don't hesitate to ask, though no guarantees that I'll be able to answer. There is a lot of wrangling to get the Visual Studio side working; I remember digging through all sorts of guides online. It might be quicker to take our published code, delete all the functions, and keep just the project settings, assuming you can get it to compile in the first place. Then there is further wrangling to make sure LabVIEW calls the DLL with the right number and types of parameters, as in the illustrative prototype below.
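Just as an illustration (the function name here is made up, not from our published code), the exported prototype and the matching Call Library Function Node settings look roughly like this:

// Illustrative prototype only.
// In the Call Library Function Node this would be configured roughly as:
//   return type : Numeric, Signed 32-bit Integer
//   hostX       : Array, 4-byte Single, Array Data Pointer
//   hostX2      : Array, 4-byte Single, Array Data Pointer
//   hostX3      : Array, 4-byte Single, Array Data Pointer
//   n           : Numeric, Signed 32-bit Integer
extern "C" __declspec(dllexport)
int ComputePowers(const float* hostX, float* hostX2, float* hostX3, int n);

Make sure the arrays you pass in from LabVIEW are already allocated to the right size, since the DLL writes straight into their data pointers.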

Good luck!

Message 2 of 2