11-11-2015 08:19 AM
I'm working on proprietary software where the key function is a nonlinear fit.
Small intro: basically, I have N matrices (images) and two unknown matrices of velocity fields (Vx, Vy). Image N(i) is derived from image N(i-k) (k an integer) according to the Vx and Vy matrices. After parameterizing Vx and Vy and choosing an initial guess, I run the Lev-Mar Nonlinear Fit VI with a certain model VI. Everything works great and I get a good solution for Vx and Vy; however, at higher resolution (more parameters in my Vx and Vy matrices) the calculation takes a considerable amount of time.
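For concreteness, the forward model can be sketched outside LabVIEW. This is only an illustrative Python/scipy sketch of my setup, not the actual VI: it assumes bilinear interpolation for the warp, and for brevity it parameterizes each velocity field by a single constant (the real parameterization has many more parameters per field).

```python
import numpy as np
from scipy.ndimage import map_coordinates
from scipy.optimize import least_squares

def warp(image, vx, vy):
    """Advect `image` by a constant velocity (vx, vy) using bilinear interpolation."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    # Sample the previous image at (y - vy, x - vx): img_next(x) = img_prev(x - v).
    return map_coordinates(image, [yy - vy, xx - vx], order=1, mode='nearest')

def residuals(params, img_prev, img_next):
    """Difference between the warped previous image and the observed next image."""
    vx, vy = params
    return (warp(img_prev, vx, vy) - img_next).ravel()

# Toy data: a smooth bump shifted by a known subpixel velocity.
yy, xx = np.mgrid[0:16, 0:16].astype(float)
img_prev = np.exp(-((xx - 8.0) ** 2 + (yy - 8.0) ** 2) / 20.0)
img_next = warp(img_prev, 0.4, -0.2)

# Least-squares fit (scipy's trust-region solver stands in for the Lev-Mar VI).
fit = least_squares(residuals, x0=[0.0, 0.0], args=(img_prev, img_next))
vx_est, vy_est = fit.x
```

With a richer parameterization (e.g. a coarse grid of velocities interpolated to full resolution), the cost of each residual evaluation and of the finite-difference Jacobian grows quickly, which is exactly where the slowdown comes from.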
I'm not familiar with CUDA programming, but if the answer to the question below is positive, I'll have to learn about it (at least in the LabVIEW sense).
Question: Is it possible to integrate GPU computing for such a problem?
11-12-2015 09:06 AM
Given the information provided, the answer is likely yes, but you have to understand the following conditions: