06-16-2012 07:15 AM
Hi all,
I'm using MATLAB in a relatively large LabVIEW application to do processor-intensive calculations (mainly higher-dimensional weighted curve fitting), since the functionality already exists in MATLAB and its numerics are faster than LabVIEW's. However, the following behaviours of the MATLAB script node annoy me greatly, and I want to know if there are workarounds:
1) If the MATLAB license server cannot be contacted, the VI will not load at all (repeated "cannot contact server" errors no matter how many times Cancel is clicked). This is extremely undesirable, as that server is not maintained by us and goes down sometimes. This should not prevent the rest of the application from functioning, let alone loading.
2) We have long scripts, several hundred lines, maintained by several people in parallel with the LabVIEW development. Editing code within the node is untenable, so we keep it in a .m file. This appears to break whatever error checking is performed between MATLAB and LabVIEW. At BEST, errors (e.g. syntax) in the m-file produce "??? Cannot retrieve variable "P" from server" messages, with no information about the error that actually occurred; at worst, no error is reported by LabVIEW but the results of the PREVIOUS calculation are returned. This is disastrous for our applications. (A try/catch pattern that at least recovers the real error text is sketched after this list.)
3) LabVIEW is a slow application to start, as is MATLAB. It would be extremely helpful if some kind of loading screen could be presented when MATLAB is started, because at the moment LabVIEW just seems to freeze when loading a VI. Even presenting "Starting MATLAB" or similar in that text space would be very helpful.
4) The MATLAB automation window is always visible, which is undesirable, especially since it can't be printed to.
5) Arrays are limited to 2D. Yes we can reshape but the overhead is annoying, wastes memory, and seems pointless.
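For point 2, the best partial mitigation I've found so far (a sketch only; errmsg is a name I chose, not part of the node's interface) is to wrap the whole .m file body in try/catch, assign every output up front, and pass the real MATLAB error text back to LabVIEW as an ordinary output variable:

    % Body of the .m file called by the script node.
    % Outputs wired in LabVIEW: P (results) and errmsg ('' on success).
    errmsg = '';
    P      = [];                  % assigned up front so retrieval never fails
    try
        % ... the actual weighted fitting code goes here ...
        P = polyfit(x, y, 2);     % placeholder for the real calculation
    catch err
        errmsg = err.message;     % the real MATLAB error text, readable in LabVIEW
    end

Checking errmsg on the LabVIEW side after every node call also guards against the worst case above, where the previous result is silently reused.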
I attempted to create a workaround invoking Feval through the MATLAB ActiveX (Automation) interface to replace the script node, but this resulted in massive memory leaks, with LabVIEW quickly crashing out of memory. It seems the ActiveX call wasn't freeing memory somewhere.
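For anyone who wants to poke at the leak outside LabVIEW: the same Automation interface is scriptable from a second MATLAB session via actxserver, and my best guess (unconfirmed) is that a COM reference is never released somewhere - in LabVIEW terms, an Automation Open without a matching Close Reference. The bare pattern, with the explicit cleanup the client must do, looks like this:

    % Drive the MATLAB Automation server over COM from a second MATLAB session.
    h = actxserver('matlab.application');          % equivalent of Automation Open
    h.PutWorkspaceData('x', 'base', rand(1, 100)); % push data into the server
    h.Execute('y = fft(x);');                      % Feval can be used for direct calls
    y = h.GetWorkspaceData('y', 'base');           % pull the result back
    h.Quit;                                        % shut the server down
    delete(h);                                     % explicitly release the reference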
The first two problems are almost show-stoppers and are causing murmurs amongst the hierarchy about scrapping LabVIEW altogether and rewriting in something else. I would very much like to prevent this, so suggestions are welcome, particularly regarding the error reporting.
Cheers,
Martijn
06-16-2012 11:55 AM
Sorry, I cannot help with your problem, but some statements have piqued my interest.
mjas wrote: I'm using MATLAB in a relatively large LabVIEW application to do processor-intensive calculations (mainly higher-dimensional weighted curve fitting), since the functionality already exists in MATLAB and its numerics are faster than LabVIEW's.
I am doing a lot of fitting in LabVIEW, and I am curious about your statement about the speed of "numerics", whatever that is. Do you have some real benchmarks of identical algorithms (implemented correctly in LabVIEW)? What kind of multidimensional fitting are you doing? Presumably you did not compare the actual fitting, because "it already exists" in matlab, so I doubt you implemented it in LabVIEW. 😉
Sorry for the distraction, just curious. 😉
@mjas wrote:
... are causing murmurs amongst the hierarchy of scrapping LabVIEW altogether and rewriting in something else. I would very much like to prevent this, so suggestions are helpful.
I went the other way and scrapped matlab in favor of LabVIEW and never regretted it. 😄
06-16-2012 07:30 PM
Hi Martijn,
I can tell you what I have experienced combining MATLAB and LabVIEW. The MATLAB script node has helped me for prototyping applications (i.e. simulations created in MATLAB that I wanted to quickly test with real world measurements) but when it came to a "final product" I always opted for rewriting the algorithm in LabVIEW. The main reason for this, as you have pointed out, was the speed of execution.
I can think of two "workarounds" that would let you avoid the MATLAB script node, but they may be a bit of a stretch and require additional NI and MATLAB toolkits. For both you will need to put your .m file code in an Embedded MATLAB Function block in Simulink. First, use the NI LabVIEW Control Design and Simulation Module to convert your Simulink diagram to LabVIEW (the module may be able to convert from an .m file directly, but I don't remember). Second, use the NI LabVIEW Simulation Interface Toolkit to call a compiled Simulink DLL from LabVIEW. This last toolkit can be used on an RT target or your PC. I have used it a couple of times and, if you have the required software, it may be worth a try.
I hope this helps a little.
Regards,
06-17-2012 04:24 AM
Yes, I am aware that LabVIEW has a bad rap for numerics which it doesn't deserve - for the record, I usually argue against the haters (of which there are many at my workplace). But when it comes to large arrays, temporary copies and on-the-fly array creation seem to cause big slowdowns. The most basic fit I need to do is a 2D Gaussian with offset to a 1920x1024 data array. The example code "Fit gaussian surface with offset.vi" seems to work fine on small arrays, but at this size the program takes minutes to do a single iteration. I presume this example code is moderately efficient, but it does not include axis rotation, which would slow it down further. When I attempted to implement a version myself I got "out of memory" errors (on a 3 GB RAM machine), which I accept means it was not the "correct" implementation. If you have a way to fit such large arrays in an execution time similar to MATLAB's, I'm definitely interested, but dedicating time to investigate it myself will be deemed unacceptable, because the MATLAB solution works great - it's just that the LabVIEW glue is terrible.
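For reference, the entire MATLAB side of the basic case is not much more than the sketch below (assuming the Optimization Toolbox's lsqnonlin; the function name, starting guesses, and weighting here are illustrative, not our production code):

    function p = fitgauss2d(Z, W)
    % Weighted fit of a rotated 2D Gaussian with offset to image Z.
    % p = [amplitude x0 y0 widthA widthB angle offset]
    [ny, nx] = size(Z);
    [X, Y]   = meshgrid(1:nx, 1:ny);
    p0 = [max(Z(:))-min(Z(:)), nx/2, ny/2, nx/8, ny/8, 0, min(Z(:))]; % crude start
    resid = @(p) sqrt(W(:)) .* (gauss2d(p, X, Y) - Z(:));  % weighted residuals
    p = lsqnonlin(resid, p0);

    function F = gauss2d(p, X, Y)
    % Rotate and scale the coordinates, then evaluate the Gaussian plus offset.
    ct = cos(p(6)); st = sin(p(6));
    U  = ( ct*(X - p(2)) + st*(Y - p(3))) / p(4);
    V  = (-st*(X - p(2)) + ct*(Y - p(3))) / p(5);
    F  = p(1) * exp(-(U.^2 + V.^2)/2) + p(7);
    F  = F(:);   % column vector to match the residuals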
Thanks for your suggestion jarcTec, but we don't have Simulink, so that's not an option for me. It is an interesting alternative, but we would probably decide to go to a pure-C DLL instead and drop the license dependency.
Originally LabVIEW was pitched for this project because the numerics could be farmed out to existing, debugged code, further reducing development time. So I find myself annoyed at the error reporting in particular, which is LabVIEW's responsibility. I recall reading complaints about the node on this forum before, with the official response being along the lines of "what's wrong with it?". So the points I list are my 2c worth.
06-17-2012 11:37 AM
@mjas wrote:
Yes, I am aware that LabVIEW has a bad rap for numerics which it doesn't deserve - for the record, I usually argue against the haters (of which there are many at my workplace). But when it comes to large arrays, temporary copies and on-the-fly array creation seem to cause big slowdowns. The most basic fit I need to do is a 2D Gaussian with offset to a 1920x1024 data array. The example code "Fit gaussian surface with offset.vi" seems to work fine on small arrays, but at this size the program takes minutes to do a single iteration. I presume this example code is moderately efficient, but it does not include axis rotation, which would slow it down further. When I attempted to implement a version myself I got "out of memory" errors (on a 3 GB RAM machine), which I accept means it was not the "correct" implementation. If you have a way to fit such large arrays in an execution time similar to MATLAB's, I'm definitely interested, but dedicating time to investigate it myself will be deemed unacceptable, because the MATLAB solution works great - it's just that the LabVIEW glue is terrible.
You did not really say how fast matlab is. Do you have some relative numbers?
I have some newer, much improved versions of this VI that also include offset, and it does about 6 seconds per iteration with a 1920x1024 input (including generating all numerical partial derivatives for 7 parameters: amplitude, x-pos, y-pos, width A, width B, angle, offset) on an almost six-year-old Core 2 Duo laptop (one iteration requires 8 function calls here). I am sure it would run much faster on a modern 4-core desktop.
This is using brute-force partial derivatives. With a bit more intelligence it could be sped up much more. For example, the numerical partial derivative for the amplitude would not actually require a recalculation of the function as is currently done (the model is linear in amplitude, so that derivative falls out of values already computed). Also, the interactive display of the shipping example slows things down.
Note that you can use a parallel FOR loop and it will scale with the number of cores. Of course, if the Gaussian covers only a small fraction of the image, fitting on a small subset would be almost instantaneous.
(If speed is important, you should always do your own partial derivatives in the model VI. The numerical partial derivatives used by LabVIEW take about twice as many function calls because they step in both directions. While this very accurately reproduces the NIST datasets, it is almost never needed for real data.)
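To make the amplitude point concrete in text form (MATLAB-style notation, since the actual code lives in a model VI; the names are mine): because the model is F = A*G + offset, two of the seven Jacobian columns are free once F has been evaluated:

    % Reuse the already-computed model values F; no extra function calls needed.
    G       = (F - offset) / A;    % unit-amplitude Gaussian, recovered from F
    dF_dA   = G;                   % partial derivative w.r.t. amplitude
    dF_doff = ones(size(F));       % partial derivative w.r.t. offset
    % The nonlinear parameters (positions, widths, angle) can be differentiated
    % analytically too, avoiding the two-sided numerical steps entirely.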
I'll be busy for the next few weeks, so I probably won't have time to work on this. Maybe we can start a new discussion elsewhere, because this thread is about the matlab node and its associated problems.