For the task described below, please give advice on what parts should be performed in LabView, what parts should be performed in Matlab, and the best way to get the Matlab code running on LabView. I am a Matlab programmer with no experience in LabView or MathScript; I am working with a LabView programmer who hasn't integrated Matlab before. So your advice will help us figure out our approach. Here is the task:
How much of this task should be performed in LabView (just the file concatenation and writing the output file, or all of it, or ...)? What is the best way to pull Matlab into this (MathScript, use Matlab Coder to generate a *.exe executable or C source code, or skip Matlab altogether)?
Because the code only needs to run once an hour, and because we are new to this, execution speed is less important than getting something running.
Frankly, I'd start by avoiding integration and keeping the LabVIEW and Matlab parts totally independent. That best suits your development skills with each of you familiar with exactly one of those environments. (Even more frankly, I'd personally do it all in LabVIEW, though I might still break it into independent processes. Depends whether there's value in maintaining a raw data archive with a standalone post-processing tool.)
I'd highly recommend that the LabVIEW acquisition and raw-file-writing code structure the output files with a separate folder for each 1-hour chunk of data to be processed. I'd further suggest that the writing code initially write to a temp folder, then move the folder into its proper location only after the full hour has passed.
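The move-when-complete idea can be sketched roughly as follows (Python here purely for illustration, since LabVIEW diagrams can't be pasted into a post; folder names and paths are hypothetical):

```python
import os
import shutil

def publish_hour_folder(temp_folder, final_root):
    """Move a completed hour's folder from the staging area into the
    watched location. On the same volume a move is effectively a rename,
    so the downstream watcher never sees a half-written folder."""
    final_path = os.path.join(final_root, os.path.basename(temp_folder))
    shutil.move(temp_folder, final_path)
    return final_path
```

The point is the handoff discipline, not the language: the writer owns the staging area, and anything visible in the final location is guaranteed complete.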
Now the Matlab code can simply watch for a new folder to appear, do all its post-processing on the files in it, voila! And you now have a tool to enable re-analysis on the raw data in the future when someone is interested in other aspects, because that almost always happens.
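The watch-for-a-new-folder loop is just periodic polling plus a record of what's already been processed. A minimal sketch (Python for illustration; the same logic maps directly onto a MATLAB timer callback, and the poll interval is an assumption):

```python
import os
import time

def find_new_hour_folders(data_root, already_done):
    """Return hour folders that have appeared since the last check."""
    current = {name for name in os.listdir(data_root)
               if os.path.isdir(os.path.join(data_root, name))}
    return sorted(current - already_done)

def watch(data_root, process, poll_seconds=60):
    """Poll data_root and run `process` once on each newly published folder."""
    done = set()
    while True:
        for folder in find_new_hour_folders(data_root, done):
            process(os.path.join(data_root, folder))
            done.add(folder)
        time.sleep(poll_seconds)
```

Because completed folders only ever appear atomically (moved in after the hour is done), the watcher never has to guess whether a folder is still being written.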
I largely agree with Kevin (including the suggestion to "do it all in LabVIEW"). However, instead of writing to a temp folder, you could use a unique naming pattern to distinguish the ten-minute files from the concatenated hourly files and keep them all together in the same folder. Get LabVIEW to do the concatenating: it is really simple, LabVIEW "knows" the file formats and, more importantly, "knows" when to concatenate, and it can even do all of this in parallel with the ongoing data acquisition and writing of the 10-minute files. Now Matlab just hangs around, checking every minute or ten for a new concatenated file, and processes it.
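For the record, the naming-pattern-plus-concatenation step is tiny in any language. A sketch (Python for illustration; the `raw_*`/`hourly_*` naming pattern and `.bin` extension are made-up examples, not anything your DAQ code currently produces):

```python
import glob
import os

def concatenate_hour(folder, hour_tag):
    """Join all 10-minute chunks for one hour into a single hourly file.

    Hypothetical naming pattern: raw_<hour_tag>_<minute>.bin for chunks,
    hourly_<hour_tag>.bin for the concatenated result, so the two kinds
    of files can live in the same folder without ambiguity.
    """
    chunks = sorted(glob.glob(os.path.join(folder, f"raw_{hour_tag}_*.bin")))
    out_path = os.path.join(folder, f"hourly_{hour_tag}.bin")
    with open(out_path, "wb") as out:
        for chunk in chunks:
            with open(chunk, "rb") as f:
                out.write(f.read())
    return out_path
```

Note the output name deliberately does not match the chunk pattern, so the downstream reader (or a re-run of the concatenator) never picks up the hourly file by mistake.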
I should have added: we only have LabVIEW, not Matlab, on the machine that is doing the data collection, and the LabVIEW files are not on a network accessible to the computers I use Matlab on. Given that, is there some way Matlab can do the processing, as you describe? It would be a relief.
Telepathy? Not having a way to get the data easily to a machine that can process it, and not having the two software systems (LabVIEW and Matlab) on the same computer, really does make it difficult, and really does suggest "do it all in LabVIEW" as the best answer.
Can you describe in reasonable detail just what kind of processing you need to do with this data (in MatLab)? Does it need to be done as the data are acquired, or can you use "SneakerNet" to collect the data on Monday, put it on the USB Key on Tuesday, load it on the MatLab machine on Wednesday, and process on Thursday? [You can probably speed this up a bit ...]
Note that LabVIEW supports (and encourages!) doing tasks in parallel, so it is entirely reasonable that as you collect the data, you can also be processing it, particularly if your data rates are "modest" (i.e. kHz rather than GHz).
As each hour of data becomes available, ideally we would start to process it and then produce a text file containing the output. We could accept ~1 hour of delay between availability of temperature data and saving the output file, but we are seeking 'sort of' real-time output. Sneakers can't access the data either, unfortunately: LabVIEW is collecting data on a physically remote system. 'All in LabVIEW' may be the best bet... guess I will start reading up on 'Signal Processing in LabVIEW for Dummies'!
Since you'll be saving development time compared to a tightly integrated LabVIEW-plus-Matlab approach, spend a little side-by-side time with the LabVIEW programmer. He/she should be able to code up the Matlab algorithm you need if you're there to help explain it. And then you can learn a little about LabVIEW signal processing as the coding is being done.
From what you described, the signal processing sounds pretty straightforward, there shouldn't be any real issues doing it directly in LabVIEW. Just watch the units & scaling on the various spectral algorithms. If your Matlab code is confirmed correct, you can process the same dataset both ways to help confirm the LabVIEW implementation.
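The units-and-scaling warning is worth taking seriously: different tools apply different normalizations to the FFT, so "correct" outputs can disagree by a constant factor. One way to pin down your convention before comparing against the LabVIEW results is a sanity check on a known sine wave (Python/NumPy sketch for illustration; the 2/N single-sided scaling shown is one common convention, not necessarily what either of your tools uses by default):

```python
import numpy as np

def single_sided_amplitude(x):
    """Single-sided amplitude spectrum with 2/N scaling, so a pure sine
    of amplitude A shows a peak of height ~A at its frequency bin."""
    n = len(x)
    spec = np.abs(np.fft.rfft(x)) * 2.0 / n
    spec[0] /= 2.0            # DC bin is not doubled
    if n % 2 == 0:
        spec[-1] /= 2.0       # Nyquist bin is not doubled either
    return spec

# Sanity check: a 50 Hz sine of amplitude 3, sampled at 1 kHz for 1 s,
# should peak at bin 50 with height ~3 under this convention.
fs, n = 1000, 1000
t = np.arange(n) / fs
spec = single_sided_amplitude(3.0 * np.sin(2 * np.pi * 50 * t))
```

Running the same synthetic signal through both the Matlab code and the LabVIEW VI and checking that the peak heights agree is a quick way to catch a stray factor of N, 2, or sqrt(N).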
It didn't take our LabView programmer long to rewrite the Matlab code in LabView, and we have compared the outputs of the FFT component as suggested. Thanks for everyone's help.
LabVIEW and Matlab have different approaches to parallel processing. LabVIEW enables multiple executables to run simultaneously as part of the basic package. Matlab only permits one executable to run in the basic package; parallel processing comes at additional cost. My current software runs several LabVIEW apps simultaneously, along with one Matlab app. This evolved over several years due to my programming skills, and because some things are much easier in LabVIEW and others are much easier in Matlab.
Now, I would prefer to use just one package. Because I know which Matlab functions I use, I should check whether LabVIEW MathScript (http://www.ni.com/tutorial/14575/en/) contains all the necessary subroutines... Then I could easily consolidate.
If starting fresh, I'd consider using only LabVIEW because: (a) LabVIEW has MathScript, (b) *fantastic* interfacing with hardware, AND (c) parallel processing as part of the basic package for executables.