05-22-2018 04:50 AM
Hi, I wrote some C++ functions to read millions of data points (doubles) into RAM. They work in C++, but when I call them from the DLL in LabVIEW, the VI crashes once the number of points exceeds 28 million. It seems to be a runtime issue. Is there a setting I can change somewhere to work around this? The computer has more than enough RAM to hold the data, and LabVIEW doesn't need to access the full set, only segments of it, so it shouldn't be a problem. I have written functions to pass these small segments into LabVIEW.
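For what it's worth, here is a minimal sketch of the kind of interface I mean, where the C++ side owns the full data set and LabVIEW only ever requests small segments. The function names (`LoadData`, `GetSegment`) are illustrative, not my actual code:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Full data set lives in C++-owned memory inside the DLL.
static std::vector<double> g_data;

extern "C" {

// Load (here: synthesize) n points into the DLL's own memory.
// In the real code this would read from the data source.
int LoadData(std::size_t n) {
    g_data.resize(n);
    for (std::size_t i = 0; i < n; ++i)
        g_data[i] = static_cast<double>(i);
    return 0;
}

// Copy a small segment into a buffer preallocated by the caller
// (Call Library Function node parameter type: Array Data Pointer).
// Returns the number of points actually copied.
std::size_t GetSegment(std::size_t start, std::size_t count, double* out) {
    if (out == nullptr || start >= g_data.size())
        return 0;
    std::size_t n = std::min(count, g_data.size() - start);
    std::copy(g_data.begin() + start, g_data.begin() + start + n, out);
    return n;
}

}  // extern "C"
```

The idea is that the only array crossing the DLL boundary is the small segment, so LabVIEW never has to allocate or copy the full 28-million-point set.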
05-25-2018 10:23 AM
Without seeing the C++ functions and the configuration of the Call Library Function node, I can only guess, but it's possible that LabVIEW is making copies of the data that add up over time. 28 million doubles at 8 bytes each come to 224 MB, so if a copy is made somewhere you could quickly run into trouble, especially in 32-bit LabVIEW, which limits a process to roughly 3 GB of memory.
Again, I think some additional information or example code is needed to really help, but you can use Tools»Profile»Show Buffer Allocations to see where data copies may be created (see Memory Management for Large Data Sets for reference).
As an aside, is there a particular reason you couldn't load the data into LabVIEW directly and manage the arrays using the methods described in the link above?