From Friday, April 19th (11:00 PM CDT) through Saturday, April 20th (2:00 PM CDT), 2024, ni.com will undergo system upgrades that may result in temporary service interruption.
We appreciate your patience as we improve our online experience.
11-23-2020 06:13 PM
Hi, Everyone:
I was hoping you could help me out here. I have two instances of System Exec running in parallel. They run in parallel when executed from source, but run serially when built into a DLL. Any ideas?
11-24-2020 03:17 AM - edited 11-24-2020 03:21 AM
Your caller can only use one thread to call a single C function (which is what a DLL function is, in principle), so everything has to be serialized. The LabVIEW runtime executing that VI inside the DLL could try to parallelize it anyway by spinning up extra threads, but that takes a lot of effort, is slow, and in many cases nullifies most of the performance gain. Add the extra overhead of synchronizing all threads before returning, and of managing them across multiple calls, and the whole thing becomes a nightmare to implement and test.
LabVIEW's multithreading is not dynamic. The execution system starts up a number of threads at launch and then distributes code chunks across them during execution. Starting up and tearing down threads is relatively expensive, so it is not something you want to do continuously. A DLL boundary, however, is pretty much a black box: beyond the declared interface, the caller can assume almost nothing about the DLL, and vice versa.
While LabVIEW as a caller can detect that a DLL is itself a LabVIEW DLL and, if the versions match, execute it in its own runtime system instead of having the DLL start up its own runtime and marshal data between the two, that is a convenience shortcut for debugging. After all, where is the real benefit of creating a DLL from LabVIEW code only to call it from LabVIEW? Even then, scheduling threads across DLL boundaries is not something you want to do, since the caller has no control over what happens inside the DLL.
11-25-2020 12:04 PM
@rolfk wrote:
Your caller can only use one thread to call a single C function (which is what a DLL function is, in principle), so everything has to be serialized. […]
Ah, the DLL expert speaks! Good info, and it does explain what I am seeing. FYI, I didn't state exactly how the DLL is used: it is actually being called by a test executive (but not TestStand).
Thank you!