Python multiprocessing not working in LabVIEW


Hello,

 

My LabVIEW application calls a Python program that spawns multiple subprocesses to do a piece of computation. The application works fine when I run it through cmd (without LabVIEW integration). The diagram below illustrates the workflow when the program is run through the command prompt (CMD shell).

 

Screen Shot 2019-09-21 at 10.18.32 PM.png

 

However, when I run the code through the LabVIEW Python Node, the subprocesses are not created. The diagram below shows my program's workflow through the LabVIEW Python Node.

Screen Shot 2019-09-21 at 10.18.45 PM.png

 

So my question is: does the LabVIEW Python Node support Python multiprocessing, or do I need to create multiple Python Nodes to run multiple Python applications?

 

Any help would be appreciated.

Message 1 of 9

My suggestion would be that the extra processes are probably being spawned but don't return their results.

You could try modifying your Python code to include a gathering step to collect results and return them all at once.
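Something like this minimal sketch is what I have in mind - the worker function and the inputs here are just placeholders, not your actual code:

import multiprocessing as mp

def worker(x, q):
    # placeholder computation; push the result onto the shared queue
    q.put(x * x)

def run_all(values):
    # start one process per input, then gather every result before
    # returning, so the Python Node receives a single list back
    q = mp.Queue()
    procs = [mp.Process(target=worker, args=(v, q)) for v in values]
    for p in procs:
        p.start()
    results = [q.get() for _ in procs]
    for p in procs:
        p.join()
    return results

if __name__ == "__main__":
    print(run_all([1, 2, 3, 4]))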

 

Alternatively, depending on your application, you could do as you already suggested and make multiple calls to different sessions. To do this you'd probably need asynchronously running VIs.
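If you went down that route, the Python side could shrink to a plain single-shot function, with the parallelism handled entirely by the asynchronously running VIs, each holding its own Python session. A rough sketch (the function name is just an example):

import os, time

def compute_one(delay):
    # one unit of work per Python Node call; no subprocesses needed,
    # because each LabVIEW session runs its own call in parallel
    time.sleep(delay)
    return os.getpid()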


Message 2 of 9

When I use multiprocessing inside my application, the sub-processes are not even getting started, so I cannot gather and return the results; if the sub-processes did start, I could do that. What is not clear to me is that the same application works fine from the command prompt, but when I use it with LabVIEW the multiprocessing part does not work.

 

Upon further analysis, I realized that the problem is with the LabVIEW Python Node: it is not allowing my application to create multiple sub-processes. So the question here is, does the LabVIEW Python Node support creating multiple subprocesses in the first place?

 

Thanks in advance.

Message 3 of 9
Solution
Accepted by topic author sudheer_raja

A little research later...

I think if you take a look at the response on StackOverflow here: Embedded python multiprocessing not working you might find what you need.

 

I was able to get the following to produce the expected results using LabVIEW and appropriate path constants:

import multiprocessing as mp
import time, os, sys

def GetIDWithWait(delay, q):
	time.sleep(delay)
	if q:
		q.put(os.getpid())
	return os.getpid()

def parallelFn(targetN, delayTime, offsetTime):
	pL = list()
	qL = list()
	pids = list()
	sys.argv = [os.path.abspath(__file__)]
	// This is the executable used by LabVIEW.
	// If you set this to e.g. "python" then you'll get an error
	// in LabVIEW, and reading the error body can give you the path you need...
	mp.set_executable(r"C:\...\Python\Python36-32\python.exe")
	for i in range(targetN):
		q = mp.Queue()
		qL.append(q)
		p = mp.Process(target=GetIDWithWait, args=(delayTime,q))
		pL.append(p)
		p.start()
		time.sleep(offsetTime)
	for i in range(targetN):
		id = qL[i].get()
		pids.append(id)
		pL[i].join()
	print(pids)
	return pids

GetPIDs.png

 

I didn't time this exactly, but I'd say it took about 5 seconds to run - 3 seconds + (0.5*4). It was certainly faster than 14 seconds (3.5*4).
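I didn't include it in the snippet above, but if you want to check the timing yourself from plain Python, you could append something like this to the same script (the values just mirror what I used: 4 processes, 3 s delay, 0.5 s offset):

import time

if __name__ == "__main__":
    start = time.perf_counter()
    pids = parallelFn(4, 3, 0.5)
    # roughly 3 + (0.5*4) = 5 seconds expected, versus 3.5*4 = 14 in sequence
    print(pids, round(time.perf_counter() - start, 1), "seconds")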

 

 


Message 4 of 9

@cbutcher wrote:

https://forums.ni.com/t5/LabVIEW/Python-multiprocessing-not-working-in-labview/m-p/3972611#M1132388


Besides changing the mp.set_executable to the full "..AppData\Local\Programs\Python\Python36-32\python.exe", I also had to change the comment sequence from // to # to make pyExample.py execute via your Python Node example snippet.
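For anyone copying the snippet, the corrected lines (with the module import already at the top of the script) end up looking roughly like this - the interpreter path is still shortened here, use your own install location:

import multiprocessing as mp

# This is the executable used by LabVIEW.
# If you set this to e.g. "python" then you'll get an error in LabVIEW,
# and reading the error body can give you the path you need...
mp.set_executable(r"C:\...\AppData\Local\Programs\Python\Python36-32\python.exe")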

 

@cbutcher wrote:

I didn't time this exactly, but I'd say it took about 5 seconds to run - 3 seconds + (0.5*4). It was certainly faster than 14 seconds (3.5*4).

 

Do I get this right: if pyExample.py were executed in sequence it would take at least 14 seconds, but when run in parallel ("multiprocessing") it takes about 5 seconds? It also takes about 5 seconds to execute on my machine:

5seconds.png

 

 

Message 5 of 9

@alexderjuengere wrote:

I also had to change the comment sequence from // to # to make pyExample.py execute via your Python Node example snippet.


Urgh. This is what I get for adding comments on the forum and not testing again after I wrote the code (without comments). Foolish!


Message 6 of 9

Thanks. I will try this approach and let you know.

Message 7 of 9

@cbutcher wrote:

@alexderjuengere wrote:

I also had to change the comment sequence from // to # to make pyExample.py execute via your Python Node example snippet.


Urgh. This is what I get for adding comments on the forum and not testing again after I wrote the code (without comments). Foolish!


A while ago, I made something similar.

Message 8 of 9
Solution
Accepted by topic author sudheer_raja

Adding the multiprocessing Python executable solved the problem. I am summarizing the solution in this post for others.

 

The problem I was having is that the multiprocessing feature in LabVIEW's embedded Python is effectively disabled by default, and because of that my application was unable to start and manage sub-processes. Explicitly pointing to the standard Python interpreter from within the script solved my problem. This is done by setting both the multiprocessing executable and sys.executable. Additionally, you need to set sys.argv as well, since argv is also unavailable in LabVIEW's embedded Python by default.

 

I am adding a small code snippet for your reference (taken from @cbutcher's solution).

 

Before:

 

import multiprocessing as mp
import time, os, sys

def GetIDWithWait(delay, q):
	time.sleep(delay)
	if q:
	    q.put(os.getpid())
	return os.getpid()

def parallelFn(targetN, delayTime, offsetTime):
	pL = list()
	qL = list()
	pids = list()
	for i in range(targetN):
		q = mp.Queue()
		qL.append(q)
		p = mp.Process(target=GetIDWithWait, args=(delayTime,q))
		pL.append(p)
		p.start()
		time.sleep(offsetTime)
	for i in range(targetN):
		id = qL[i].get()
		pids.append(id)
		pL[i].join()
	print(pids)
	return pids

 

After:

 

import multiprocessing as mp
import time, os, sys

# This enables the multiprocessing for the parent main process
# Put your python environment's python.exe path here
sys.executable = r"C:\...\Python\Python36-32\python.exe"

def GetIDWithWait(delay, q):
	time.sleep(delay)
	if q:
		q.put(os.getpid())
	return os.getpid()

def parallelFn(targetN, delayTime, offsetTime):
	pL = list()
	qL = list()
	pids = list()
	sys.argv = [os.path.abspath(__file__)]
	# This enables the multiprocessing module for child sub-processes
	# Put your python environment's python.exe path here
	mp.set_executable(r"C:\...\Python\Python36-32\python.exe")
	for i in range(targetN):
		q = mp.Queue()
		qL.append(q)
		p = mp.Process(target=GetIDWithWait, args=(delayTime,q))
		pL.append(p)
		p.start()
		time.sleep(offsetTime)
	for i in range(targetN):
		id = qL[i].get()
		pids.append(id)
		pL[i].join()
	print(pids)
	return pids
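One possible refinement (just a sketch, not something I have tested further): the overrides could be applied only when the script is not already running under a normal python.exe, so the same file keeps working unmodified from the command prompt. The helper name and path constant below are only examples:

import multiprocessing as mp
import os, sys

# Placeholder - substitute your environment's python.exe
PYTHON_EXE = r"C:\...\Python\Python36-32\python.exe"

def patch_embedded_interpreter():
    # Only override the interpreter settings when not already running
    # under a regular python.exe (i.e. when embedded, e.g. in LabVIEW).
    # sys.executable can be empty or None in an embedded interpreter.
    if not os.path.basename(sys.executable or "").lower().startswith("python"):
        sys.executable = PYTHON_EXE
        sys.argv = [os.path.abspath(__file__)]
        mp.set_executable(PYTHON_EXE)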

 

 

Message 9 of 9