How to run multiple bash scripts in parallel from a Python master script?


I have a series of time-consuming, independent bash scripts that I want to run in parallel on 7 CPU cores from a Python master script. I tried to implement this with the multiprocessing.Pool.map() function, iterating over the xrange(1, 300) sequence; each number is used to build the name of the directory containing the bash script to be executed. The issue with the following script is that it spawns 7 processes, each bash script runs, and then everything finishes right after those 7 are completed.

```python
import multiprocessing
import os

a = os.getcwd()  # gets current path

def run(x):
    b = a + '/' + 'dir%r' % (x)  # appends name of targeted folder to path
    os.chdir(b)  # switches to targeted directory
    os.system('chmod +x run.sh')
    os.system('./run.sh')  # runs the time-consuming script

if __name__ == "__main__":
    procs = 7
    p = multiprocessing.Pool(procs)
    p.map(run, xrange(1, 300))
    print "====done===="
```

I expected the other 292 shell scripts to run as well. Can anyone suggest a fix or an alternative implementation? Thank you!
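One possible direction, sketched below under assumptions: Python 3 syntax, the subdirectories are named dir1 ... dir299 under a common base path, and each contains a run.sh. The function name run_all and its parameters are illustrative, not from the original post. Passing cwd= to subprocess runs each script inside its own directory without os.chdir(), which mutates the worker process's working directory, and invoking the script via bash makes the chmod +x step unnecessary:

```python
import os
import subprocess
from multiprocessing import Pool  # note: Pool, the class


def run(task):
    """Run one directory's run.sh and return its exit code."""
    base, x = task
    workdir = os.path.join(base, 'dir%d' % x)
    # cwd= executes the child in workdir; no os.chdir() needed,
    # and 'bash run.sh' works even without the executable bit set.
    return subprocess.call(['bash', 'run.sh'], cwd=workdir)


def run_all(base, first=1, last=299, procs=7):
    """Run dir<first>/run.sh .. dir<last>/run.sh on `procs` workers.

    Returns the list of exit codes, one per directory, in order.
    """
    tasks = [(base, x) for x in range(first, last + 1)]
    with Pool(procs) as pool:
        return pool.map(run, tasks)
```

Calling run_all(os.getcwd()) and then printing "====done====" reproduces the original flow; unlike the silent os.system() calls, a nonzero entry in the returned list flags a directory whose script failed.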

