multithreading - Python threads block on I/O operations
I have the following problem: whenever a child thread wants to perform an I/O operation (writing a file, downloading a file), the program hangs. In the example below the program hangs on opener.retrieve.
If I execute python main.py, the program blocks in the retrieve function. If I execute python ./src/tmp.py, it runs fine. I don't understand why - can someone explain what is happening?
I am using Python 2.7 on a Linux system (kernel 3.5.0-27).
File layout:

main.py
./src
    __init__.py
    tmp.py
main.py
import src.tmp
tmp.py
import threading
import urllib


class DownloaderThread(threading.Thread):
    def __init__(self, pool_sema, i):
        threading.Thread.__init__(self)
        self.pool_sema = pool_sema
        self.daemon = True
        self.i = i

    def run(self):
        try:
            opener = urllib.FancyURLopener({})
            opener.retrieve("http://www.greenteapress.com/thinkpython/thinkcspy.pdf",
                            "/tmp/" + str(self.i) + ".pdf")
        finally:
            # Let the main loop start the next download.
            self.pool_sema.release()


class Downloader(object):
    def __init__(self):
        maxthreads = 1
        self.pool_sema = threading.BoundedSemaphore(value=maxthreads)

    def download_folder(self):
        for i in xrange(20):
            self.pool_sema.acquire()
            print "Downloading", i
            t = DownloaderThread(self.pool_sema, i)
            t.start()


d = Downloader()
d.download_folder()
I managed to get it to work by hacking urllib.py - if you inspect it, you will see many import statements dispersed within its code, i.e. it imports things 'on the fly' rather than when the module loads.
So I didn't chase the exact trigger down - it isn't really worth investigating - but it comes down to a deadlock in Python's import system: when main.py imports src.tmp, the main thread holds the interpreter's import lock while the module-level code starts a worker thread and waits on the semaphore; the worker then hits one of urllib's function-level imports, blocks on that same lock, and neither side can make progress. You shouldn't run nontrivial code during an import - that's asking for trouble.
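The situation is easy to reproduce without urllib at all. Below is a minimal sketch (the module name deadlock_demo is made up): importing this module from another one hangs, while running it directly as a script is fine - exactly the main.py versus ./src/tmp.py difference above.

deadlock_demo.py

import threading

def worker():
    # A function-level import, just like the ones scattered through urllib.py.
    # On Python 2 the import statement takes the global import lock even if
    # the module is already loaded - and the main thread is still holding
    # that lock while it imports deadlock_demo.
    import json
    print "worker finished"

# Module-level code: runs while the importing thread holds the import lock.
t = threading.Thread(target=worker)
t.start()
t.join()    # main thread waits for the worker, worker waits for the lock

Running python -c "import deadlock_demo" hangs; running python deadlock_demo.py prints "worker finished", because a script executed as __main__ is not loaded under the import lock.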
If you insist on keeping the current structure, you can make it work by moving those scattered import statements to the beginning of urllib.py.
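The cleaner fix, though, is simply not to start the downloads at import time. A sketch, assuming tmp.py exposes an entry-point function (the name main below is my own choice) that main.py calls only after the import has finished and the import lock has been released:

tmp.py (replace the two module-level lines at the bottom)

def main():
    # Threads are started here, not while the module is being imported.
    d = Downloader()
    d.download_folder()

if __name__ == "__main__":
    main()

main.py

import src.tmp

# By the time this line runs, the import of src.tmp is complete and the
# import lock is free, so the worker threads' lazy imports inside urllib
# can go through.
src.tmp.main()

With this layout both python main.py and python ./src/tmp.py download the files.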