pem*_*ahl 8 python django subprocess
I have a Python program (a Django application, to be exact) that starts a subprocess with subprocess.Popen. Due to architectural constraints of my application, I cannot use Popen.terminate() to terminate the subprocess and Popen.poll() to check when it has terminated, because I cannot keep a reference to the started subprocess in a variable. Instead, I have to write the process id to a pidfile when the subprocess starts. When I want to stop the subprocess, I open this pidfile and stop it with os.kill(pid, signal.SIGTERM).
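For illustration, the write side of that pidfile amounts to something like the following (a simplified sketch; the spider name and file path are placeholders, not my actual setup):

import subprocess

# Simplified sketch: start the Scrapy crawler and record the child's
# process id so a later request can find it again.
crawler = subprocess.Popen(['scrapy', 'crawl', 'somespider'])
with open('scrapy_crawler_process.pid', 'w') as pidfile:
    pidfile.write(str(crawler.pid))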
My question is: how can I find out when the subprocess has really terminated? With signal.SIGTERM it takes roughly 1-2 minutes to finally terminate after the os.kill() call. First I thought os.waitpid() would be the right tool for this task, but when I call it after os.kill() it gives me OSError: [Errno 10] No child processes.
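The failure itself is easy to reproduce in isolation (assuming the pid belongs to a running process that was not spawned by the calling interpreter):

import os

pid = 2405  # some pid that is alive but is not a child of this process
os.waitpid(pid, os.WNOHANG)
# raises OSError: [Errno 10] No child processes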
By the way, I start and stop the subprocess with two forms in an HTML template, and the program logic is in a Django view. The exception gets displayed in my browser when my application is in debug mode. It is probably also important to know that the subprocess I call in my view (python manage.py crawlwebpages) itself calls another subprocess, namely an instance of a Scrapy crawler. I write the pid of this Scrapy instance to the pidfile, and that is the process I want to terminate.

Here is the relevant code:
def process_main_page_forms(request):
    if request.method == 'POST':
        if request.POST['form-type'] == u'webpage-crawler-form':
            template_context = _crawl_webpage(request)
        elif request.POST['form-type'] == u'stop-crawler-form':
            template_context = _stop_crawler(request)
    else:
        template_context = {
            'webpage_crawler_form': WebPageCrawlerForm(),
            'stop_crawler_form': StopCrawlerForm()}

    return render(request, 'main.html', template_context)

def _crawl_webpage(request):
    webpage_crawler_form = WebPageCrawlerForm(request.POST)

    if webpage_crawler_form.is_valid():
        url_to_crawl = webpage_crawler_form.cleaned_data['url_to_crawl']
        maximum_pages_to_crawl = webpage_crawler_form.cleaned_data['maximum_pages_to_crawl']

        program = 'python manage.py crawlwebpages' + ' -n ' + str(maximum_pages_to_crawl) + ' ' + url_to_crawl
        p = subprocess.Popen(program.split())

    template_context = {
        'webpage_crawler_form': webpage_crawler_form,
        'stop_crawler_form': StopCrawlerForm()}

    return template_context

def _stop_crawler(request):
    stop_crawler_form = StopCrawlerForm(request.POST)

    if stop_crawler_form.is_valid():
        with open('scrapy_crawler_process.pid', 'rb') as pidfile:
            process_id = int(pidfile.read().strip())
            print 'PROCESS ID:', process_id

        os.kill(process_id, signal.SIGTERM)
        os.waitpid(process_id, os.WNOHANG)  # This gives me the OSError
        print 'Crawler process terminated!'

    template_context = {
        'webpage_crawler_form': WebPageCrawlerForm(),
        'stop_crawler_form': stop_crawler_form}

    return template_context
What can I do? Thank you very much!

EDIT:

Thanks to the great answer given by Jacek Konieczny, I could solve my problem by changing the code in my _stop_crawler(request) function to the following:
def _stop_crawler(request):
    stop_crawler_form = StopCrawlerForm(request.POST)

    if stop_crawler_form.is_valid():
        with open('scrapy_crawler_process.pid', 'rb') as pidfile:
            process_id = int(pidfile.read().strip())

        # These are the essential lines
        os.kill(process_id, signal.SIGTERM)
        while True:
            try:
                time.sleep(10)
                os.kill(process_id, 0)
            except OSError:
                break
        print 'Crawler process terminated!'

    template_context = {
        'webpage_crawler_form': WebPageCrawlerForm(),
        'stop_crawler_form': stop_crawler_form}

    return template_context
The usual way to check whether a process is still running is to kill() it with signal 0. It does nothing to a running process and raises an OSError exception with errno=ESRCH if the process does not exist.
[jajcus@lolek ~]$ sleep 1000 &
[1] 2405
[jajcus@lolek ~]$ python
Python 2.7.3 (default, May 11 2012, 11:57:22)
[GCC 4.6.3 20120315 (release)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> os.kill(2405, 0)
>>> os.kill(2405, 15)
>>> os.kill(2405, 0)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
OSError: [Errno 3] No such process
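Building on that check, a small helper (just a sketch, not part of the original answer; the function name and defaults are made up) can poll with signal 0 until the process disappears or a timeout expires:

import errno
import os
import time

def wait_for_exit(pid, timeout=60.0, interval=0.5):
    # Poll the process with signal 0 until it disappears or the timeout
    # expires. Returns True once the process is gone, False on timeout.
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            os.kill(pid, 0)  # signal 0: existence check, affects nothing
        except OSError as err:
            if err.errno == errno.ESRCH:
                return True   # no such process -> it has terminated
            if err.errno != errno.EPERM:
                raise         # EPERM: process exists but belongs to another user
        time.sleep(interval)
    return False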
Whenever possible, though, the caller should remain the parent of the called process and use the wait() family of functions to handle its termination. That is what the Popen object does.
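For completeness, that preferred pattern (the parent keeps the Popen object and reaps its own child) looks roughly like this; the command line is only a placeholder:

import signal
import subprocess

# The parent holds on to the Popen object, so no pidfile is needed.
proc = subprocess.Popen(['python', 'manage.py', 'crawlwebpages',
                         '-n', '10', 'http://example.com'])

# ... later, when the crawler should stop:
proc.send_signal(signal.SIGTERM)  # equivalent to proc.terminate() on POSIX
proc.wait()                       # blocks until the child has really exited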