
Python multiprocessing program inside Docker

I am trying to test Python's multiprocessing inside a Docker container, but even though the processes are created successfully (I have 8 CPUs and 8 processes get created), they always occupy only one physical CPU. Here is my code:

from sklearn.externals.joblib.parallel import Parallel, delayed
import multiprocessing
import pandas
import numpy
from scipy.stats import linregress
import random
import logging

def applyParallel(dfGrouped, func):
    # Apply func to every group in parallel, one joblib worker per CPU reported by cpu_count()
    retLst = Parallel(n_jobs=multiprocessing.cpu_count())(delayed(func)(group) for name, group in dfGrouped)
    return pandas.concat(retLst)

def compute_regression(df):
    # Fit a linear regression on one group and return slope/intercept as a one-row DataFrame
    result = {}

    (slope,intercept,rvalue,pvalue,stderr) = linregress(df.date,df.value)
    result["slope"] = [slope]
    result["intercept"] = [intercept]

    return pandas.DataFrame(result)

if __name__ == '__main__':
    logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    logging.info("start")
    # Build a synthetic dataset: ~10,000 ids with ~100 (date, value) points each
    random_list = []
    for i in range(1,10000):
        for j in range(1,100):
            random_list.append({"id":i,"date":j,"value":random.random()})

    df = pandas.DataFrame(random_list)

    df = applyParallel(df.groupby('id'), compute_regression)
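For reference, a minimal diagnostic sketch for checking this from inside the container: `multiprocessing.cpu_count()` reports the number of logical CPUs on the host, while `os.sched_getaffinity(0)` (Linux only) returns the set of CPUs the current process is actually allowed to run on, which is what a cpuset restriction such as `docker run --cpuset-cpus="0"` shows up in. This is only a way to inspect the environment, not a fix from the original post.

    import multiprocessing
    import os

    # Total logical CPUs visible on the host (what cpu_count() reports)
    print("cpu_count:", multiprocessing.cpu_count())

    # CPUs this process may actually be scheduled on (Linux only);
    # a cgroup cpuset limit imposed by Docker is reflected here
    print("affinity:", os.sched_getaffinity(0))

If the affinity set contains only one CPU, the container is pinned to a single core, and `cpu_count()` will still report 8; in that case passing `len(os.sched_getaffinity(0))` as `n_jobs` instead of `multiprocessing.cpu_count()` at least avoids oversubscribing that one core.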

python docker python-multiprocessing

Score: 6 · Answers: 2 · Views: 4957
