Post by lar*_*off

TypeError when using XGBoost with dask distributed

Here is the code that reproduces the error on my machine:

import numpy as np
import xgboost as xgb
import dask.array as da
import dask.distributed
from dask_cuda import LocalCUDACluster
from dask.distributed import Client

X = da.from_array(np.random.randint(0,10,size=(10,10)))
Y = da.from_array(np.random.randint(0,10,size=(10,1)))

cluster = LocalCUDACluster(n_workers=4, threads_per_worker=1)
client = Client(cluster)

dtrain = xgb.dask.DaskDeviceQuantileDMatrix(client=client, data=X, label=Y)

params = {'tree_method': 'gpu_hist', 'objective': 'rank:pairwise', 'min_child_weight': 1, 'max_depth': 3, 'eta': 0.1}
watchlist = [(dtrain, 'train')]
reg = xgb.dask.train(client, params, dtrain, num_boost_round=10, evals=watchlist, verbose_eval=1)

Here is a summary of the error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-9-ff1b0329f2f9> in <module>
      1 params = {'tree_method':'gpu_hist','objective':'rank:pairwise','min_child_weight':1,'max_depth':3,'eta':0.1}
      2 watchlist = [(trainLong, 'train')]
----> 3 regLong = xgb.dask.train(client, params, trainLong, …
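For context on where the TypeError may come from: `DaskDeviceQuantileDMatrix` is documented to require GPU-resident inputs (e.g. cupy-backed dask arrays or dask-cuDF collections), while `da.from_array(np.random.randint(...))` produces numpy-backed chunks. Separately, xgboost expects a 1-D label vector, and the `Y` above has shape `(10, 1)`. A minimal sketch of both checks, using plain numpy stand-ins with the same shapes as the snippet above (the GPU conversion is shown only as a comment, since it needs cupy and a CUDA device):

```python
import numpy as np

# Stand-ins for the dask arrays in the question (same shapes).
X = np.random.randint(0, 10, size=(10, 10))
Y = np.random.randint(0, 10, size=(10, 1))

# 1) xgboost expects a 1-D label vector; flatten (10, 1) -> (10,)
y_flat = Y.ravel()
print(y_flat.shape)  # (10,)

# 2) For DaskDeviceQuantileDMatrix, the chunks must live on the GPU.
#    With the dask arrays from the question, the usual pattern is
#    (assuming cupy is installed and a GPU is available):
#
#    import cupy as cp
#    X_gpu = X_dask.map_blocks(cp.asarray)
#    Y_gpu = Y_dask.map_blocks(cp.asarray)
```

If the error persists after moving the data to the device, it is worth trying a plain `xgb.dask.DaskDMatrix` first to isolate whether the quantile matrix itself is the problem.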

python distributed gpu dask xgboost

6 votes · 0 answers · 376 views
