XGBoost (GPU) crashes during prediction
I am using the GPU version of XGBoost in Python, and it crashes whenever I try to run .predict. It works on smaller datasets, but not on my current problem.
train_final.shape, test_final.shape
((631761, 174), (421175, 174))
params = {
    'objective': 'multi:softmax',
    'eval_metric': 'mlogloss',
    'eta': 0.1,
    'max_depth': 6,
    'nthread': 4,
    'alpha': 0,
    'num_class': 5,
    'random_state': 42,
    'tree_method': 'gpu_hist',
    'silent': True
}
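For context, the failing workflow presumably looks something like the sketch below; y_train and num_round are hypothetical placeholders, since the original question does not show them:

import xgboost as xgb

# hypothetical sketch of the failing workflow; y_train and num_round
# are placeholders not shown in the original question
dtrain = xgb.DMatrix(train_final, label=y_train)
dtest = xgb.DMatrix(test_final)
bst = xgb.train(params, dtrain, num_round)
preds = bst.predict(dtest)  # crashes here on the 421,175-row test set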
GPU: GTX 1070, 6 GB
RAM: 32 GB
Can someone help me understand why this is happening?
Workaround:
Saving the model, deleting the booster, and then loading the model again should work around this. The crash is most likely the GPU running out of memory at prediction time: the trained booster still holds allocations left over from training, while a freshly loaded booster starts with none.
import xgboost as xgb
import joblib

# train on the GPU (params, dtrain, dtest and num_round as defined above)
bst = xgb.train(params, dtrain, num_round)
# save the trained model to disk
joblib.dump(bst, 'xgb_model.dat')
# delete the booster so the GPU memory it holds is released
del bst
# reload the saved model; the fresh booster carries no GPU allocations
bst = joblib.load('xgb_model.dat')
preds = bst.predict(dtest)
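An alternative worth trying (my suggestion, not part of the original answer): XGBoost releases from around the time of this post also accept a 'predictor' parameter, so prediction can be forced onto the CPU while training stays on the GPU:

# assumption: the installed XGBoost supports the 'predictor' parameter
# (present in releases of this era; deprecated in later versions)
bst.set_param({'predictor': 'cpu_predictor'})
preds = bst.predict(dtest)  # predicts on the CPU, avoiding GPU allocation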
Tags: xgboost, python, jupyter-notebook
Source: https://codeday.me/bug/20191013/1909643.html