What is the difference between xgb.train and xgb.XGBRegressor (or xgb.XGBClassifier)?

Problem Description


I already know "xgboost.XGBRegressor is a Scikit-Learn Wrapper interface for XGBoost."


But do they have any other difference?

Recommended Answer


xgboost.train is the low-level API for training a model via the gradient boosting method.


xgboost.XGBRegressor and xgboost.XGBClassifier are the wrappers ("Scikit-Learn-like wrappers", as they are called) that prepare the DMatrix and pass in the corresponding objective function and parameters. In the end, the fit call simply boils down to:

self._Booster = train(params, dmatrix,
                      self.n_estimators, evals=evals,
                      early_stopping_rounds=early_stopping_rounds,
                      evals_result=evals_result, obj=obj, feval=feval,
                      verbose_eval=verbose)


This means that everything that can be done with XGBRegressor and XGBClassifier is doable via the underlying xgboost.train function. The other way around is obviously not true; for instance, some useful parameters of xgboost.train are not supported in the XGBModel API. The list of notable differences includes:

  • xgboost.train allows setting callbacks that are applied at the end of each iteration.
  • xgboost.train allows training continuation via the xgb_model parameter.
  • xgboost.train allows not only minimization of the eval function, but maximization as well.

