XGBoost Notes

Posted by 0xcafe

1. Principle

//TODO

2. Python Package Scikit-Learn API

2.1 Input

Features come in two kinds: continuous, such as weight, and categorical, such as gender.

The Glossary of Common Terms and API Elements in scikit-learn has this passage:

Categorical Feature

A categorical or nominal feature is one that has a finite set of discrete values across the population of data. These are commonly represented as columns of integers or strings. Strings will be rejected by most scikit-learn estimators, and integers will be treated as ordinal or count-valued. For the use with most estimators, categorical variables should be one-hot encoded. Notable exceptions include tree-based models such as random forests and gradient boosting models that often work better and faster with integer-coded categorical variables. OrdinalEncoder helps encoding string-valued categorical features as ordinal integers, and OneHotEncoder can be used to one-hot encode categorical features. See also Encoding categorical features and the http://contrib.scikit-learn.org/categorical-encoding package for tools related to encoding categorical features.

The gist: when training tree-based models, integer coding of categorical features is recommended over one-hot encoding.

Details: https://scikit-learn.org/stable/glossary.html#glossary
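
For instance, a minimal sketch of that recommendation (the gender/weight values are made up to echo the example above; OrdinalEncoder is the scikit-learn tool named in the quote):

import numpy as np
from sklearn.preprocessing import OrdinalEncoder

# Toy data: one categorical column (gender) and one continuous column (weight).
X = np.array([["male", 70.0],
              ["female", 55.0],
              ["female", 60.0]], dtype=object)

# Integer-encode only the categorical column; the continuous one stays as-is.
enc = OrdinalEncoder()
gender_encoded = enc.fit_transform(X[:, [0]])   # female -> 0.0, male -> 1.0
X_encoded = np.hstack([gender_encoded, X[:, [1]].astype(float)])
print(X_encoded)   # numeric matrix, ready for a tree-based model like XGBoost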

2.2 Output

Only two objectives are covered here: multi:softmax and multi:softprob. The official documentation says:

multi:softmax: set XGBoost to do multiclass classification using the softmax objective, you also need to set num_class(number of classes)
multi:softprob: same as softmax, but output a vector of ndata * nclass, which can be further reshaped to ndata * nclass matrix. The result contains predicted probability of each data point belonging to each class.

Here is the gotcha: whichever of the two you set when building the model, after fit the printed model shows the objective as multi:softprob. Yet the predict output does not match the description above either: it is the multi:softmax-style result, i.e. predicted labels only, with no probability distribution.

The relevant official code is shown below. As it shows, num_class does not need to be set either, and objective is forcibly replaced with multi:softprob. If you want the probability distribution, use the predict_proba function for prediction.

self.classes_ = np.unique(y)
self.n_classes_ = len(self.classes_)

if self.n_classes_ > 2:
    # Switch to using a multiclass objective in the underlying XGB instance
    xgb_options["objective"] = "multi:softprob"
    xgb_options["num_class"] = self.n_classes_
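
A minimal sketch of this behavior on synthetic data (it assumes an XGBoost release whose fit logic matches the excerpt above; newer releases may print or behave slightly differently):

import numpy as np
from xgboost import XGBClassifier

rng = np.random.RandomState(0)
X = rng.rand(60, 4)                     # synthetic features
y = rng.randint(0, 3, size=60)          # 3 classes, so n_classes_ > 2

# Ask for multi:softmax; per the excerpt above it is replaced internally.
model = XGBClassifier(objective="multi:softmax")
model.fit(X, y)

print(model.objective)            # shows multi:softprob after fit
print(model.predict(X[:3]))       # predicted labels only, e.g. [1 0 2]
print(model.predict_proba(X[:3])) # a (3, 3) array of per-class probabilities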

3. Demo

//TODO
