I want to use the gbm package for logistic regression, but the answers it gives fall slightly outside the 0-1 range. I have tried the distribution arguments suggested for 0-1 prediction (bernoulli and adaboost), but in practice these turned out even worse than using gaussian.

GBM_NTREES = 150
GBM_SHRINKAGE = 0.1
GBM_DEPTH = 4
GBM_MINOBS = 50
> GBM_model <- gbm.fit(
+ x = trainDescr
+ ,y = trainClass
+ ,distribution = "gaussian"
+ ,n.trees = GBM_NTREES
+ ,shrinkage = GBM_SHRINKAGE
+ ,interaction.depth = GBM_DEPTH
+ ,n.minobsinnode = GBM_MINOBS
+ ,verbose = TRUE)
Iter   TrainDeviance   ValidDeviance   StepSize   Improve
     1        0.0603             nan     0.1000    0.0019
     2        0.0588             nan     0.1000    0.0016
     3        0.0575             nan     0.1000    0.0013
     4        0.0563             nan     0.1000    0.0011
     5        0.0553             nan     0.1000    0.0010
     6        0.0546             nan     0.1000    0.0008
     7        0.0539             nan     0.1000    0.0007
     8        0.0533             nan     0.1000    0.0006
     9        0.0528             nan     0.1000    0.0005
    10        0.0524             nan     0.1000    0.0004
   100        0.0484             nan     0.1000    0.0000
   150        0.0481             nan     0.1000   -0.0000
> prediction <- predict.gbm(object = GBM_model
+ ,newdata = testDescr
+ ,GBM_NTREES)
> hist(prediction)
> range(prediction)
[1] -0.02945224  1.00706700

Bernoulli:
GBM_model <- gbm.fit(
x = trainDescr
,y = trainClass
,distribution = "bernoulli"
,n.trees = GBM_NTREES
,shrinkage = GBM_SHRINKAGE
,interaction.depth = GBM_DEPTH
,n.minobsinnode = GBM_MINOBS
,verbose = TRUE)
prediction <- predict.gbm(object = GBM_model
+ ,newdata = testDescr
+ ,GBM_NTREES)
> hist(prediction)
> range(prediction)
[1] -4.699324  3.043440

And adaboost:
GBM_model <- gbm.fit(
x = trainDescr
,y = trainClass
,distribution = "adaboost"
,n.trees = GBM_NTREES
,shrinkage = GBM_SHRINKAGE
,interaction.depth = GBM_DEPTH
,n.minobsinnode = GBM_MINOBS
,verbose = TRUE)
> prediction <- predict.gbm(object = GBM_model
+ ,newdata = testDescr
+ ,GBM_NTREES)
> hist(prediction)
> range(prediction)
[1] -3.0374228  0.9323279

Am I doing something wrong? Do I need to preprocess the data (scaling, centering)? Or do I need to manually cap the predicted values with something like:
prediction <- ifelse(prediction < 0, 0, prediction)
prediction <- ifelse(prediction > 1, 1, prediction)

Best Answer

From ?predict.gbm:

    Returns a vector of predictions. By default the predictions are on the
    scale of f(x). For example, for the Bernoulli loss the returned value is
    on the log odds scale, poisson loss on the log scale, and coxph is on the
    log hazard scale. If type="response" then gbm converts back to the same
    scale as the outcome. Currently the only effect this will have is
    returning probabilities for bernoulli and expected counts for poisson.

So if you use distribution = "bernoulli", you need to transform the predicted values to put them back on the [0, 1] scale: p <- plogis(predict.gbm(model)). Using distribution = "gaussian" really performs regression rather than classification, although I am surprised the predictions are not inside [0, 1]: my understanding is that gbm is still tree-based, so the predicted values should not be able to stray outside the values seen in the training data.
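For example, with the bernoulli fit above, either of the following should give probabilities in [0, 1] (a minimal sketch reusing the question's GBM_model, testDescr and GBM_NTREES, and relying on the standard predict.gbm interface):

# Option 1: predict on the link (log-odds) scale, then apply the inverse logit
pred_link <- predict(GBM_model, newdata = testDescr, n.trees = GBM_NTREES)
pred_prob <- plogis(pred_link)      # plogis(x) = exp(x) / (1 + exp(x))

# Option 2: let gbm convert back to the response scale itself
pred_prob2 <- predict(GBM_model, newdata = testDescr, n.trees = GBM_NTREES,
                      type = "response")   # probabilities for bernoulli

range(pred_prob)    # both versions should now lie within [0, 1]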

A similar question about logistic regression with R's gbm can be found on Stack Overflow: https://stackoverflow.com/questions/8410846/
