Problem Description
For the last three days I have been trying to install xgboost with GPU support on my macOS Mojave (10.14.6), without success. I tried two approaches:
- pip install xgboost
xgboost installs this way and runs successfully without the GPU option (i.e., without tree_method='gpu_hist').
I want to run with gpu_hist by setting tree_method='gpu_hist' in the tree parameters. When I do, the following error occurs:
XGBoostError: [12:10:34] /Users/travis/build/dmlc/xgboost/src/gbm/../common/common.h:153: XGBoost version not compiled with GPU support.
Stack trace:
[bt] (0) 1 libxgboost.dylib 0x000000012256ba60 dmlc::LogMessageFatal::~LogMessageFatal() + 112
[bt] (1) 2 libxgboost.dylib 0x00000001225f92b3 xgboost::gbm::GBTree::ConfigureUpdaters() + 531
[bt] (2) 3 libxgboost.dylib 0x00000001225f8b97 xgboost::gbm::GBTree::Configure(std::__1::vector<std::__1::pair<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > >, std::__1::allocator<std::__1::pair<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > > > > const&) + 967
[bt] (3) 4 libxgboost.dylib 0x0000000122611a0c xgboost::LearnerConfiguration::Configure() + 1500
[bt] (4) 5 libxgboost.dylib 0x0000000122611e68 xgboost::LearnerImpl::UpdateOneIter(int, std::__1::shared_ptr<xgboost::DMatrix>) + 120
[bt] (5) 6 libxgboost.dylib 0x000000012256331d XGBoosterUpdateOneIter + 157
[bt] (6) 7 libffi.7.dylib 0x0000000102102ead ffi_call_unix64 + 85
[bt] (7) 8 ??? 0x00007ffeee291da0 0x0 + 140732894092704
- My second approach:
git clone --recursive https://github.com/dmlc/xgboost
cd xgboost
make -j4
cd python-package
python3 setup.py install
Though this installs xgboost, it throws the following error while running this statement: dtrain = xgb.DMatrix(df_train_features, label=df_train_label)  # , missing=-999
AttributeError: dlsym(0x7ffe9aed62f0, XGDMatrixSetDenseInfo): symbol not found
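That "symbol not found" error means the libxgboost shared library that Python actually loaded does not export XGDMatrixSetDenseInfo, i.e., an older copy of the library is being picked up instead of the fresh build. As an illustrative sketch of what dlsym-style lookup does (demonstrated against libc so it runs anywhere; for the real check you would load the installed libxgboost.dylib and look up XGDMatrixSetDenseInfo):

```python
import ctypes
import ctypes.util

def has_symbol(lib, name):
    """Return True if the loaded shared library exports the symbol `name`."""
    try:
        getattr(lib, name)  # ctypes resolves the symbol via dlsym under the hood
        return True
    except AttributeError:  # this is the same failure mode as the error above
        return False

# Demo against the C library, which is available everywhere.
libc = ctypes.CDLL(ctypes.util.find_library("c"))
print(has_symbol(libc, "printf"))                 # exported symbol
print(has_symbol(libc, "no_such_symbol_xyz"))     # missing symbol
```

If the same check against the libxgboost.dylib your interpreter loads reports the symbol missing, a stale library is shadowing the new one, which is exactly what the answer below addresses.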
Any help would be appreciated.
Recommended Answer
The same happened to me due to an outdated libxgboost.so that somehow wasn't updated during a fresh install.
To resolve the issue and install xgboost with GPU support, you should do something like:
# remove libxgboost.so manually from where it resides
rm /home/$USER/anaconda3/lib/libxgboost.so
pip uninstall -y xgboost
git clone --recursive https://github.com/dmlc/xgboost
cd xgboost && mkdir build && cd build && cmake .. -DUSE_CUDA=ON
make -j12
cd ../python-package
python setup.py install
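The key step above is removing the stale shared library before reinstalling, so the fresh build is the one that gets loaded. A small helper to scan likely install locations for leftover copies can be sketched as follows (the candidate path is an assumption taken from the answer above; adjust it to your environment):

```python
import os

def find_stale_libs(search_dirs, names=("libxgboost.so", "libxgboost.dylib")):
    """Return paths of leftover xgboost shared libraries under search_dirs."""
    found = []
    for d in search_dirs:
        for root, _dirs, files in os.walk(d):
            for f in files:
                if f in names:
                    found.append(os.path.join(root, f))
    return found

# Typical location to check (assumption, mirroring the rm command above):
candidates = [os.path.expanduser("~/anaconda3/lib")]
for path in find_stale_libs([d for d in candidates if os.path.isdir(d)]):
    print("stale copy:", path)
```

Any path this prints after a supposedly clean reinstall is a copy worth removing manually before running pip or setup.py again.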