
xgboost in Python: where are the definitions of predictions, gradient, Hessian and loss function?

I am searching for the code chunks with the definitions of

  • logistic loss
  • prediction
  • gradient
  • Hessian matrix

used by xgboost in its Python implementation available at https://github.com/dmlc/xgboost/tree/master/python-package/xgboost (*)

I can derive the above formulae analytically from the original paper by Chen and Guestrin (available at https://arxiv.org/abs/1603.02754), but I need to see where they live in the code.
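For concreteness, the formulae I derived for the binary:logistic objective, with $\hat{y}$ the raw margin score, $\sigma(\hat{y}) = 1/(1+e^{-\hat{y}})$ the predicted probability, and $y \in \{0,1\}$ the label, are:

$$
\ell(y, \hat{y}) = -\bigl[\,y \log \sigma(\hat{y}) + (1-y)\log\bigl(1-\sigma(\hat{y})\bigr)\bigr],
\qquad
g = \frac{\partial \ell}{\partial \hat{y}} = \sigma(\hat{y}) - y,
\qquad
h = \frac{\partial^2 \ell}{\partial \hat{y}^2} = \sigma(\hat{y})\bigl(1-\sigma(\hat{y})\bigr).
$$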

Can somebody point out the location of those definitions in (*)? A quick search turned up nothing.

Those are implemented in C/C++ for performance, so I don't think you will find them under /python-package; the Python package is mostly a wrapper around the native library.

You should look in https://github.com/dmlc/xgboost/tree/master/src instead.
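In the meantime, here is a minimal NumPy sketch of the three quantities for the binary:logistic objective, written from the formulas in the paper rather than copied from the C++ source. The `(grad, hess)` return pair deliberately has the shape xgboost expects from a custom objective, so you can cross-check it against the built-in objective via the `obj` argument of `xgb.train`; the function names here are my own, not xgboost's.

```python
import numpy as np

def sigmoid(x):
    # logistic function: maps a raw margin score to a probability
    return 1.0 / (1.0 + np.exp(-x))

def logloss(preds, labels):
    # per-row binary cross-entropy, evaluated on raw margin scores
    p = sigmoid(preds)
    return -(labels * np.log(p) + (1.0 - labels) * np.log(1.0 - p))

def grad_hess(preds, labels):
    # first and second derivatives of the log loss with respect to
    # the raw score: g = p - y, h = p * (1 - p)
    p = sigmoid(preds)
    grad = p - labels
    hess = p * (1.0 - p)
    return grad, hess
```

A function with this `(preds, dtrain) -> (grad, hess)` contract (reading labels with `dtrain.get_label()`) can be passed as a custom objective to reproduce what the built-in `binary:logistic` objective computes natively.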

