This document compares gradient boosted machines (GBM) and deep learning. It notes that while deep learning offers powerful predictive performance, GBM models are often easier to interpret because they are built from decision trees. GBM works by combining many weak decision tree models, each correcting the errors of the ones before it, to classify data more accurately: like moving a heavy rock bit by bit. The document cites GBM's success in Kaggle competitions, where it appears in the majority of winning solutions, sometimes combined with neural networks. In summary, the document presents GBM and deep learning side by side, highlighting GBM's interpretability and competitive track record while also noting deep learning's strengths.
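The "bit by bit" idea above can be sketched in a few lines: each boosting round fits a weak learner (here a depth-1 decision stump) to the current residuals, and the ensemble prediction is the running sum of small corrective steps. This is a minimal illustrative sketch, not the webinar's code; the function names and the toy data are invented for the example.

```python
def fit_stump(xs, residuals):
    """Find the split on a 1-D feature that best reduces squared error."""
    best = None
    for split in xs:
        left = [r for x, r in zip(xs, residuals) if x <= split]
        right = [r for x, r in zip(xs, residuals) if x > split]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, split, lmean, rmean)
    _, split, lmean, rmean = best
    return lambda x: lmean if x <= split else rmean

def gradient_boost(xs, ys, n_rounds=50, lr=0.1):
    """Build an ensemble of n_rounds stumps, each fit to the residuals."""
    base = sum(ys) / len(ys)          # start from the mean prediction
    stumps = []
    preds = [base] * len(xs)
    for _ in range(n_rounds):
        # residuals are the negative gradient of squared loss
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        # take a small step: the "rock" moves a little each round
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

# Toy usage: learn a step function from six points.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
model = gradient_boost(xs, ys)
```

Because each round only nudges the prediction by a fraction (`lr`) of the stump's output, no single tree dominates; accuracy comes from the accumulation of many weak corrections.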
Machine Learning Wars: Deep Learning vs Gradient Boosting [Webinar]
Reference (slide 36): XGBoost: A Scalable Tree Boosting System
https://www.kdd.org/kdd2016/papers/files/rfp0697-chenAemb.pdf
Slide: 2015 Kaggle grand prix, among 29 winning solutions (podium graphic):
1st place: GBM, 17 solutions
2nd place: Deep Neural Nets, 11 solutions
3rd place: GBM + Neural Nets, 9 solutions