Why is XGBoost given so much less attention than deep learning despite its ubiquity in winning Kaggle solutions?

Because Kaggle is not the end of the world!

Deep learning methods require a lot more training data than XGBoost, SVMs, AdaBoost, random forests, etc. On the other hand, so far only deep learning methods have been able to "absorb" huge amounts of training data without saturating in performance. So when you don't have large amounts of training examples (say, in the 100k or 1,000k range), alternatives to deep learning are superior in performance: they don't overfit, and they are often computationally cheaper to begin with. In some cases you might even have big data available, but the function you are trying to model might be so simple that a small subset of that data is enough to learn it; here, computational considerations and experimental cycle times might cause you to prefer XGBoost etc.

Secondly, designing a network that correctly encodes problem-domain-specific priors (such as the respective deep convolutional network architectures for images or speech) is a challenging problem. And reasonable archit…
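To make the small-data argument concrete, here is a minimal sketch (assuming the Python `xgboost` and `scikit-learn` packages; the dataset and hyperparameters are illustrative, not from the answer above). On a few hundred tabular examples, a gradient-boosted tree model trains in seconds with little tuning, which is exactly the regime the answer describes:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# ~570 samples, 30 features: far below the data scale where deep nets shine.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Illustrative hyperparameters; on data this small, defaults already work well.
model = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

The whole run takes seconds on a CPU, so the experimental cycle time mentioned above stays short.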
