No. 37 (254), issue 10, pages 82-89


P.N. Druzhkov, N.Yu. Zolotykh, A.N. Polovinkin
Several variants of a parallel implementation of the Gradient Boosting Trees (GBT) supervised learning algorithm using Intel Threading Building Blocks are described. Results of an experimental comparison and a performance analysis of the different parallelization approaches are discussed.
Keywords: gradient boosting trees, Intel Threading Building Blocks.
1. Breiman L. Random Forests. Machine Learning, 2001, v. 45, no. 1, pp. 5 - 32.
2. Breiman L., Friedman J., Olshen R., Stone C. Classification and Regression Trees. Wadsworth, 1984.
3. Breiman L. Bagging Predictors. Machine Learning, 1996, v. 24, no. 2, pp. 123 - 140.
4. Enzweiler M., Gavrila D.M. Monocular Pedestrian Detection: Survey and Experiments. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2009, v.31, no. 12, pp. 2179 - 2195.
5. Freund Y., Schapire R. Experiments with a New Boosting Algorithm. Machine Learning: Proceedings of the Thirteenth International Conference. San Francisco, Morgan Kaufmann, 1996, pp. 148 - 156.
6. Friedman J.H. Greedy Function Approximation: a Gradient Boosting Machine. Technical report. Dept. of Statistics, Stanford University, 1999.
7. Friedman J.H. Stochastic Gradient Boosting. Technical report. Dept. of Statistics, Stanford University, 1999.
8. Geurts P., Ernst D., Wehenkel L. Extremely Randomized Trees. Machine Learning, 2006, v. 63, no. 1, pp. 3 - 42.
9. Hastie T., Tibshirani R., Friedman J. The Elements of Statistical Learning. Springer-Verlag, 2008.
10. Intel Threading Building Blocks. Available at: (accessed 07.12.2010).
11. OpenCV Wiki. Available at: (accessed 07.12.2010).
12. UCI Machine Learning Repository. Available at: (accessed 07.12.2010).
13. Vapnik V. Estimation of Dependences Based on Empirical Data. Springer, 1982.