Ensembling is one of the hottest techniques in today’s predictive analytics competitions. Nearly every recent winner of Kaggle.com and KDD competitions has used an ensemble method, often built on well-known algorithms such as XGBoost and Random Forest.
Are these competition victories paving the way for widespread organizational adoption of these techniques? This session provides a detailed overview of ensemble models and their origins, and shows why they are so effective. We will explain the building blocks of virtually all ensemble techniques, including bagging and boosting.
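To make the bagging building block concrete, here is a minimal sketch in plain Python (not the session's materials): it trains simple one-feature decision stumps on bootstrap resamples of the data and combines them by majority vote. The stump learner and the toy data are illustrative assumptions, chosen only to keep the example self-contained.

```python
import random

def train_stump(data):
    """Fit a one-feature decision stump: choose the threshold (taken from
    the observed x values) that best separates the labels in this sample.
    data is a list of (x, y) pairs with y in {0, 1}."""
    best_t, best_acc = None, -1.0
    for t, _ in data:
        preds = [1 if x >= t else 0 for x, _ in data]
        acc = sum(p == y for p, (_, y) in zip(preds, data)) / len(data)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def bagging_fit(data, n_models=25, seed=0):
    """Bagging: train each stump on a bootstrap resample
    (sampling with replacement) of the training data."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        sample = [rng.choice(data) for _ in data]
        models.append(train_stump(sample))
    return models

def bagging_predict(models, x):
    """Aggregate the ensemble by majority vote."""
    votes = sum(1 if x >= t else 0 for t in models)
    return 1 if votes * 2 >= len(models) else 0

# Toy, linearly separable data: class 1 for large x.
data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]
models = bagging_fit(data)
```

Boosting differs in that models are trained sequentially, each one reweighting the examples its predecessors got wrong, rather than independently on random resamples.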
What You Will Learn: