Abhay Pawar
Machine Learning @ Instacart. Blogger. Loves data. https://www.linkedin.com/in/abhayspawar/
Nov 5 · 7 min read
My secret sauce to be in top 2% of a kaggle competition
Competing in kaggle competitions is fun and addictive! And over the
last couple of years, I developed some standard ways to explore
features and build better machine learning models. These simple, but
powerful techniques helped me get a top 2% rank in the Instacart Market
Basket Analysis competition, and I use them outside of kaggle as well.
So, let’s get right into it!
One of the most important aspects of building any supervised learning
model on numeric data is to understand the features well. Looking at
partial dependence plots of a model helps you understand how the
model’s output changes with any feature.
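For instance, here is a minimal sketch of such a plot using scikit-learn's partial dependence tools. The dataset, model, and feature name below are placeholders chosen for illustration, not something from the competition:

```python
# A minimal, illustrative sketch (not from this article) of a partial dependence
# plot with scikit-learn. The dataset, model, and feature name are assumptions
# chosen for demonstration only.
import matplotlib.pyplot as plt
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import PartialDependenceDisplay

# Fit a simple model on a toy regression dataset
X, y = fetch_california_housing(return_X_y=True, as_frame=True)
model = GradientBoostingRegressor(n_estimators=100, random_state=0).fit(X, y)

# Show how the trained model's prediction changes with one feature (median income)
PartialDependenceDisplay.from_estimator(model, X, features=["MedInc"])
plt.show()
```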
But the problem with these plots is that they are created using a
trained model. If we could create these plots from train data directly,
they could help us understand the data better.
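As a rough sketch of that idea (assuming a pandas DataFrame with placeholder column names, not the article's own code), you can bin a numeric feature and plot the mean target per bin straight from the training data, with no model involved:

```python
# A hedged sketch of the "plots from train data" idea: bin a numeric feature and
# plot the mean target per bin, without training a model first. The DataFrame and
# column names ("feature", "target") are placeholders, not the article's code.
import matplotlib.pyplot as plt
import pandas as pd

def plot_feature_vs_target(df: pd.DataFrame, feature: str, target: str, bins: int = 10):
    """Cut `feature` into equal-population bins and plot the mean of `target` per bin."""
    binned = pd.qcut(df[feature], q=bins, duplicates="drop")
    bin_means = df.groupby(binned, observed=True)[target].mean()
    bin_means.plot(marker="o")
    plt.xlabel(f"{feature} (binned)")
    plt.ylabel(f"mean {target}")
    plt.show()

# Usage, assuming a training DataFrame `train` with a numeric feature column
# and a target column:
# plot_feature_vs_target(train, "feature", "target")
```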