My Data Science Journey


Projects, descriptions, tutorials, my journey


A New Beginning (hopefully)

The last five months have been some of the most challenging and yet most fulfilling of my educational experiences, and I have three advanced degrees, so that says something. The journey to a career in data science is only partially complete: the framework has been set, and I'm ready to move in. Flatiron has provided a solid program and fantastic instruction from encouraging, exciting people.


Found Something

Models are funny things. You can use all manner of calculus, statistics, and even linear algebra to shape and reshape data into something unrecognizable and still end up with a way to make predictions with measurable accuracy. In an effort to make sense of my model, I ran across something even my instructor had not seen: SHAP (SHapley Additive exPlanations), a game-theoretic approach to explaining the output of any machine learning model. It can break a single prediction down to show the impact of each feature, which leads to some amazing visualizations. After this boot camp, I will need to explore modeling with SHAP.
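To show what "game theoretic" means here, a minimal sketch of exact Shapley values on a hypothetical toy model (the real SHAP library approximates this efficiently for trained models; the `model` function and `BASELINE` below are made up for illustration). Each feature's attribution is its average marginal contribution over all subsets of the other features:

```python
from itertools import combinations
from math import factorial

# Hypothetical toy "model": additive terms plus one interaction.
# SHAP itself works on real trained models; this stands in for one.
def model(x):
    return 2.0 * x[0] + 1.0 * x[1] + 0.5 * x[0] * x[2]

BASELINE = [0.0, 0.0, 0.0]  # reference input: all features "absent"

def value(subset, x):
    """Model output with only the features in `subset` taken from x."""
    masked = [x[i] if i in subset else BASELINE[i] for i in range(len(x))]
    return model(masked)

def shapley_values(x):
    n = len(x)
    phis = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        phi = 0.0
        for size in range(n):
            for s in combinations(others, size):
                # Shapley weight: |S|! (n - |S| - 1)! / n!
                weight = factorial(len(s)) * factorial(n - len(s) - 1) / factorial(n)
                # Marginal contribution of feature i given coalition S
                phi += weight * (value(set(s) | {i}, x) - value(set(s), x))
        phis.append(phi)
    return phis

x = [1.0, 2.0, 3.0]
phi = shapley_values(x)
# Efficiency property: attributions sum to model(x) - model(BASELINE)
print(phi, sum(phi), model(x) - model(BASELINE))
```

Note how the 0.5·x0·x2 interaction (worth 1.5 here) gets split evenly between features 0 and 2 — that fair division of shared credit is exactly the game-theory part.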


Feature Importance

While working with XGBoost on my customer churn model, I found myself exploring feature importance. Something didn't add up, and I was having trouble explaining why. That's when I stumbled across this article: Interpretable Machine Learning with XGBoost.
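Part of why feature importance "doesn't add up" is that XGBoost's built-in metrics (weight, gain, cover) can rank the same features differently. A model-agnostic sanity check is permutation importance: scramble one column and see how much the error grows. A toy sketch, using a hypothetical hand-rolled `predict` in place of a trained booster:

```python
import random

random.seed(0)

# Tiny synthetic dataset: the target depends strongly on x0,
# weakly on x1, and not at all on x2.
X = [[random.random(), random.random(), random.random()] for _ in range(200)]
y = [3.0 * x0 + 0.5 * x1 for x0, x1, _ in X]

def predict(row):
    # Stand-in "model" that matches the data-generating rule exactly.
    return 3.0 * row[0] + 0.5 * row[1]

def mse(X, y):
    return sum((predict(r) - t) ** 2 for r, t in zip(X, y)) / len(y)

def permutation_importance(X, y, col):
    """Increase in MSE when column `col` is randomly shuffled."""
    base = mse(X, y)
    shuffled = [row[col] for row in X]
    random.shuffle(shuffled)
    X_perm = [row[:col] + [v] + row[col + 1:] for row, v in zip(X, shuffled)]
    return mse(X_perm, y) - base

importances = [permutation_importance(X, y, c) for c in range(3)]
print(importances)  # x0 matters most, x1 a little, x2 not at all
```

With a real model you would swap in `booster.predict` and held-out data; the point is that this one definition of importance is tied directly to predictive error, which makes disagreements with the split-based metrics easier to reason about.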


Did Something

I wasn’t happy with my project. Everyone else was able to use machine learning in theirs, but mine didn’t lend itself to a good machine learning model, even though it made fantastic use of graphics and EDA. As much as I love creating graphical representations of my data, I wanted to use my newly learned skills. So…I changed my project. It wasn’t perfect, but it was tremendously satisfying, and I finished it in two and a half days!