From SHAP to EBM: Explain your Gradient Boosting Models in Python (sps24)

Chaos Computer Club - archive feed · Emanuele Fabbiani

October 18, 2024 · 35m 54s


Show Notes

XGBoost is considered a state-of-the-art model for regression, classification, and learning-to-rank problems on tabular data. Unfortunately, tree-based ensemble models are notoriously difficult to explain, which limits their adoption in critical fields. Techniques such as SHapley Additive exPlanations (SHAP) and the Explainable Boosting Machine (EBM) have become common methods for assessing how much each feature contributes to a model's prediction. This talk introduces SHAP and EBM, explains the theory behind each in an accessible way, and discusses the pros and cons of both techniques. We will also walk through Python snippets in which SHAP and EBM are used to explain a gradient boosting model. Attendees will leave with an understanding of how SHAP and EBM work, the limitations and merits of each technique, and a tutorial on using these methods in Python via the "shap" and "interpret-ml" packages.

About this event: https://c3voc.de

Topics

Data Science & More · Aula · 2024