Feature Importance Analyzer
by Community · open-source · Last verified 2026-03-17
Computes and visualizes feature importance using SHAP (TreeExplainer, KernelExplainer), permutation importance, and Boruta for any scikit-learn-compatible model. Generates HTML dashboards with global/local explanations, feature interaction heatmaps, and ranked importance tables exportable to CSV.
https://github.com/slundberg/shap
Overall grade: B (Above Average)
Adoption: A · Quality: A · Freshness: A · Citations: B+ · Engagement: F
Specifications
- License
- MIT
- Pricing
- open-source
- Capabilities
- shap-explanations, permutation-importance, boruta-selection, html-dashboard
- Integrations
- shap, scikit-learn, matplotlib, plotly
- Use Cases
- model-debugging, regulatory-reporting, feature-selection
- API Available
- No
- Language
- python
- Dependencies
- shap, scikit-learn, plotly, pandas, boruta
- Environment
- Python 3.10+
- Est. Runtime
- 2-15 minutes
- Tags
- feature-importance, shap, permutation-importance, explainability, xai
- Added
- 2026-03-17
- Completeness
- 100%
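As a rough illustration of the permutation-importance step the analyzer performs, the sketch below uses only scikit-learn (listed among the dependencies above). The dataset, model, and variable names are illustrative assumptions, not the tool's actual internals:

```python
# Minimal sketch of permutation importance with scikit-learn.
# Shuffling one feature column at a time and measuring the score drop
# yields a model-agnostic importance ranking, exportable as a table.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic data stands in for the user's dataset (assumption).
X, y = make_classification(
    n_samples=300, n_features=8, n_informative=3, random_state=0
)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# n_repeats controls how many shuffles are averaged per feature.
result = permutation_importance(
    model, X_test, y_test, n_repeats=10, random_state=0
)

# Rank features by mean importance, highest first.
ranked = sorted(
    enumerate(result.importances_mean), key=lambda t: t[1], reverse=True
)
for idx, score in ranked[:3]:
    print(f"feature_{idx}: {score:.3f}")
```

The same ranking would feed the tool's CSV export and dashboard tables; the SHAP and Boruta paths follow the same pattern with their respective libraries.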
Index Score: 66.9
- Adoption: 80
- Quality: 87
- Freshness: 82
- Citations: 70
- Engagement: 0