
Note: a newer version of this package (0.11.4) is available.

iml (version 0.6.0)

Interpretable Machine Learning

Description

Interpretability methods to analyze the behavior and predictions of any machine learning model. Implemented methods are: feature importance as described by Fisher et al. (2018), partial dependence plots as described by Friedman (2001), individual conditional expectation ('ice') plots as described by Goldstein et al. (2013), local models (a variant of 'lime') as described by Ribeiro et al. (2016), the Shapley value as described by Strumbelj et al. (2014), feature interactions as described by Friedman et al. (2008), and tree surrogate models.
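
For orientation, here is a minimal sketch of the typical workflow: wrap a fitted model in a Predictor object, then pass it to an analysis class such as FeatureImp. The randomForest model and the Boston housing data below are illustrative assumptions, not requirements; iml is model-agnostic.

library("iml")
library("randomForest")

# Fit any model; a random forest on the Boston housing data is used
# here purely for illustration.
data("Boston", package = "MASS")
rf <- randomForest(medv ~ ., data = Boston, ntree = 50)

# Wrap the model and data in a Predictor object, the common interface
# used by all interpretation methods in iml.
X <- Boston[, names(Boston) != "medv"]
predictor <- Predictor$new(rf, data = X, y = Boston$medv)

# Permutation feature importance (Fisher et al. 2018).
imp <- FeatureImp$new(predictor, loss = "mae")
plot(imp)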


Install

install.packages('iml')

Monthly Downloads

6,948

Version

0.6.0

License

MIT + file LICENSE

Maintainer

Christoph Molnar

Last Published

August 17th, 2018

Functions in iml (0.6.0)

predict.LocalModel: Predict LocalModel
plot.Shapley: Plot Shapley
Partial: Partial Dependence and Individual Conditional Expectation
plot.TreeSurrogate: Plot Tree Surrogate
plot.Partial: Plot Partial Dependence
plot.LocalModel: Plot Local Model
predict.TreeSurrogate: Predict Tree Surrogate
TreeSurrogate: Decision tree surrogate model
LocalModel: LocalModel
plot.FeatureImp: Plot Feature Importance
plot.Interaction: Plot Interaction
FeatureImp: Feature importance
iml-package: Make machine learning models and predictions interpretable
Shapley: Prediction explanations with game theory
Predictor: Predictor object
Interaction: Feature interactions
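
As a further sketch of these functions in use, individual predictions can be explained with Shapley values and single-feature effects inspected with Partial. This snippet assumes the predictor object from the example above; the feature name "lstat" is specific to the Boston data used there.

# Explain the prediction for one observation with Shapley values
# (Strumbelj et al. 2014); reuses the predictor object from above.
shapley <- Shapley$new(predictor, x.interest = X[1, ])
plot(shapley)

# Partial dependence plot (Friedman 2001) for a single feature.
pdp <- Partial$new(predictor, feature = "lstat")
plot(pdp)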