Interpretable ML package for concise, transparent, and accurate predictive modeling. All models are sklearn-compatible and easily customizable.
docs • imodels overview • demo notebooks
Implementations of popular interpretable models that can be easily installed and used:
```python
from imodels import BayesianRuleListClassifier, GreedyRuleListClassifier, SkopeRulesClassifier
from imodels import SLIMRegressor, RuleFitRegressor

model = BayesianRuleListClassifier()  # initialize a model
model.fit(X_train, y_train)  # fit model
preds = model.predict(X_test)  # discrete predictions: shape is (n_test, 1)
preds_proba = model.predict_proba(X_test)  # predicted probabilities: shape is (n_test, n_classes)
print(model)  # print the rule-based model
```
```
if X1 > 5: then 80.5% risk
else if X2 > 5: then 40% risk
else: 10% risk
```
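A rule list like the one printed above reads top to bottom: the first condition that fires determines the prediction. A minimal sketch of how such a model scores a point by hand (`rule_list_risk` is a hypothetical helper written for illustration, not part of imodels):

```python
def rule_list_risk(x1, x2):
    """Apply the printed rule list by hand: conditions are checked in
    order, and the first one that fires determines the predicted risk."""
    if x1 > 5:
        return 0.805  # "if X1 > 5: then 80.5% risk"
    elif x2 > 5:
        return 0.40   # "else if X2 > 5: then 40% risk"
    return 0.10       # "else: 10% risk"

print(rule_list_risk(6, 0))  # first rule fires -> 0.805
print(rule_list_risk(2, 7))  # second rule fires -> 0.4
print(rule_list_risk(1, 1))  # default -> 0.1
```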
Install with `pip install imodels` (see here for help). Contains the following models:
| Model | Reference | Description |
| :-- | :-- | :-- |
| Rulefit rule set | 🗂️, 🔗, 📄 | Extracts rules from a decision tree, then builds a sparse linear model with them |
| Skope rule set | 🗂️, 🔗 | Extracts rules from gradient-boosted trees, deduplicates them, then forms a linear combination of them based on their OOB precision |
| Boosted rule set | 🗂️, 🔗, 📄 | Uses AdaBoost to sequentially learn a set of rules |
| Bayesian rule list | 🗂️, 🔗, 📄 | Learns a compact rule list by sampling rule lists (rather than using a greedy heuristic) |
| Greedy rule list | 🗂️, 🔗 | Uses CART to learn a list (only a single path), rather than a decision tree |
| OneR rule list | 🗂️, 🔗 | Learns a rule list restricted to only one feature |
| Optimal rule tree | 🗂️, 🔗, 📄 | (In progress) Learns succinct trees using global optimization rather than greedy heuristics |
| Iterative random forest | 🗂️, 🔗, 📄 | (In progress) Repeatedly fits a random forest, giving features with high importance a higher chance of being selected |
| Sparse integer linear model | 🗂️, 🔗 | Forces coefficients to be integers |
| Rule sets | ⌛ | (Coming soon) Many popular rule sets, including SLIPPER, Lightweight Rule Induction, MLRules |
Docs 🗂️, Reference code implementation 🔗, Research paper 📄
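To make the "greedy rule list" entry above concrete, here is an illustrative from-scratch sketch (a toy analogue of `GreedyRuleListClassifier`, not the package's actual implementation): at each step, find the single feature/threshold split whose "true" branch is purest, emit that branch as a rule with its empirical risk, and recurse on the remaining data — producing a single path rather than a full tree.

```python
def best_stump(X, y):
    """Return (feature, threshold) whose 'x[feature] > threshold'
    branch has the lowest Gini impurity."""
    best = None
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            branch = [yi for row, yi in zip(X, y) if row[j] > t]
            if not branch:
                continue
            p = sum(branch) / len(branch)
            impurity = p * (1 - p)
            if best is None or impurity < best[0]:
                best = (impurity, j, t)
    return best[1], best[2]

def greedy_rule_list(X, y, max_rules=5):
    """Learn rules of the form (feature, threshold, risk), applied in order;
    points matching a rule are removed before the next rule is learned."""
    rules = []
    while len(rules) < max_rules and len(set(y)) > 1:
        j, t = best_stump(X, y)
        branch = [yi for row, yi in zip(X, y) if row[j] > t]
        rules.append((j, t, sum(branch) / len(branch)))
        rest = [(row, yi) for row, yi in zip(X, y) if row[j] <= t]
        X = [row for row, _ in rest]
        y = [yi for _, yi in rest]
    default = sum(y) / len(y) if y else 0.0
    return rules, default

# Toy data: y = 1 when X0 > 2 or X1 > 2
X = [[6, 0], [7, 0], [2, 7], [3, 8], [1, 1], [2, 2]]
y = [1, 1, 1, 1, 0, 0]
print(greedy_rule_list(X, y))  # -> ([(0, 2, 1.0), (1, 2, 1.0)], 0.0)
```

The key design choice, shared with the real model, is that each rule is only ever fit on the points not covered by earlier rules, so the list reads as an ordered sequence of "if / else if" checks.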
More models coming soon!
Each of the above models ultimately takes one of the following forms, which aim to be simultaneously simple to understand and highly predictive:

| Rule set | Rule list | Rule tree | Algebraic models |
| :--: | :--: | :--: | :--: |
| | | | |
Different models and algorithms vary not only in their final form but also in the choices made during modeling. In particular, many models differ in the three steps given in the table below.
See the docs for individual models for further descriptions.
| Rule candidate generation | Rule selection | Rule pruning / combination |
| :--: | :--: | :--: |
| | | |
The code here contains many useful and customizable functions for rule-based learning in the util folder. This includes functions / classes for rule deduplication, rule screening, and converting between trees, rulesets, and neural networks.
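As an illustration of the rule-deduplication idea (a toy sketch, not the package's actual util API): two rules that contain the same conditions written in a different order can be treated as duplicates by canonicalizing each rule before comparing.

```python
def normalize(rule):
    """Canonicalize a conjunctive rule string like 'X1 > 5 and X2 <= 3'
    by sorting its conditions, so condition order no longer matters."""
    conditions = [c.strip() for c in rule.split(" and ")]
    return tuple(sorted(conditions))

def deduplicate(rules):
    """Keep the first occurrence of each rule, dropping reordered duplicates."""
    seen = set()
    unique = []
    for rule in rules:
        key = normalize(rule)
        if key not in seen:
            seen.add(key)
            unique.append(rule)
    return unique

rules = ["X1 > 5 and X2 <= 3", "X2 <= 3 and X1 > 5", "X0 > 1"]
print(deduplicate(rules))  # the reordered duplicate is dropped
```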
Demos are contained in the notebooks folder.
Using `imodels` for deriving a clinical decision rule
Different models support different machine-learning tasks. Current support for different models is given below:
| Model | Binary classification | Multi-class classification | Regression |
| :-- | :--: | :--: | :--: |
| Rulefit rule set | ✔️ | | ✔️ |
| Skope rule set | ✔️ | | |
| Boosted rule set | ✔️ | | |
| Bayesian rule list | ✔️ | | |
| Greedy rule list | ✔️ | | |
| OneR rule list | ✔️ | | |
| Optimal rule tree | | | |
| Iterative random forest | | | |
| Sparse integer linear model | | | ✔️ |
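To make the "sparse integer linear model" regression entry concrete, here is a crude toy sketch (not `SLIMRegressor`, which solves a constrained integer program): brute-force small integer coefficient vectors and keep the one with the lowest squared error, preferring sparser solutions on ties.

```python
from itertools import product

def fit_integer_linear(X, y, coef_range=range(-3, 4)):
    """Exhaustively search integer coefficient vectors in coef_range,
    minimizing squared error; break ties toward fewer nonzero coefficients."""
    best = None
    for coefs in product(coef_range, repeat=len(X[0])):
        err = sum((sum(c * xi for c, xi in zip(coefs, row)) - yi) ** 2
                  for row, yi in zip(X, y))
        sparsity = sum(c != 0 for c in coefs)
        if best is None or (err, sparsity) < best[:2]:
            best = (err, sparsity, coefs)
    return best[2]

X = [[1, 0], [0, 1], [1, 1], [2, 1]]
y = [2, -1, 1, 3]  # generated by y = 2*x0 - 1*x1
print(fit_integer_linear(X, y))  # -> (2, -1)
```

Brute force only works for a handful of features and tiny coefficient ranges; the point is just that restricting coefficients to small integers yields a model a person can evaluate mentally.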