Press and Publications

A selection of coverage about our work with interpretability

Machine Learning under a Modern Optimization Lens

Our algorithms form the core of the recent graduate-level textbook Machine Learning under a Modern Optimization Lens by co-founders Bertsimas and Dunn. The book details the transformative effect modern optimization is having on machine learning and artificial intelligence, and guides teaching at leading universities such as MIT.

The book received the Frederick W. Lanchester Prize in 2021, awarded for the best contribution to operations research and the management sciences published in the last five years.

Selected methodological papers

A selection of the academic papers pioneering our algorithms

Optimal Classification Trees

Dimitris Bertsimas and Jack Dunn

Machine Learning, 2017

The original publication by the co-founders pioneering Optimal Trees. The paper develops the first scalable mixed-integer optimization formulation for training optimal decision trees, and presents empirical results showing that such trees outperform classical methods such as CART.
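
As a toy illustration of why global optimization can beat greedy induction (this is not the paper's mixed-integer formulation, just a brute-force sketch on a classic hard case for greedy methods), consider XOR data: no single-feature split reduces impurity, so a greedy CART-style learner stalls at the root, while an exhaustive search over depth-2 trees finds a perfect classifier.

```python
import itertools

# Toy XOR dataset: the two classes cannot be separated by any single
# axis-aligned split, so greedy top-down induction gains nothing at the root.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]

def accuracy(root_feat, left_feat, right_feat, thresh=0.5):
    """Training accuracy of a depth-2 tree: split on root_feat, then each
    child splits on its own feature. Leaves predict by majority vote of
    the training points they receive."""
    correct = 0
    for x, label in zip(X, y):
        side = x[root_feat] < thresh
        leaf_feat = left_feat if side else right_feat
        votes = [yl for xl, yl in zip(X, y)
                 if (xl[root_feat] < thresh) == side
                 and (xl[leaf_feat] < thresh) == (x[leaf_feat] < thresh)]
        pred = max(set(votes), key=votes.count)
        correct += (pred == label)
    return correct / len(X)

# Exhaustive search over all depth-2 feature assignments finds the
# globally optimal tree, which solves XOR exactly.
best = max(itertools.product([0, 1], repeat=3), key=lambda t: accuracy(*t))
print(accuracy(*best))  # 1.0
```

Enumeration like this scales exponentially in tree depth; the paper's contribution is a mixed-integer optimization formulation that searches the space of trees to provable optimality at practical scale.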

Optimal Prescriptive Trees

Dimitris Bertsimas, Jack Dunn, and Nishanth Mundru

INFORMS Journal on Optimization, 2019

This paper extends the optimal trees optimization framework to the field of prescriptive decision making. The resulting Optimal Prescriptive Trees learn how to prescribe directly from observational data, and perform competitively with the best black-box methods for the same task.

Optimal Survival Trees

Dimitris Bertsimas, Jack Dunn, Emma Gibson, and Agni Orfanoudaki

Machine Learning, 2021

The optimal trees optimization framework is extended to the task of survival analysis. Optimal Survival Trees learn factors that affect survival over a continuous time period, with direct applications to healthcare and predictive maintenance.

Optimal Policy Trees

Maxime Amram, Jack Dunn, and Ying Daisy Zhuo

Machine Learning, under review

Optimal Policy Trees combine methods from the causal inference literature with the global optimization of the Optimal Trees framework. The method yields interpretable prescription policies, is highly scalable, handles both discrete and continuous treatments, and has shown superior performance in multiple experiments.

Sparse high-dimensional regression: Exact scalable algorithms and phase transitions

Dimitris Bertsimas and Bart Van Parys

The Annals of Statistics, 2020

The original publication by our co-founder pioneering Optimal Feature Selection. This paper presents the first scalable approach to exact subset selection in the linear regression problem, and demonstrates superior empirical results compared to existing heuristic approaches.
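The "exact subset selection" the paper addresses can be sketched in miniature by brute force: enumerate every support of size k and keep the one with the lowest least-squares error. This toy sketch only works for tiny numbers of features; the paper's contribution is making the exact search tractable at scale. All data below is synthetic and for illustration only.

```python
import itertools
import numpy as np

# Synthetic regression problem where only features 1 and 5 matter.
rng = np.random.default_rng(0)
n, p, k = 50, 8, 2
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[[1, 5]] = [3.0, -2.0]
y = X @ beta_true + 0.1 * rng.normal(size=n)

def rss(support):
    """Residual sum of squares of least squares restricted to `support`."""
    Xs = X[:, list(support)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    return np.sum((y - Xs @ beta) ** 2)

# Exact best-subset selection by exhaustive enumeration over all
# size-k supports (feasible only because p is tiny here).
best = min(itertools.combinations(range(p), k), key=rss)
print(best)  # (1, 5) — the exact search recovers the true support
```

Heuristics such as the lasso approximate this search with a convex penalty; the paper instead solves the exact combinatorial problem via scalable convex integer optimization.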

Sparse Regression: Scalable algorithms and empirical performance

Dimitris Bertsimas, Jean Pauphilet, and Bart Van Parys

Statistical Science, 2020

This paper develops an extremely scalable heuristic for solving the Optimal Feature Selection problem that is significantly faster than the original approach, without sacrificing performance.

Sparse classification and phase transitions: A discrete optimization perspective

Dimitris Bertsimas, Jean Pauphilet, and Bart Van Parys

Preprint available on arXiv

The Optimal Feature Selection methodology is extended to classification problems, namely logistic regression and support vector machines. Experiments demonstrate superior performance compared to alternative methods.

From predictive methods to missing data imputation: An optimization approach

Dimitris Bertsimas, Colin Pawlowski, and Daisy Zhuo

Journal of Machine Learning Research, 2017

The original publication by the co-founders pioneering Optimal Imputation. The paper formulates the missing data imputation problem as a joint optimization problem and presents a scalable method to solve it to optimality, establishing superior performance to the state of the art.

Interpretable Matrix Completion: A Discrete Optimization Approach

Dimitris Bertsimas and Michael Li

Preprint available on arXiv

The original publication by the co-founders pioneering Interpretable Matrix Completion. The paper uses mixed-integer optimization to formulate the problem of creating an interpretable factorization of a matrix with side information, leading to simple and intuitive recommendation systems.

Fast Exact Matrix Completion: A Unified Optimization Framework for Matrix Completion

Dimitris Bertsimas and Michael Li

Journal of Machine Learning Research, 2020

This paper extends Interpretable Matrix Completion with a fast and scalable stochastic algorithm that solves the matrix completion problem both with and without side information.

The Voice of Optimization

Dimitris Bertsimas and Bartolomeo Stellato

Machine Learning, 2021

The paper uses Optimal Classification Trees to understand and generalize the logic behind the optimal solutions to continuous and mixed-integer optimization problems, finding solutions in real time much faster than traditional approaches with very little sacrifice in optimality.

Sparse Regression over Clusters: SparClur

Dimitris Bertsimas, Jack Dunn, Lea Kapelevich, and Rebecca Zhang

Optimization Letters, 2021

A sparse version of Optimal Regression Trees with linear predictions, in which the regression features used across all leaves are drawn from a common set under a global sparsity constraint. This leads to more interpretable models with competitive performance.

Want to try Interpretable AI software?
We provide free academic licenses and evaluation licenses for commercial use.
We also offer consulting services to develop interpretable solutions to your key problems.

© 2020 Interpretable AI, LLC. All rights reserved.