A Faster Algorithm for Solving a Generalization of Isotonic Median Regression and a Class of Fused Lasso Problems

Abstract

Many applications in production, signal processing, economics, bioinformatics, and statistical learning involve a given partial order on certain parameters and a set of noisy observations of those parameters. The goal is to derive estimated values of the parameters that satisfy the partial order and minimize a loss function of the deviations from the observed values. A prominent application of this setup is the well-known isotonic regression problem, where the partial order is a total order. A commonly studied variant is isotonic median regression (IMR), in which the loss is a sum of absolute-value functions of the deviations of the estimated values from the observations. We present here the most efficient algorithm to date for a generalization of IMR with arbitrary convex piecewise linear loss functions, as well as for variant models whose objectives include L1-norm penalty (separation/regularization) terms on the violation of the total order constraints. Such penalty minimization problems, featuring convex piecewise linear deviation functions and L1-norm separation/regularization functions, include classes of nearly isotonic regression and fused lasso problems as special cases. The algorithm devised here is therefore a unified approach for solving the generalized problem and all known special cases, with the best complexity known to date. We also present an empirical study showing that our algorithm outperforms a linear programming solver in running time on simulated data sets.
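For concreteness, a minimal sketch of the problem class described above, in illustrative notation not taken from the paper: given observations $a_1,\dots,a_n$ on a totally ordered index set, one may write

\[
\min_{x \in \mathbb{R}^n} \;\; \sum_{i=1}^{n} f_i(x_i - a_i) \;+\; \sum_{i=1}^{n-1} \Bigl( d_i \max\{0,\, x_i - x_{i+1}\} + e_i\,|x_i - x_{i+1}| \Bigr),
\]

where each $f_i$ is a convex piecewise linear deviation function ($f_i(t) = |t|$ recovers the absolute-value loss of IMR), the weights $d_i \ge 0$ penalize violations of the order constraints $x_i \le x_{i+1}$ (as in nearly isotonic regression; letting $d_i \to \infty$ enforces them exactly), and the weights $e_i \ge 0$ play the role of fused-lasso regularization terms. This formulation is a sketch consistent with the abstract, not the paper's exact model.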

Publication
SIAM Journal on Optimization
Date