Newton-type algorithms for inverse optimization I: weighted bottleneck Hamming distance and ℓ_∞-norm objectives
In minimum-cost inverse optimization problems, we are given a feasible solution to an underlying optimization problem together with a linear cost function, and the goal is to modify the costs by a small deviation vector so that the input solution becomes optimal. The difference between the new and the original cost functions can be measured in several ways. In this paper, we focus on two objectives: the weighted bottleneck Hamming distance and the weighted ℓ_∞-norm. We consider a general model in which the coordinates of the deviation vector are required to fall within given lower and upper bounds. For the weighted bottleneck Hamming distance objective, we present a simple, purely combinatorial algorithm that determines an optimal deviation vector in strongly polynomial time. For the weighted ℓ_∞-norm objective, we give a min-max characterization of the optimal solution, and provide a pseudo-polynomial algorithm for finding an optimal deviation vector that runs in strongly polynomial time in the case of unit weights. In both cases, we assume that an algorithm of the same time complexity is available for solving the underlying combinatorial optimization problem. For both objectives, we also show how to extend the results to inverse optimization problems with multiple cost functions.
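For concreteness, the two objectives can be sketched as follows. The notation here (ground set E, original cost c, modified cost p, deviation vector d = p − c with bounds ℓ ≤ d ≤ u, positive weights w, and the objective names H and N) is our own shorthand, chosen to match the standard definitions in the inverse optimization literature rather than taken from this abstract:

\[
  H(d) \;=\; \max_{e \in E} \, w_e \cdot \chi(d_e \neq 0),
  \qquad
  N(d) \;=\; \max_{e \in E} \, w_e \, |d_e|,
\]

where \(\chi(\cdot)\) is the indicator of its argument. Under this reading, the task is to minimize the chosen objective over deviation vectors \(d\) with \(\ell \le d \le u\) such that the input solution is optimal for the modified cost \(p = c + d\).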