
The IMA Volumes in Mathematics and its Applications
Volume 154

For further volumes:

Institute for Mathematics and its Applications (IMA)
The Institute for Mathematics and its Applications was established by a grant from the National Science Foundation to the University of Minnesota in 1982. The primary mission of the IMA is to foster research of a truly interdisciplinary nature, establishing links between mathematics of the highest caliber and important scientific and technological problems from other disciplines and industries. To this end, the IMA organizes a wide variety of programs, ranging from short intense workshops in areas of exceptional interest and opportunity to extensive thematic programs lasting a year. IMA Volumes are used to communicate results of these programs that we believe are of particular value to the broader scientific community.
The full list of IMA books can be found at the Web site of the Institute for Mathematics and its Applications, where presentation materials from IMA talks and a video library are also available.
Fadil Santosa, Director of the IMA
**********
IMA ANNUAL PROGRAMS
1982–1983 Statistical and Continuum Approaches to Phase Transition
1983–1984 Mathematical Models for the Economics of Decentralized
Resource Allocation
1984–1985 Continuum Physics and Partial Differential Equations
1985–1986 Stochastic Differential Equations and Their Applications
1986–1987 Scientific Computation
1987–1988 Applied Combinatorics


1988–1989 Nonlinear Waves
1989–1990 Dynamical Systems and Their Applications
1990–1991 Phase Transitions and Free Boundaries
1991–1992 Applied Linear Algebra
Continued at the back
Jon Lee • Sven Leyffer
Editors

Mixed Integer Nonlinear Programming
Editors

Jon Lee
Industrial and Operations Engineering
University of Michigan
1205 Beal Avenue
Ann Arbor, Michigan 48109
USA

Sven Leyffer
Mathematics and Computer Science
Argonne National Laboratory
Argonne, Illinois 60439
USA

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)





ISSN 0940-6573
ISBN 978-1-4614-1926-6
e-ISBN 978-1-4614-1927-3
DOI 10.1007/978-1-4614-1927-3
Springer New York Dordrecht Heidelberg London

Library of Congress Control Number: 2011942482

© Springer Science+Business Media, LLC 2012
All rights reserved. This work may not be translated or copied in whole or in part without the written
permission of the publisher (Springer Science+Business Media, LLC, 233 Spring Street, New York,
NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in
connection with any form of information storage and retrieval, electronic adaptation, computer
software, or by similar or dissimilar methodology now known or hereafter developed is forbidden.
The use in this publication of trade names, trademarks, service marks, and similar terms, even if they
are not identified as such, is not to be taken as an expression of opinion as to whether or not they are
subject to proprietary rights.
Mathematics Subject Classification (2010): 05C25, 20B25, 49J15, 49M15, 49M37, 49N90, 65K05,
90C10, 90C11, 90C22, 90C25, 90C26, 90C27, 90C30, 90C35, 90C51, 90C55, 90C57, 90C60,
90C90, 93C95
FOREWORD
This IMA Volume in Mathematics and its Applications
MIXED INTEGER NONLINEAR PROGRAMMING
contains expository and research papers based on a highly successful IMA
Hot Topics Workshop “Mixed-Integer Nonlinear Optimization: Algorith-
mic Advances and Applications”. We are grateful to all the participants
for making this occasion a very productive and stimulating one.
We would like to thank Jon Lee (Industrial and Operations Engineer-
ing, University of Michigan) and Sven Leyffer (Mathematics and Computer
Science, Argonne National Laboratory) for their superb role as program or-
ganizers and editors of this volume.

We take this opportunity to thank the National Science Foundation
for its support of the IMA.
Series Editors
Fadil Santosa, Director of the IMA
Markus Keel, Deputy Director of the IMA

PREFACE
Many engineering, operations, and scientific applications include a mixture
of discrete and continuous decision variables and nonlinear relationships
involving the decision variables that have a pronounced effect on the set
of feasible and optimal solutions. Mixed-integer nonlinear programming
(MINLP) problems combine the numerical difficulties of handling nonlin-
ear functions with the challenge of optimizing in the context of nonconvex
functions and discrete variables. MINLP is one of the most flexible model-
ing paradigms available for optimization; but because its scope is so broad,
in the most general cases it is hopelessly intractable. Nonetheless, an ex-
panding body of researchers and practitioners — including chemical en-
gineers, operations researchers, industrial engineers, mechanical engineers,
economists, statisticians, computer scientists, operations managers, and
mathematical programmers — are interested in solving large-scale MINLP
instances.
Of course, the wealth of applications that can be accurately mod-
eled by using MINLP is not yet matched by the capability of available
optimization solvers. Yet, the two key components of MINLP — mixed-
integer linear programming (MILP) and nonlinear programming (NLP) —
have experienced tremendous progress over the past 15 years. By cleverly
incorporating many theoretical advances in MILP research, powerful aca-
demic, open-source, and commercial solvers have paved the way for MILP
to emerge as a viable, widely used decision-making tool. Similarly, new

paradigms and better theoretical understanding have created faster and
more reliable NLP solvers that work well, even under adverse conditions
such as failure of constraint qualifications.
In the fall of 2008, a Hot-Topics Workshop on MINLP was organized
at the IMA, with the goal of synthesizing these advances and inspiring new
ideas in order to transform MINLP. The workshop attracted more than 75
attendees, over 20 talks, and over 20 posters. The present volume collects
22 invited articles, organized into nine sections on the diverse aspects of
MINLP. The volume includes survey articles, new research material, and
novel applications of MINLP.
In its most general and abstract form, a MINLP can be expressed as
\[
\min_{x}\ f(x) \quad \text{subject to} \quad x \in \mathcal{F}, \tag{1}
\]
where $f : \mathbb{R}^n \to \mathbb{R}$ is a function and the feasible set $\mathcal{F}$ contains both nonlinear and discrete structure. We note that we do not generally assume smoothness of $f$ or convexity of the functions involved. Different realizations of the objective function $f$ and the feasible set $\mathcal{F}$ give rise to key classes of MINLPs addressed by papers in this collection.
Part I. Convex MINLP. Even though mixed-integer optimization prob-
lems are nonconvex as a result of the presence of discrete variables, the
term convex MINLP is commonly used to refer to a class of MINLPs for
which a convex program results when any explicit restrictions of discrete-
ness on variables are relaxed (i.e., removed). In its simplest definition, for
a convex MINLP, we may assume that the objective function $f$ in (1) is a convex function and that the feasible set $\mathcal{F}$ is described by a set of convex nonlinear functions, $c : \mathbb{R}^n \to \mathbb{R}^m$, and a set of indices, $\mathcal{I} \subset \{1, \dots, n\}$, of integer variables:
\[
\mathcal{F} = \{\, x \in \mathbb{R}^n \mid c(x) \le 0, \ \text{and} \ x_i \in \mathbb{Z}, \ \forall i \in \mathcal{I} \,\}. \tag{2}
\]
Typically, we also demand some smoothness of the functions involved.
Sometimes it is useful to expand the definition of convex MINLP to sim-
ply require that the functions be convex on the feasible region. Besides
problems that can be directly modeled as convex MINLPs, the subject has
relevance to methods that create convex MINLP subproblems.
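For concreteness, a small illustrative instance fitting (1)–(2) (our own example, not drawn from any chapter) is
\[
\min_{x \in \mathbb{R}^2}\ (x_1 - 2)^2 + x_2
\quad \text{subject to} \quad
x_1^2 + x_2^2 - 4 \le 0, \ \ -x_2 \le 0, \ \ x_1 \in \mathbb{Z};
\]
dropping $x_1 \in \mathbb{Z}$ leaves a convex NLP, while the integrality restriction makes the overall feasible set nonconvex.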
Algorithms and software for convex mixed-integer nonlinear programs
(P. Bonami, M. Kilin¸c, and J. Linderoth) discusses the state of the art for
algorithms and software aimed at convex MINLPs. Important elements of
successful methods include a tree search (to handle the discrete variables),
NLP subproblems to tighten linearizations, and MILP master problems to
collect and exploit the linearizations.
A special type of convex constraint is a second-order cone constraint: $\|y\|_2 \le z$, where $y$ is a vector of variables and $z$ is a scalar variable. Subgradient-based outer approximation for mixed-integer second-order cone programming (S. Drewes and S. Ulbrich) demonstrates how such constraints can be handled by using outer-approximation techniques. A main difficulty, which the authors address using subgradients, is that at the point $(y, z) = (0, 0)$, the function $\|y\|_2$ is not differentiable.
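To see the difficulty concretely (a standard observation, stated here in our own notation rather than the chapter's): away from the origin the norm is smooth, whereas at the origin one must fall back on subgradients,
\[
\nabla \|y\|_2 = \frac{y}{\|y\|_2} \quad (y \neq 0), \qquad \partial \|y\|_2 \Big|_{y=0} = \{\, g \mid \|g\|_2 \le 1 \,\},
\]
and for any such subgradient $g$ the linear inequality $g^{\mathsf T} y \le z$ is a valid outer approximation of $\|y\|_2 \le z$.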
Many convex MINLPs have “off/on” decisions that force a continuous variable either to be 0 or to be in a convex set. Perspective reformulation and applications (O. Günlük and J. Linderoth) describes an effective reformulation technique that is applicable to such situations. The perspective $g(x, t) = t\,c(x/t)$ of a convex function $c(x)$ is itself convex, and this property can be used to construct tight reformulations. The perspective reformulation is closely related to the subject of the next section: disjunctive programming.
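Before turning to Part II, a small illustration of the perspective idea (our own sketch, in generic notation): for a continuous variable $x$ with indicator $t \in \{0,1\}$, upper bound $u$, and convex cost $c(x) = x^2$ modeled by $\eta \ge c(x)$,
\[
\text{weak:}\quad \eta \ge x^2,\ \ 0 \le x \le u\,t
\qquad\longrightarrow\qquad
\text{perspective:}\quad \eta\,t \ge x^2,\ \ 0 \le x \le u\,t,
\]
where $\eta\,t \ge x^2$ is the (second-order cone representable) epigraph of the perspective $t\,(x/t)^2 = x^2/t$; for fractional $t$ in the continuous relaxation it is considerably tighter than the weak form.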
Part II. Disjunctive programming. Disjunctive programs involve continuous variables together with Boolean variables that model logical propositions directly rather than by means of an algebraic formulation.
Generalized disjunctive programming: A framework for formulation and alternative algorithms for MINLP optimization (I.E. Grossmann and J.P. Ruiz) addresses generalized disjunctive programs (GDPs), which are MINLPs that involve general disjunctions and nonlinear terms. GDPs can be formulated as MINLPs either through the “big-M” formulation or by using the perspective of the nonlinear functions. The authors describe two approaches: disjunctive branch-and-bound, which branches on the disjunctions, and logic-based outer approximation, which constructs a disjunctive MILP master problem.
Under the assumption that the problem functions are factorable (i.e.,
the functions can be computed in a finite number of simple steps by us-
ing unary and binary operators), a MINLP can be reformulated as an
equivalent MINLP where the only nonlinear constraints are equations in-
volving two or three variables. The paper Disjunctive cuts for nonconvex
MINLP (P. Belotti) describes a procedure for generating disjunctive cuts.

First, spatial branching is performed on an original problem variable. Next,
bound reduction is applied to the two resulting relaxations, and linear
relaxations are created from a small number of outer approximations of
each nonlinear expression. Then a cut-generation LP is used to produce a
new cut.
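To make the factorable reformulation mentioned above concrete (an illustration of ours, not taken from the chapter): the constraint $x_1 e^{x_2} + x_3^2 \le 1$ can be rewritten with auxiliary variables so that every nonlinear relation involves only two or three variables,
\[
w_1 = e^{x_2}, \qquad w_2 = x_1 w_1, \qquad w_3 = x_3^2, \qquad w_2 + w_3 \le 1,
\]
and each simple nonlinear equation can then be relaxed and branched on individually.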
Part III. Nonlinear programming. For several important and practical
approaches to solving MINLPs, the most important part is the fast and
accurate solution of NLP subproblems. NLPs arise both as nodes in branch-
and-bound trees and as subproblems for fixed integer or Boolean variables.
The papers in this section discuss two complementary techniques for solving
NLPs: active-set methods in the form of sequential quadratic programming
(SQP) methods and interior-point methods (IPMs).
Sequential quadratic programming methods (P.E. Gill and E. Wong)
is a survey of a key NLP approach, sequential quadratic programming
(SQP), that is especially relevant to MINLP. SQP methods solve NLPs by
a sequence of quadratic programming approximations and are particularly
well-suited to warm starts and re-solves that occur in MINLP.
IPMs are an alternative to SQP methods. However, standard IPMs
can stall if started near a solution, or even fail on infeasible NLPs, mak-
ing them less suitable for MINLP. Using interior-point methods within an
outer approximation framework for mixed-integer nonlinear programming
(H.Y. Benson) suggests a primal-dual regularization that penalizes the con-
straints and bounds the slack variables to overcome the difficulties caused
by warm starts and infeasible subproblems.
Part IV. Expression graphs. Expression graphs are a convenient way to represent functions. An expression graph is a directed graph in which each node represents an arithmetic operation, incoming edges represent the operands, and outgoing edges represent the result of the operation. Expression graphs can be manipulated to obtain derivative information, perform problem simplifications through presolve operations, or obtain relaxations of nonconvex constraints.
Using expression graphs in optimization algorithms (D.M. Gay) dis-
cusses how expression graphs allow gradients and Hessians to be computed
efficiently by exploiting group partial separability. In addition, the author
describes how expression graphs can be used to tighten bounds on variables
to provide tighter outer approximations of nonconvex expressions, detect
convexity (e.g., for quadratic constraints), and propagate constraints.
Symmetry arises in many MINLP formulations and can mean that
a problem or subproblem may have many symmetric optima or near op-
tima, resulting in large search trees and inefficient pruning. Symmetry in
mathematical programming (L. Liberti) describes how the symmetry group
of a MINLP can be detected by parsing the expression graph. Once the
symmetry group is known, we can add symmetry-breaking constraints or
employ special branching schemes such as orbital branching that mitigate
the adverse effects of symmetry.
Part V. Convexification and linearization. A popular and classical
approach for handling nonconvex functions is to approximate them by using
piecewise-linear functions. This approach requires the addition of binary
variables that model the piecewise approximation. The advantage of such
an approach is that advanced MILP techniques can be applied. The disad-
vantage of the approach is that the approximations are not exact and that
it suffers from the curse of dimensionality.
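As a reminder of how such a model looks (a generic sketch of the classical convex-combination formulation, not the chapters' notation): to represent $y = \varphi(x)$ for a univariate function $\varphi$ with breakpoints $b_1 < \dots < b_k$, one writes
\[
x = \sum_{i=1}^{k} \lambda_i b_i, \qquad
y = \sum_{i=1}^{k} \lambda_i \varphi(b_i), \qquad
\sum_{i=1}^{k} \lambda_i = 1, \ \ \lambda \ge 0,
\]
with the additional combinatorial requirement, enforced with binary variables or SOS2 branching, that at most two consecutive $\lambda_i$ are nonzero.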
Using piecewise linear functions for solving MINLPs (B. Geißler, A. Martin, A. Morsi, and L. Schewe) details how to carry out piecewise-linear approximation for MINLP. The authors review two formulations of piecewise linearization: the convex combination technique and the incremental technique. They introduce a piecewise-polyhedral outer-approximation algorithm based on rigorous error estimates, and they demonstrate computational success on water network and gas network problems.

A global-optimization algorithm for mixed-integer nonlinear programs
having separable nonconvexity (C. D'Ambrosio, J. Lee, and A. Wächter)
introduces a method for MINLPs that have all of their nonconvexity in
separable form. The approach aims to retain and exploit existing convexity
in the formulation.
Global optimization of mixed-integer signomial programming problems
(A. Lundell and T. Westerlund) describes a global optimization algorithm
for MINLPs containing signomial functions. The method obtains a convex
relaxation through reformulations, by using single-variable transformations
in concert with piecewise-linear approximations of the inverse transforma-
tions.
Part VI. Mixed-integer quadratically-constrained optimization.
In seeking a more structured setting than general MINLP, but with consid-
erably more modeling power than is afforded by MILP, one naturally con-
siders mixed-integer models with quadratic functions, namely, MIQCPs.
Such models are NP-hard, but they have enough structure that can be
exploited in order to gain computational advantages over treating such
problems as general MINLPs.
The MILP road to MIQCP (S. Burer and A. Saxena) surveys re-
sults in mixed-integer quadratically constrained programming. Strong con-
vex relaxations and valid inequalities are the basis of efficient, practical
techniques for global optimization. Some of the relaxations and inequal-
ities are derived from the algebraic formulation, while others are based
on disjunctive programming. Much of the inspiration derives from MILP
methodology.
Linear programming relaxations of quadratically-constrained quadratic
programs (A. Qualizza, P. Belotti, and F. Margot) investigates the use
of LP tools for approximately solving semidefinite programming (SDP)
relaxations of quadratically-constrained quadratic programs. The authors

present classes of valid linear inequalities based on spectral decomposition,
together with computational results.
Extending a CIP framework to solve MIQCPs (T. Berthold, S. Heinz,
and S. Vigerske) discusses how to build a solver for MIQCPs by extending a
framework for constraint integer programming (CIP). The advantage of this
approach is that we can utilize the full power of advanced MILP and con-
straint programming technologies. For relaxation, the approach employs
an outer approximation generated by linearization of convex constraints
and linear underestimation of nonconvex constraints. Reformulation, sep-
aration, and propagation techniques are used to handle the quadratic con-
straints efficiently. The authors implemented these methods in the branch-
cut-and-price framework SCIP.
Part VII. Combinatorial optimization. Because of the success of
MILP methods and because of beautiful and algorithmically important re-
sults from polyhedral combinatorics, nonlinear functions and formulations
have not been heavily investigated for combinatorial optimization prob-
lems. With improvements in software for general NLP, SDP, and MINLP,
however, researchers are now investing considerable effort in trying to ex-
ploit these gains for combinatorial-optimization problems.
Computation with polynomial equations and inequalities arising in
combinatorial optimization (J.A. De Loera, P.N. Malkin, and P.A. Par-
rilo) discusses how the algebra of multivariate polynomials can be used to
create large-scale linear algebra or semidefinite-programming relaxations of
many kinds of combinatorial feasibility and optimization problems.
Matrix relaxations in combinatorial optimization (F. Rendl) discusses
the use of SDP as a modeling tool in combinatorial optimization. The
main techniques to get matrix relaxations of combinatorial-optimization
problems are presented. Semidefiniteness constraints lead to tractable re-
laxations, while constraints that matrices be completely positive or copos-
itive do not. This survey illustrates the enormous power and potential of

matrix relaxations.
A polytope for a product of real linear functions in 0/1 variables (O. Günlük, J. Lee, and J. Leung) uses polyhedral methods to give a tight formulation for the convex hull of a product of two linear functions in 0/1 variables. As an example, by writing a pair of general integer variables in binary expansion, the authors obtain a technique for linearizing their product.
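For orientation (an elementary textbook linearization, not the tighter polytope developed in the chapter): the product of two 0/1 variables, $y = x_1 x_2$, is captured exactly by the linear inequalities
\[
y \le x_1, \qquad y \le x_2, \qquad y \ge x_1 + x_2 - 1, \qquad y \ge 0,
\]
and products of bounded general integer variables reduce to sums of such terms after binary expansion.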
Part VIII. Complexity. General MINLP is incomputable, independent of conjectures such as P = NP. From the point of view of complexity theory, however, considerable room exists for negative results (e.g., incomputability, intractability, and inapproximability results) and positive results (e.g., polynomial-time algorithms and approximation schemes) for restricted classes of MINLPs.
On the complexity of nonlinear mixed-integer optimization (M. Köppe)
is a survey on the computational complexity of MINLP. It includes incom-
putability results that arise from number theory and logic, fully polynomial-
time approximation schemes in fixed dimension, and polynomial-time al-
gorithms for special cases.
Theory and applications of n-fold integer programming (S. Onn) is
an overview of the theory of n-fold integer programming, which enables
the polynomial-time solution of fundamental linear and nonlinear inte-
ger programming problems in variable dimension. This framework yields
polynomial-time algorithms in several application areas, including multi-
commodity flows and privacy in statistical databases.
Part IX. Applications. A wide range of applications of MINLP exist.
This section focuses on two new application domains.
MINLP application for ACH interiors restructuring (E. Klampfl and
Y. Fradkin) describes a very large-scale application of MINLP developed
by the Ford Motor Company. The MINLP models the re-engineering of

42 product lines over 26 manufacturing processes and 50 potential supplier
sites. The resulting MINLP model has 350,000 variables (17,000 binary)
and 1.6 million constraints and is well beyond the size that state-of-the-art
MINLP solvers can handle. The authors develop a piecewise-linearization
scheme for the objective and a decomposition technique that splits the problem into two coupled MILPs that are solved iteratively.
A benchmark library of mixed-integer optimal control problems (S.
Sager) describes a challenging new class of MINLPs. These are optimal
control problems, involving differential-algebraic equation constraints and
integrality restrictions on the controls, such as gear ratios. The author describes 12 models from a range of applications, including biology, industrial engineering, trajectory optimization, and process control.
Acknowledgments. We gratefully acknowledge the generous financial
support from the IMA that made this workshop possible, as well as fi-
nancial support from IBM. This work was supported in part by the Of-
fice of Advanced Scientific Computing Research, Office of Science, U.S.
Department of Energy, under Contract DE-AC02-06CH11357. Special
thanks are due to Fadil Santosa, Chun Liu, Patricia Brick, Dzung Nguyen,
Holly Pinkerton, and Eve Marofsky from the IMA, who made the organi-
zation of the workshop and the publication of this special volume such an
easy and enjoyable affair.
Jon Lee
University of Michigan
Sven Leyffer
Argonne National Laboratory

CONTENTS
Foreword v
Preface vii

Part I: Convex MINLP
Algorithms and software for convex mixed integer nonlinear
programs 1
Pierre Bonami, Mustafa Kilinç, and Jeff Linderoth
Subgradient based outer approximation for mixed integer second order cone programming 41
Sarah Drewes and Stefan Ulbrich
Perspective reformulation and applications 61
Oktay Günlük and Jeff Linderoth
Part II: Disjunctive Programming
Generalized disjunctive programming: A framework for formulation
and alternative algorithms for MINLP optimization 93
Ignacio E. Grossmann and Juan P. Ruiz
Disjunctive cuts for nonconvex MINLP 117
Pietro Belotti
Part III: Nonlinear Programming
Sequential quadratic programming methods 147
Philip E. Gill and Elizabeth Wong
Using interior-point methods within an outer approximation
framework for mixed integer nonlinear programming . . . . . . . 225
Hande Y. Benson
Part IV: Expression Graphs
Using expression graphs in optimization algorithms . . . . . . . . . . . . 247
David M. Gay
Symmetry in mathematical programming 263
Leo Liberti
Part V: Convexification and Linearization
Using piecewise linear functions for solving MINLPs . . . . . . . . . . . . . 287

Björn Geißler, Alexander Martin, Antonio Morsi, and Lars Schewe
An algorithmic framework for MINLP with separable
non-convexity 315
Claudia D'Ambrosio, Jon Lee, and Andreas Wächter
Global optimization of mixed-integer signomial programming
problems 349
Andreas Lundell and Tapio Westerlund
Part VI: Mixed-Integer Quadratically
Constrained Optimization
The MILP road to MIQCP 373
Samuel Burer and Anureet Saxena
Linear programming relaxations of quadratically constrained
quadratic programs 407
Andrea Qualizza, Pietro Belotti, and François Margot
Extending a CIP framework to solve MIQCPs 427
Timo Berthold, Stefan Heinz, and Stefan Vigerske
Part VII: Combinatorial Optimization
Computation with polynomial equations and inequalities arising
in combinatorial optimization 447
Jesus A. De Loera, Peter N. Malkin, and Pablo A. Parrilo
Matrix relaxations in combinatorial optimization 483
Franz Rendl
A polytope for a product of real linear functions in 0/1 variables. . . 513
Oktay Günlük, Jon Lee, and Janny Leung
Part VIII: Complexity
On the complexity of nonlinear mixed-integer optimization . . . . . . . . . . 533
Matthias Köppe
Theory and applications of n-fold integer programming 559
Shmuel Onn

Part IX: Applications
MINLP Application for ACH interiors restructuring 597
Erica Klampfl and Yakov Fradkin
A benchmark library of mixed-integer optimal control problems. . . . 631
Sebastian Sager
List of Hot Topics participants 671

PART I:
Convex MINLP
ALGORITHMS AND SOFTWARE FOR
CONVEX MIXED INTEGER NONLINEAR PROGRAMS
PIERRE BONAMI, MUSTAFA KILINÇ, AND JEFF LINDEROTH
Abstract. This paper provides a survey of recent progress and software for solving
convex Mixed Integer Nonlinear Programs (MINLP)s, where the objective and con-
straints are defined by convex functions and integrality restrictions are imposed on a
subset of the decision variables. Convex MINLPs have received sustained attention in
recent years. By exploiting analogies to well-known techniques for solving Mixed Integer
Linear Programs and incorporating these techniques into software, significant improve-
ments have been made in the ability to solve these problems.
Key words. Mixed Integer Nonlinear Programming; Branch and Bound.
1. Introduction. Mixed-Integer Nonlinear Programs (MINLP)s are
optimization problems where some of the variables are constrained to take
integer values and the objective function and feasible region of the problem
are described by nonlinear functions. Such optimization problems arise in
many real world applications. Integer variables are often required to model

logical relationships, fixed charges, piecewise linear functions, disjunctive
constraints and the non-divisibility of resources. Nonlinear functions are
required to accurately reflect physical properties, covariance, and economies
of scale.
In full generality, MINLPs form a particularly broad class of challeng-
ing optimization problems, as they combine the difficulty of optimizing
over integer variables with the handling of nonlinear functions. Even if we
restrict our model to contain only linear functions, MINLP reduces to a
Mixed-Integer Linear Program (MILP), which is an NP-Hard problem [55].
On the other hand, if we restrict our model to have no integer variable but
allow for general nonlinear functions in the objective or the constraints,
then MINLP reduces to a Nonlinear Program (NLP) which is also known
to be NP-Hard [90]. Combining both integrality and nonlinearity can lead
to examples of MINLP that are undecidable [67].

Laboratoire d'Informatique Fondamentale de Marseille, CNRS, Aix-Marseille Universités, Parc Scientifique et Technologique de Luminy, 163 avenue de Luminy - Case 901, F-13288 Marseille Cedex 9, France. Supported by ANR grant BLAN06-1-138894.

Department of Industrial and Systems Engineering, University of Wisconsin-Madison, 1513 University Ave., Madison, WI 53706.

Department of Industrial and Systems Engineering, University of Wisconsin-Madison, 1513 University Ave., Madison, WI 53706. The work of the second and third authors is supported by the US Department of Energy under grants DE-FG02-08ER25861 and DE-FG02-09ER25869, and the National Science Foundation under grant CCF-0830153.

J. Lee and S. Leyffer (eds.), Mixed Integer Nonlinear Programming, The IMA Volumes in Mathematics and its Applications 154, DOI 10.1007/978-1-4614-1927-3_1, © Springer Science+Business Media, LLC 2012
In this paper, we restrict ourselves to the subclass of MINLP where
the objective function to minimize is convex, and the constraint functions
are all convex and upper bounded. In these instances, when integrality is
relaxed, the feasible set is convex. Convex MINLP is still NP-hard since it
contains MILP as a special case. Nevertheless, it can be solved much more
efficiently than general MINLP since the problem obtained by dropping the integrality requirements is a convex NLP for which there exist efficient
algorithms. Further, the convexity of the objective function and feasible
region can be used to design specialized algorithms.
There are many diverse and important applications of MINLPs. A
small subset of these applications includes portfolio optimization [21, 68],
block layout design in the manufacturing and service sectors [33, 98], net-
work design with queuing delay constraints [27], integrated design and con-
trol of chemical processes [53], drinking water distribution systems security
[73], minimizing the environmental impact of utility plants [46], and multi-
period supply chain problems subject to probabilistic constraints [75].
Even though convex MINLP is NP-Hard, there are exact methods for
its solution—methods that terminate with a guaranteed optimal solution
or prove that no such solution exists. In this survey, our main focus is on
such exact methods and their implementation.
In the last 40 years, at least five different algorithms have been pro-
posed for solving convex MINLP to optimality. In 1965, Dakin remarked
that the branch-and-bound method did not require linearity and could be
applied to convex MINLP. In the early 70’s, Geoffrion [56] generalized Ben-
ders decomposition to make an exact algorithm for convex MINLP. In the
80’s, Gupta and Ravindran studied the application of branch and bound

[62]. At the same time, Duran and Grossmann [43] introduced the Outer
Approximation decomposition algorithm. This latter algorithm was subse-
quently improved in the 90’s by Fletcher and Leyffer [51] and also adapted
to the branch-and-cut framework by Quesada and Grossmann [96]. In the
same period, a related method called the Extended Cutting Plane method
was proposed by Westerlund and Pettersson [111]. Section 3 of this paper
will be devoted to reviewing in more detail all of these methods.
Two main ingredients of the above mentioned algorithms are solving
MILP and solving NLP. In the last decades, there have been enormous
advances in our ability to solve these two important subproblems of convex
MINLP.
We refer the reader to [100, 92] and [113] for in-depth analysis of the
theory of MILP. The advances in the theory of solving MILP have led to
the implementation of solvers both commercial and open-source which are
now routinely used to solve many industrial problems of large size. Bixby
and Rothberg [22] demonstrate that advances in algorithmic technology
alone have resulted in MILP instances solving more than 300 times faster
than a decade ago. There are effective, robust commercial MILP solvers
such as CPLEX [66], XPRESS-MP [47], and Gurobi [63]. Linderoth and
Ralphs [82] give a survey of noncommercial software for MILP.
There has also been steady progress over the past 30 years in the de-
velopment and successful implementation of algorithms for NLPs. We refer
the reader to [12] and [94] for a detailed recital of nonlinear programming
techniques. Theoretical developments have led to successful implemen-
tations in software such as SNOPT [57], filterSQP [52], CONOPT [42],
IPOPT [107], LOQO [103], and KNITRO [32]. Waltz [108] states that the size of instances solvable by NLP is growing by nearly an order of magnitude a decade.
Of course, solution algorithms for convex MINLP have benefited from the technological progress made in solving MILP and NLP. However, in the realm of MINLP, the progress has been far more modest, and the dimension of convex MINLPs solvable by current solvers is small when compared to MILPs and NLPs. In this work, our goal is to give a brief introduction to
the techniques which are in state-of-the-art solvers for convex MINLPs. We
survey basic theory as well as recent advances that have made their way
into software. We also attempt to make a fair comparison of all algorithmic
approaches and their implementations.
The remainder of the paper can be outlined as follows. A precise de-
scription of a MINLP and algorithmic building blocks for solving MINLPs
are given in Section 2. Section 3 outlines five different solution techniques.
In Section 4, we describe in more detail some advanced techniques imple-
mented in the latest generation of solvers. Section 5 contains descriptions of
several state-of-the-art solvers that implement the different solution tech-
niques presented. Finally, in Section 6 we present a short computational
comparison of those software packages.
2. MINLP. The focus of this section is to mathematically define a
MINLP and to describe important special cases. Basic elements of algo-
rithms and subproblems related to MINLP are also introduced.
2.1. MINLP problem classes. A Mixed Integer Nonlinear Program
may be expressed in algebraic form as follows:
\[
\begin{array}{rll}
z_{\mathrm{minlp}} \;=\; & \text{minimize} & f(x) \\[2pt]
& \text{subject to} & g_j(x) \le 0 \quad \forall j \in J, \qquad \text{(MINLP)}\\[2pt]
& & x \in X, \ x_I \in \mathbb{Z}^{|I|},
\end{array}
\]
where $X$ is a polyhedral subset of $\mathbb{R}^n$ (e.g., $X = \{x \mid x \in \mathbb{R}^n_{+},\ Ax \le b\}$). The functions $f : X \to \mathbb{R}$ and $g_j : X \to \mathbb{R}$ are sufficiently smooth functions. The algorithms presented here only require continuously differentiable functions, but in general algorithms for solving continuous relaxations converge much faster if the functions are twice continuously differentiable. The set $J$ is the index set of nonlinear constraints, $I$ is the index set of discrete variables, and $C$ is the index set of continuous variables, so $I \cup C = \{1, \dots, n\}$.
For convenience, we assume that the set $X$ is bounded; in particular, some finite lower bounds $L_I$ and upper bounds $U_I$ on the values of the integer variables are known. In most applications, discrete variables are restricted to 0–1 values, i.e., $x_i \in \{0,1\}$ $\forall i \in I$. In this survey, we focus on the case where the functions $f$ and $g_j$ are convex. Thus, by relaxing the integrality constraint on $x$, a convex program, minimization of a convex function over a convex set, is formed. We will call such problems convex MINLPs. From now on, unless otherwise stated, we will refer to convex MINLPs simply as MINLPs.
There are a number of important special cases of MINLP. If $f(x) = x^{\mathsf T} Q x + d^{\mathsf T} x + h$ is a (convex) quadratic function of $x$, and there are only linear constraints on the problem ($J = \emptyset$), the problem is known as a mixed integer quadratic program (MIQP). If both $f(x)$ and $g_j(x)$ are quadratic functions of $x$ for each $j \in J$, the problem is known as a mixed integer quadratically constrained program (MIQCP). Significant work has been devoted to these important special cases [87, 29, 21].
If the objective function is linear, and all nonlinear constraints have the form $g_j(x) = \|Ax + b\|_2 - c^{\mathsf T} x - d$, then the problem is a Mixed Integer Second-Order Cone Program (MISOCP). Through a well-known transformation, MIQCP can be transformed into a MISOCP. In fact, many different types of sets defined by nonlinear constraints are representable via second-order cone inequalities. Discussion of these transformations is out of the scope of this work, but the interested reader may consult [15].
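One standard route for such a transformation (a well-known identity, sketched here for orientation rather than quoted from [15]): a convex quadratic constraint $x^{\mathsf T} Q x + c^{\mathsf T} x + d \le 0$ with $Q = L L^{\mathsf T}$ positive semidefinite is equivalent to the single second-order cone constraint
\[
\left\| \begin{pmatrix} 2 L^{\mathsf T} x \\ 1 + c^{\mathsf T} x + d \end{pmatrix} \right\|_2 \ \le\ 1 - c^{\mathsf T} x - d,
\]
which has exactly the form $\|Ax + b\|_2 - \tilde{c}^{\mathsf T} x - \tilde{d} \le 0$ used above.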

Relatively recently, commercial software packages such as CPLEX [66], XPRESS-MP [47], and Mosek [88] have all been augmented to include specialized algorithms for solving these important special cases of convex MINLPs. In what follows, we focus on general convex MINLP and software available for its solution.
2.2. Basic elements of MINLP methods. The basic concept un-
derlying algorithms for solving (MINLP) is to generate and refine bounds
on its optimal solution value. Lower bounds are generated by solving a
relaxation of (MINLP), and upper bounds are provided by the value of
a feasible solution to (MINLP). Algorithms differ in the manner in which
these bounds are generated and the sequence of subproblems that are solved
to generate these bounds. However, algorithms share many basic common
elements, which are described next.
Linearizations: Since the objective function of (MINLP) may be non-
linear, its optimal solution may occur at a point that is interior to the
convex hull of its set of feasible solutions. It is simple to transform the
instance to have a linear objective function by introducing an auxiliary
variable η and moving the original objective function into the constraints.
Specifically, (MINLP) may be equivalently stated as
\[
\begin{array}{rll}
z_{\mathrm{minlp}} \;=\; & \text{minimize} & \eta \\[2pt]
& \text{subject to} & f(x) \le \eta \\[2pt]
& & g_j(x) \le 0 \quad \forall j \in J, \qquad \text{(MINLP-1)}\\[2pt]
& & x \in X, \ x_I \in \mathbb{Z}^{|I|}.
\end{array}
\]
Many algorithms rely on linear relaxations of (MINLP), obtained by linearizing the objective and constraint functions at a given point $\hat{x}$. Since $f$ and $g_j$ are convex and differentiable, the inequalities
\[
f(\hat{x}) + \nabla f(\hat{x})^{\mathsf T}(x - \hat{x}) \le f(x),
\qquad
g_j(\hat{x}) + \nabla g_j(\hat{x})^{\mathsf T}(x - \hat{x}) \le g_j(x),
\]
are valid for all $j \in J$ and $\hat{x} \in \mathbb{R}^n$. Since $f(x) \le \eta$ and $g_j(x) \le 0$, the linear inequalities
\[
f(\hat{x}) + \nabla f(\hat{x})^{\mathsf T}(x - \hat{x}) \le \eta, \tag{2.1}
\]
\[
g_j(\hat{x}) + \nabla g_j(\hat{x})^{\mathsf T}(x - \hat{x}) \le 0 \tag{2.2}
\]
are valid for (MINLP-1). Linearizations of $g_j(x)$ outer approximate the feasible region, and linearizations of $f(x)$ underestimate the objective function. We often refer to (2.1)–(2.2) as outer approximation constraints.
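To make the construction concrete, here is a minimal sketch (our own, not part of the paper) of how an outer-approximation cut of the form (2.2) can be generated numerically for a single convex constraint $g(x) \le 0$; the functions `g` and `grad_g` below are illustrative placeholders.

```python
import numpy as np

def oa_cut(g, grad_g, x_hat):
    """Linearize the convex constraint g(x) <= 0 at the point x_hat.

    Returns (a, b) such that the outer-approximation cut is a @ x <= b,
    i.e. g(x_hat) + grad_g(x_hat)^T (x - x_hat) <= 0, rearranged.
    """
    x_hat = np.asarray(x_hat, dtype=float)
    a = np.asarray(grad_g(x_hat), dtype=float)   # gradient at the linearization point
    b = float(a @ x_hat - g(x_hat))              # move the constant terms to the right-hand side
    return a, b

# Example: g(x) = x1^2 + x2^2 - 4 <= 0, linearized at x_hat = (1, 1).
g = lambda x: x[0] ** 2 + x[1] ** 2 - 4.0
grad_g = lambda x: np.array([2.0 * x[0], 2.0 * x[1]])
a, b = oa_cut(g, grad_g, [1.0, 1.0])
# a = [2., 2.], b = 6.0, so the cut is 2*x1 + 2*x2 <= 6.
```

By convexity, every feasible point of the original constraint satisfies each such cut, so accumulating cuts at different points $\hat{x}$ yields progressively tighter polyhedral outer approximations.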
Subproblems: One important subproblem used by a variety of algorithms for (MINLP) is formed by relaxing the integrality requirements and restricting the bounds on the integer variables. Given bounds $(l_I, u_I) = \{(l_i, u_i) \mid \forall i \in I\}$, the NLP relaxation of (MINLP) is
\[
\begin{array}{rll}
z_{\mathrm{nlpr}(l,u)} \;=\; & \text{minimize} & f(x) \\[2pt]
& \text{subject to} & g_j(x) \le 0 \quad \forall j \in J, \qquad (\mathrm{NLPR}(l_I, u_I))\\[2pt]
& & x \in X; \ l_I \le x_I \le u_I.
\end{array}
\]
The value $z_{\mathrm{nlpr}(l,u)}$ is a lower bound on the value of $z_{\mathrm{minlp}}$ that can be obtained in the subset of the feasible region of (MINLP) where the bounds $l_I \le x_I \le u_I$ are imposed. Specifically, if $(l_I, u_I)$ are the lower and upper bounds $(L_I, U_I)$ for the original instance, then $z_{\mathrm{NLPR}(L_I, U_I)}$ provides a lower bound on $z_{\mathrm{minlp}}$.
In the special case that all of the integer variables are fixed ($l_I = u_I = \hat{x}_I$), the fixed NLP subproblem is formed:
\[
\begin{array}{rll}
z_{\mathrm{NLP}(\hat{x}_I)} \;=\; & \text{minimize} & f(x) \\[2pt]
& \text{subject to} & g_j(x) \le 0 \quad \forall j \in J, \qquad (\mathrm{NLP}(\hat{x}_I))\\[2pt]
& & x \in X; \ x_I = \hat{x}_I.
\end{array}
\]
If $\hat{x}_I \in \mathbb{Z}^{|I|}$ and (NLP($\hat{x}_I$)) has a feasible solution, the value $z_{\mathrm{NLP}(\hat{x}_I)}$ provides an upper bound to the problem (MINLP). If (NLP($\hat{x}_I$)) is infeasible,