Multi-Parametric Optimization and Control provides comprehensive coverage of recent methodological developments for optimal model-based control through parametric optimization. It also shares real-world research applications to support deeper understanding of the material. Researchers and practitioners can use the book as a reference; it is also suitable as a primary or supplementary textbook. Each chapter presents the theory related to a topic along with a relevant case study, and topic complexity increases gradually as readers progress through the chapters.
The first part of the book presents an overview of state-of-the-art multi-parametric optimization theory and algorithms in multi-parametric programming. The second part examines the connection between multi-parametric programming and model predictive control, from the linear quadratic regulator over hybrid systems to periodic systems and robust control. The third part addresses multi-parametric optimization in process systems engineering: a step-by-step procedure is introduced for embedding multi-parametric programming within systems engineering, which leads the reader to the PAROC framework and software platform, an integrated framework and platform for the optimization and advanced model-based control of process systems.
* Uses case studies to illustrate real-world applications for a better understanding of the concepts presented
* Covers the fundamentals of optimization and model predictive control
* Provides information on key topics, such as the basic sensitivity theorem, linear programming, quadratic programming, mixed-integer linear programming, optimal control of continuous systems, and multi-parametric optimal control
An appendix summarizes the history of multi-parametric optimization algorithms.
It also covers the use of the parametric optimization toolbox (POP), which is comprehensive software for efficiently solving multi-parametric programming problems.
Page count: 322
Publication year: 2020
Cover
Wiley Series in Operations Research and Management Science
Title Page
Copyright
Dedication
Short Bios of the Authors
Efstratios N. Pistikopoulos
Nikolaos A. Diangelakis
Richard Oberdieck
Preface
Part I Multi‐parametric Optimization
1 Introduction
1.1 Concepts of Optimization
1.2 Concepts of Multi‐parametric Programming
1.3 Polytopes
1.4 Organization of the Book
References
Notes
2 Multi‐parametric Linear Programming
2.1 Solution Properties
2.2 Degeneracy
2.3 Critical Region Definition
2.4 An Example: Chicago to Topeka
2.5 Literature Review
References
Notes
3 Multi‐Parametric Quadratic Programming
3.1 Calculation of the Parametric Solution
3.2 Solution Properties
3.3 Chicago to Topeka with Quadratic Distance Cost
3.4 Literature Review
References
Notes
4 Solution Strategies for mp‐LP and mp‐QP Problems
4.1 General Overview
4.2 The Geometrical Approach
4.3 The Combinatorial Approach
4.4 The Connected‐Graph Approach
4.5 Discussion
4.6 Literature Review
References
Notes
5 Multi‐parametric Mixed‐integer Linear Programming
5.1 Solution Properties
5.2 Comparing the Solutions from Different mp‐LP Problems
5.3 Multi‐parametric Integer Linear Programming
5.4 Chicago to Topeka Featuring a Purchase Decision
5.5 Literature Review
References
Notes
6 Multi‐parametric Mixed‐integer Quadratic Programming
6.1 Solution Properties
6.2 Comparing the Solutions from Different mp‐QP Problems
6.3 Envelope of Solutions
6.4 Chicago to Topeka Featuring Quadratic Cost and A Purchase Decision
6.5 Literature Review
References
Notes
7 Solution Strategies for mp‐MILP and mp‐MIQP Problems
7.1 General Framework
7.2 Global Optimization
7.3 Branch‐and‐Bound
7.4 Exhaustive Enumeration
7.5 The Comparison Procedure
7.6 Discussion
7.7 Literature Review
References
Notes
8 Solving Multi‐parametric Programming Problems Using MATLAB®
8.1 An Overview over the Functionalities of POP
8.2 Problem Solution
8.3 Problem Generation
8.4 Problem Library
8.5 Graphical User Interface (GUI)
8.6 Computational Performance for Test Sets
8.7 Discussion
Acknowledgments
References
Notes
9 Other Developments in Multi‐parametric Optimization
9.1 Multi‐parametric Nonlinear Programming
9.2 Dynamic Programming via Multi‐parametric Programming
9.3 Multi‐parametric Linear Complementarity Problem
9.4 Inverse Multi‐parametric Programming
9.5 Bilevel Programming Using Multi‐parametric Programming
9.6 Multi‐parametric Multi‐objective Optimization
References
Notes
Part II Multi‐parametric Model Predictive Control
10 Multi‐parametric/Explicit Model Predictive Control
10.1 Introduction
10.2 From Transfer Functions to Discrete Time State‐Space Models
10.3 From Discrete Time State‐Space Models to Multi‐parametric Programming
10.4 Explicit LQR – An Example of mp‐MPC
10.5 Size of the Solution and Online Computational Effort
References
Notes
11 Extensions to Other Classes of Problems
11.1 Hybrid Explicit MPC
11.2 Disturbance Rejection
11.3 Reference Trajectory Tracking
11.4 Moving Horizon Estimation
11.5 Other Developments in Explicit MPC
References
Notes
12 PAROC: PARametric Optimization and Control
12.1 Introduction
12.2 The PAROC Framework
12.3 Case Study: Distillation Column
12.4 Case Study: Simple Buffer Tank
12.5 The Tank Example
12.6 Concluding Remarks
References
Notes
A Appendix for the mp‐MPC Chapter 10
B Appendix for the mp‐MPC Chapter 11
B.1 Matrices for the mp‐QP Problem Corresponding to the Example of Section 11.3.2
Index
End User License Agreement
Chapter 2
Table 2.1 The supply and demand of each plant and market in cases.
Table 2.2 The distances between each plant and market in thousands of miles.
Table 2.3 The solution of problem (2.20).
Chapter 3
Table 3.1 The solution of problem (3.14)
Chapter 5
Table 5.1 The solution of problem (5.16).
Chapter 6
Table 6.1 The partial solution of problem (6.9). Note that the critical regio...
Chapter 7
Table 7.1 An overview over the literature on how to solve mp‐MILP and mp‐MIQP...
Chapter 10
Table 10.1 State expressions as a function of the initial state values and in...
Table 10.2 The partial analytical solution of the mp‐MPC problem.
Table 10.3 Comparison of optimal unconstrained feedback law (via …) and optima...
Chapter 11
Table 11.1 The partial analytical solution of the mp‐MPC problem.
Table 11.2 The solution of a … of the mp‐MPC problem with disturbance.
Table 11.3 The partial analytical solution of the mp‐MPC problem.
Chapter 12
Table 12.1 Interaction of design and control – indicative list.
Table 12.2 Integration of scheduling and control – indicative list.
Table 12.3 Model approximation of multi‐parametric model‐predictive control –...
Table 12.4 Different classes of multi‐parametric programming problems encount...
Table 12.5 Parameter values for the distillation column model.
Table 12.6 Weight tuning for the mp‐MPC of the tank.
Chapter 1
Figure 1.1 A schematic representation of a two‐dimensional polytope.
Figure 1.2 A schematic representation of the differences between two polytop...
Figure 1.3 A schematic representation of (a) strongly and (b) weakly redunda...
Chapter 2
Figure 2.1 The difference between the solution of an LP and an mp‐LP problem...
Figure 2.2 A schematic representation of the solution of the mp‐LP problem f...
Figure 2.3 A schematic representation of the connected‐graph theorem, (a) fr...
Figure 2.4 Primal and dual degeneracy in linear programming. In (a), primal ...
Chapter 3
Figure 3.1 The difference between the solution of a QP and an mp‐QP problem ...
Figure 3.2 A schematic representation of the solution of the mp‐QP problem f...
Figure 3.3 A schematic representation of some of the differences between (a)...
Figure 3.4 A schematic representation of the connected‐graph theorem, (a) fr...
Figure 3.5 The solution to problem 3.14. In (a) the partitioning of the para...
Chapter 4
Figure 4.1 A schematic representation between the different exploration stra...
Figure 4.2 A schematic representation of the failure of the variable step‐si...
Figure 4.3 A schematic representation of the branch‐and‐bound algorithm resu...
Chapter 5
Figure 5.1 The interpretation of mp‐MILP problems as a combination of severa...
Figure 5.2 The schematic representation of the example of a comparison proce...
Figure 5.3 The detection of the intersection between two critical regions, …
Figure 5.4 The partitioning of … into … and … based on Eq. (5.7).
Figure 5.5 The solution of problem (5.16). In (a) the partitioning of the pa...
Chapter 6
Figure 6.1 The interpretation of mp‐MIQP problems as a combination of severa...
Figure 6.2 The schematic representation of the example of a comparison proce...
Figure 6.3 The detection of the intersection between two critical regions,
Figure 6.4 The difference in the partitioning of a critical region due to co...
Figure 6.5 The schematic representation of the envelopes of solutions. In (a...
Figure 6.6 The solution of problem (6.9). In (a) the partitioning of the par...
Chapter 7
Figure 7.1 The general framework for the solution of mp‐MIQP problems.
Figure 7.2 A schematic representation of the global optimization approach fo...
Figure 7.3 A schematic example of a branch‐and‐bound tree.
Figure 7.4 A schematic perspective on the different comparison procedures, w...
Chapter 8
Figure 8.1 The problem statistics of the test sets “POP_mpLP1” and “POP_...
Figure 8.2 The problem statistics of the test sets “POP_mpMILP1” and “PO...
Figure 8.3 The structure of the graphical user interface (GUI) of POP.
Figure 8.4 The performance of the geometrical, combinatorial, and connected‐...
Figure 8.5 The analysis of the computational effort spent on different aspec...
Figure 8.6 The analysis of the computational effort spent on different aspec...
Figure 8.7 The performance of the decomposition algorithm for the different ...
Figure 8.8 The analysis of the computational effort spent on different compa...
Figure 8.9 The analysis of the computational effort spent on different compa...
Chapter 10
Figure 10.1 The concept of the receding horizon. A set of … manipulated inpu...
Figure 10.2 The solution of the mp‐MPC example problem. (a) The partitioning...
Chapter 11
Figure 11.1 The solution of the mp‐MPC example problem. Left: (a) The partit...
Figure 11.2 Comparison of the feasible regions between (a) the hybrid mp‐MPC...
Figure 11.3 The solution of the mp‐MPC with disturbance example problem. (a)...
Figure 11.4 The solution of the mp‐MPC with disturbance example problem. (a)...
Figure 11.5 Indicative snapshots of the solution of the explicit reference t...
Figure 11.6 Indicative snapshots of the solution of the explicit reference t...
Chapter 12
Figure 12.1 A step‐by‐step outline of the PAROC framework.
Figure 12.2 An outline of the model approximation procedure typically used w...
Figure 12.3 Software interactions for the implementation of the closed‐loop ...
Figure 12.4 Software interactions for the implementation of the closed‐loop ...
Figure 12.5 Graphical representation of the binary distillation column examp...
Figure 12.6 Critical regions for the solution of the mp‐MHE problem of the b...
Figure 12.7 Critical regions for the solution of the mp‐MPC problem of the b...
Figure 12.8 Closed loop validation results. Both the mp‐MHE and mp‐MPC schem...
Figure 12.9 Step response (a) and impulse response (b) of the tank approxima...
Figure 12.10 Closed‐loop validation results for liquid volume set‐point trac...
Operations Research and Management Science
Operations Research and Management Science (ORMS) is a broad, interdisciplinary branch of applied mathematics concerned with improving the quality of decisions and processes, and it is a major component of the global modern movement towards the use of advanced analytics in industry and scientific research. The Wiley Series in Operations Research and Management Science features a broad collection of books that meet the varied needs of researchers, practitioners, policy makers, and students who use or need to improve their use of analytics. Reflecting the wide range of current research within the ORMS community, the Series encompasses application, methodology, and theory and provides coverage of both classical and cutting‐edge ORMS concepts and developments. Written by recognized international experts in the field, this collection is appropriate for students as well as professionals from the private and public sectors, including industry, government, and nonprofit organizations, who are interested in ORMS at a technical level. The Series comprises four sections: Analytics; Decision and Risk Analysis; Optimization Models; and Stochastic Models.
Advisory Editors • Stochastic Models
Tava Olsen, The University of Auckland
Raúl Gouet, University of Chile
Founding Series Editor
James J. Cochran, University of Alabama
Analytics
Yang and Lee • Healthcare Analytics: From Data to Knowledge to Healthcare Improvement
Attoh‐Okine • Big Data and Differential Privacy: Analysis Strategies for Railway Track Engineering
Forthcoming Titles
Kong and Zhang • Decision Analytics and Optimization in Disease Prevention and Treatment
Behavioral Research
Donohue, Katok, and Leider • The Handbook of Behavioral Operations
Decision and Risk Analysis
Barron • Game Theory: An Introduction, Second Edition
Brailsford, Churilov, and Dangerfield • Discrete‐Event Simulation and System Dynamics for Management Decision Making
Johnson, Keisler, Solak, Turcotte, Bayram, and Drew • Decision Science for Housing and Community Development: Localized and Evidence‐Based Responses to Distressed Housing and Blighted Communities
Mislick and Nussbaum • Cost Estimation: Methods and Tools
Forthcoming Titles
Aleman and Carter • Healthcare Engineering
Optimization Models
Ghiani, Laporte, and Musmanno • Introduction to Logistics Systems Management, Second Edition
Forthcoming Titles
Tone • Advances in DEA Theory and Applications: With Examples in Forecasting Models
Stochastic Models
Ibe • Random Walk and Diffusion Processes
Forthcoming Titles
Matis • Applied Markov Based Modelling of Random Processes
Efstratios N. Pistikopoulos
Nikolaos A. Diangelakis
Richard Oberdieck
This edition first published 2021
© 2021 John Wiley & Sons, Inc.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.
The right of Efstratios N. Pistikopoulos, Nikolaos A. Diangelakis, and Richard Oberdieck to be identified as the authors of the editorial material in this work has been asserted in accordance with law.
Registered Office
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA
Editorial Office
111 River Street, Hoboken, NJ 07030, USA
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Wiley also publishes its books in a variety of electronic formats and by print‐on‐demand. Some content that appears in standard print versions of this book may not be available in other formats.
Limit of Liability/Disclaimer of Warranty
While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
Library of Congress Cataloging‐in‐Publication Data
Names: Pistikopoulos, Efstratios N., author.
Title: Multi‐parametric optimization and control / Efstratios N.
Pistikopoulos, Nikolaos A. Diangelakis, Richard Oberdieck.
Description: First edition. | Hoboken, NJ : Wiley, 2021. | Series: Wiley
series in operations research and management science | Includes
bibliographical references and index.
Identifiers: LCCN 2020024011 (print) | LCCN 2020024012 (ebook) | ISBN
9781119265184 (hardback) | ISBN 9781119265153 (adobe pdf) | ISBN
9781119265191 (epub)
Subjects: LCSH: Mathematical optimization–Computer programs.
Classification: LCC QA402.5 .P558 2021 (print) | LCC QA402.5 (ebook) |
DDC 519.7–dc23
LC record available at https://lccn.loc.gov/2020024011
LC ebook record available at https://lccn.loc.gov/2020024012
Cover Design: Wiley
Cover Image: Courtesy of Professor Pistikopoulos' research group
To the Memory and Legacy of Professor Christodoulos A. Floudas
Professor Pistikopoulos is the Director of the Texas A&M Energy Institute and a TEES Eminent Professor in the Artie McFerrin Department of Chemical Engineering at Texas A&M University. He was a Professor of Chemical Engineering at Imperial College London, UK (1991–2015), and the Director of its Centre for Process Systems Engineering (2002–2009). He holds a PhD from Carnegie Mellon University, and he worked with Shell Chemicals in Amsterdam before joining Imperial. He has authored or co‐authored over 500 major research publications in the areas of modeling, control, and optimization of process, energy, and systems engineering applications, 12 books, and 2 patents. He is a co‐founder of Process Systems Enterprise (PSE) Ltd., a Fellow of AIChE and IChemE, and the current Editor‐in‐Chief of Computers & Chemical Engineering. In 2007, he was a co‐recipient of the prestigious MacRobert Award from the Royal Academy of Engineering. In 2012, he received the Computing in Chemical Engineering Award of CAST/AIChE. In 2013, he was elected Fellow of the Royal Academy of Engineering in the United Kingdom. He received the title of Doctor Honoris Causa from the University Politehnica of Bucharest in 2014 and from the University of Pannonia in 2015.
Dr. Diangelakis is an Optimization Specialist at Octeract Ltd. in London, UK, a massively parallel global optimization software firm. He was a postdoctoral research associate at Texas A&M University and the Texas A&M Energy Institute. He holds a PhD and an M.Sc. in Advanced Chemical Engineering from Imperial College London and has been a member of the “Multi‐parametric Optimization and Control Group” since late 2011. He earned his bachelor's degree in 2011 from the National Technical University of Athens (NTUA). His main research interests are in the area of optimal receding horizon strategies for chemical and energy processes while simultaneously optimizing their design. For that purpose, he is investigating novel solution methods for classes of nonlinear, robust, and multi‐parametric optimization problems. He is the main developer of the PARametric Optimization and Control (PAROC) platform and co‐developer of the Parametric OPtimization (POP) toolbox. In 2016, he was chosen as one of five participants in the “Distinguished Junior Researcher Seminars” at Northwestern University, organized by Prof. Fengqi You. In 2017, he was awarded third place in EFCE's “Excellence Award in Recognition of Outstanding PhD Thesis on CAPE.” He is the coauthor of 16 peer‐reviewed articles, 11 conference papers, and 3 book chapters.
Richard Oberdieck is a Technical Account Manager at Gurobi Optimization, LLC, one of the leading mathematical optimization software companies. He obtained bachelor's and MSc degrees from ETH Zurich in Switzerland (2009–2013) before pursuing a PhD in Chemical Engineering at Imperial College London, UK, which he completed in 2017. During his PhD, he discovered fundamental properties of multi‐parametric programming problems and implemented them in the Parametric OPtimization (POP) toolbox, of which he was the main developer. After applying his knowledge of mathematical modeling and optimization in the space of renewable energies at the world leader in offshore wind energy, Ørsted A/S, he is now helping companies around the world unlock business value through mathematical optimization. He has published 21 papers and 2 book chapters, has an h‐index of 11, and was awarded the FICO Decisions Award 2019 in Optimization, Machine Learning and AI.
Many optimization problems involve parameters that are unknown, either because they cannot be measured or because they represent information about the future (e.g. the future state of a system, a future disturbance, or future demand). Multi‐parametric programming is a technique for the solution of such classes of uncertain optimization problems. Through multi‐parametric programming, one can obtain the optimization variables of the problem as a function of the bounded uncertain parameters, together with the regions (in the space of the parameters) where these functions are valid.
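To make this concrete, the following is a minimal sketch of what a multi-parametric solution looks like. The problem, the function names, and the region labels are our own illustrative choices, not an example from the book: for the one-variable parametric problem min over x of (x − θ)² subject to 0 ≤ x ≤ 1, the optimizer x*(θ) is an explicit function of the parameter, with three critical regions, and we cross-check it against a brute-force grid search.

```python
# A hypothetical one-variable illustration (not from the book): for
#   min_x (x - theta)^2   s.t.   0 <= x <= 1,
# the multi-parametric solution gives x*(theta) as an explicit function of
# the parameter, valid over three critical regions of the parameter space.

def explicit_solution(theta):
    """Return (x_star, region) for the parametric QP above."""
    if theta < 0.0:          # critical region 1: lower bound active
        return 0.0, "CR1: theta < 0, x* = 0"
    elif theta <= 1.0:       # critical region 2: no constraint active
        return theta, "CR2: 0 <= theta <= 1, x* = theta"
    else:                    # critical region 3: upper bound active
        return 1.0, "CR3: theta > 1, x* = 1"

def brute_force(theta, n=100001):
    """Check by direct search over a fine grid of feasible x."""
    return min(((i / (n - 1) - theta) ** 2, i / (n - 1)) for i in range(n))[1]

for theta in (-0.5, 0.3, 2.0):
    x_star, region = explicit_solution(theta)
    assert abs(x_star - brute_force(theta)) < 1e-4
    print(theta, x_star, region)
```

Once the explicit map and its regions are known, evaluating the optimizer for a new parameter value reduces to locating the region containing θ and evaluating the corresponding function, which is the key computational advantage exploited throughout the book.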
Theoretical and algorithmic developments in multi‐parametric programming, along with applications in the area of process systems engineering, have been emerging constantly over the last 30 years.
A variety of algorithms for the solution of a range of classes of multi‐parametric programming problems have been developed, with our group publishing over 80 manuscripts, 21 books and book chapters, and 2 patents on the subject. We have further developed a MATLAB®‐based toolbox, POP©, for the solution of various classes of multi‐parametric programming problems, and a framework, PAROC©, for the development of explicit model predictive controllers.
This book aims to enable fundamental understanding in the areas of multi‐parametric optimization and control. We hope that by the end of the book, the reader will be able to not only understand almost all aspects of multi‐parametric programming, but also judge the key characteristics and particulars of the various techniques developed for different mathematical programming problems, use the tools to solve parametric problems, and finally, develop explicit model predictive controllers.
The book begins with an introduction to the fundamentals of optimization and the basic theories and definitions used in multi‐parametric optimization. Then, two main parts follow, providing a clear distinction between algorithmic developments and their applications in the development of explicit model predictive controllers.
Part I focuses on the algorithmic developments of multi‐parametric programming. It begins with an overview of the basic sensitivity theorem and progresses to describe solution strategies for linear, quadratic, and mixed‐integer multi‐parametric problems. A chapter on the solution of the aforementioned classes of problems in MATLAB® is also included. Part I concludes with developments of multi‐parametric programming for the solution of other classes of problems.
Part II focuses on multi‐parametric model predictive control and its extensions for the solution of other control problems. This part concludes with the presentation and applications of the PAROC© framework, a framework for the development and closed‐loop validation of multi‐parametric model predictive controllers.
As this book is the outcome of the research work carried out over the last 30 years at the Centre for Process Systems Engineering of Imperial College London and the Texas A&M Energy Institute of Texas A&M University, many colleagues, former and current PhD students, and post‐doctorate/research associates have been involved in the presented work. While a number of them are involved in this project as co‐authors, we would like to take the opportunity to thank in particular our current research team at Texas A&M Energy Institute, particularly Dr. Styliani Avraamidou and Mr. Iosif Pappas.
We would also like to gratefully acknowledge the financial support kindly provided by our many sponsors, EPSRC, NSF, EU/ERC, DOE/CESMII, DOE/RAPID, Shell, Air Products, Eli‐Lilly, and BP.
Finally, we would like to thank Wiley‐VCH for their enthusiastic support of this effort.
Efstratios N. Pistikopoulos
Nikolaos A. Diangelakis
Richard Oberdieck
College Station, October 2019
In this chapter, the fundamental concepts of mathematical optimization and multi‐parametric programming are presented. These concepts form the foundation for the state‐of‐the‐art multi‐parametric programming strategies and applications developed in the chapters that follow.
This section presents the idea of convex sets and introduces function convexity. Convexity plays a vital role in establishing the properties that enable a multi‐parametric solution to hold. In this setting, the following definitions are established.
Consider two points $x_1, x_2 \in \mathbb{R}^n$. Then the line that passes through these points is defined as
$$x = \lambda x_1 + (1 - \lambda) x_2, \quad \lambda \in \mathbb{R}.$$
The closed line segment joining the points $x_1, x_2$ is defined as:
$$[x_1, x_2] = \{ \lambda x_1 + (1 - \lambda) x_2 \, : \, 0 \leq \lambda \leq 1 \}.$$
A set $S \subseteq \mathbb{R}^n$ is a convex set if the closed line segment joining any two points $x_1, x_2$ in the set belongs to the set, i.e. $\lambda x_1 + (1 - \lambda) x_2 \in S$ for each $\lambda$ such that $0 \leq \lambda \leq 1$.
Let $S_1$ and $S_2$ be convex sets defined in $\mathbb{R}^n$. Then:
The intersection $S_1 \cap S_2$ is a convex set.
The summation $S_1 + S_2$ of two convex sets is a convex set.
Let $\alpha$ be a real number. The product $\alpha S_1$ is a convex set.
Examples of convex sets include lines, polytopes and polyhedra, and open and closed halfspaces.
Let $S$ be a convex subset of $\mathbb{R}^n$, and $f$ a real function defined on $S$. The function $f$ is a convex function if for any $x_1, x_2 \in S$ and $0 \leq \lambda \leq 1$,
$$f(\lambda x_1 + (1 - \lambda) x_2) \leq \lambda f(x_1) + (1 - \lambda) f(x_2). \quad (1.3)$$
If strict inequality holds in expression (1.3) for $x_1 \neq x_2$ and $0 < \lambda < 1$, then $f$ is a strictly convex function.
Let $S$ be a convex subset of $\mathbb{R}^n$, and $f$ a real function defined on $S$. The function $f$ is a concave function if for any $x_1, x_2 \in S$ and $0 \leq \lambda \leq 1$,
$$f(\lambda x_1 + (1 - \lambda) x_2) \geq \lambda f(x_1) + (1 - \lambda) f(x_2). \quad (1.4)$$
If strict inequality holds in expression (1.4) for $x_1 \neq x_2$ and $0 < \lambda < 1$, then $f$ is a strictly concave function.
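The definitions above can be spot-checked numerically. The following sketch is our own illustration (the sample functions and tolerances are assumptions, not from the book): it samples random point pairs and random $\lambda \in [0, 1]$ and verifies the convexity inequality (1.3) for $f(x) = x^2$ and the concavity inequality (1.4) for $g(x) = -x^2$.

```python
# Numerical spot-check (illustrative, not from the book) of the convexity
# inequality (1.3) for f(x) = x^2 and the concavity inequality (1.4) for
# g(x) = -x^2, over random point pairs and random lambda in [0, 1].
import random

f = lambda x: x * x          # convex
g = lambda x: -x * x         # concave

random.seed(0)
for _ in range(1000):
    x1, x2 = random.uniform(-5, 5), random.uniform(-5, 5)
    lam = random.random()
    xm = lam * x1 + (1 - lam) * x2
    # convex: f at the combination never exceeds the chord
    assert f(xm) <= lam * f(x1) + (1 - lam) * f(x2) + 1e-12
    # concave: g at the combination never falls below the chord
    assert g(xm) >= lam * g(x1) + (1 - lam) * g(x2) - 1e-12
print("convexity/concavity inequalities hold on all samples")
```

A sampling check like this can, of course, only refute convexity, never prove it; the propositions that follow give the structural results that do.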
Let $f_1, \dots, f_k$ be convex functions defined on a convex subset $S \subseteq \mathbb{R}^n$. Their summation $f_1 + \dots + f_k$ is convex, and if at least one $f_i$ is a strictly convex function, then their summation is strictly convex.
Let $\alpha$ be a positive number and $f$ be a (strictly) convex function defined on a convex subset $S \subseteq \mathbb{R}^n$. Then the product $\alpha f$ is (strictly) convex.
Let $f$ be a (strictly) convex function defined on a convex subset $S \subseteq \mathbb{R}^n$, and $g$ be an increasing convex function defined on the range of $f$ in $\mathbb{R}$. Then, the composite function $g(f(x))$ defined on $S$ is a (strictly) convex function.
Let $f_1, \dots, f_k$ be convex functions defined on a convex subset $S \subseteq \mathbb{R}^n$. If these functions are bounded from above, their pointwise supremum $f(x) = \sup_i f_i(x)$ is a convex function on $S$.
Let $f_1, \dots, f_k$ be concave functions defined on a convex subset $S \subseteq \mathbb{R}^n$. If these functions are bounded from below, their pointwise infimum $f(x) = \inf_i f_i(x)$ is a concave function on $S$.
We introduce the following definitions for the solution of general nonlinear optimization problems:
A point $x^*$ is called a local minimum if there exists a ball of radius $\varepsilon > 0$ around $x^*$, $B_\varepsilon(x^*)$, such that $f(x^*) \leq f(x)$ for all feasible $x \in B_\varepsilon(x^*)$.
A point $x^*$ is called a global minimum if $f(x^*) \leq f(x)$ for all feasible $x$.
A constrained nonlinear optimization problem, which aims to minimize a real‐valued function $f(x)$ subject to the inequality constraints $g_i(x) \leq 0$, $i = 1, \dots, p$, and the equality constraints $h_j(x) = 0$, $j = 1, \dots, q$, is denoted as
$$\min_{x} \; f(x) \quad \text{s.t.} \quad g_i(x) \leq 0, \; i = 1, \dots, p, \quad h_j(x) = 0, \; j = 1, \dots, q. \quad (1.10)$$
Problem (1.10) is a nonlinear optimization problem if and only if at least one of $f, g_1, \dots, g_p, h_1, \dots, h_q$ is a nonlinear function. We assume that the aforementioned functions are continuous and differentiable.
An inequality constraint $g_i(x) \leq 0$ is called active at a point $\bar{x}$ if $g_i(\bar{x}) = 0$. Conversely, it is called inactive at $\bar{x}$ if $g_i(\bar{x}) < 0$.
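The active/inactive distinction can be illustrated with a small sketch. The constraints, the helper name `classify_constraints`, and the tolerance are our own illustrative choices, not from the book: each $g_i$ is evaluated at a feasible point and classified by whether it holds with equality.

```python
# Tiny illustrative example (names are ours, not the book's): classify the
# inequality constraints g_i(x) <= 0 as active or inactive at a given point.

def classify_constraints(constraints, x, tol=1e-9):
    """Return {'active': [...], 'inactive': [...]} of constraint indices."""
    status = {"active": [], "inactive": []}
    for i, gi in enumerate(constraints):
        value = gi(x)
        assert value <= tol, f"g_{i} violated at x: infeasible point"
        status["active" if abs(value) <= tol else "inactive"].append(i)
    return status

# g_0(x) = x1 + x2 - 1 <= 0,  g_1(x) = -x1 <= 0,  g_2(x) = -x2 <= 0
g = [lambda x: x[0] + x[1] - 1, lambda x: -x[0], lambda x: -x[1]]
print(classify_constraints(g, (1.0, 0.0)))   # g_0 and g_2 active, g_1 inactive
```

The set of active constraints at the optimum is exactly the object that the critical regions of a multi-parametric solution are built around in the later chapters.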
If one step of the dual simplex algorithm consists of changing one element of the active set, then the dual pivot involving the corresponding constraint yields the updated active set.
The first‐order constraint qualifications presented in the following are necessary prerequisites for identifying whether a feasible point $\bar{x}$ is a local optimum of the function $f$.
Linear independence constraint qualification: The gradients $\nabla g_i(\bar{x})$ for all active inequality constraints $i$ and $\nabla h_j(\bar{x})$ for all equality constraints $j$ are linearly independent.
Slater constraint qualification: The constraints $g_i$ for all active $i$ are pseudo‐convex¹ at $\bar{x}$, while the constraints $h_j$ for all $j$ are quasi‐convex or quasi‐concave.² In addition, the gradients $\nabla h_j(\bar{x})$ are linearly independent, and there exists an $\hat{x}$ such that $g_i(\hat{x}) < 0$ for all active $i$ and $h_j(\hat{x}) = 0$ for all $j$.
Let $f$ and $g_i$ be differentiable at a feasible solution $\bar{x}$, and let $h_j$ have continuous partial derivatives at $\bar{x}$. In addition, let the active inequality constraints at $\bar{x}$ be known. Then, if one of the aforementioned constraint qualifications holds, there exist Lagrange multipliers $\lambda_i \geq 0$ and $\mu_j$ such that
$$\nabla f(\bar{x}) + \sum_{i=1}^{p} \lambda_i \nabla g_i(\bar{x}) + \sum_{j=1}^{q} \mu_j \nabla h_j(\bar{x}) = 0,$$
$$\lambda_i g_i(\bar{x}) = 0, \quad \lambda_i \geq 0, \quad g_i(\bar{x}) \leq 0, \quad h_j(\bar{x}) = 0.$$
These conditions are the Karush–Kuhn–Tucker (KKT) necessary conditions, and they are the basis for the solution of nonlinear optimization problems.
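As a sanity check on the KKT conditions, consider a toy problem of our own (not an example from the book): minimize $x_1^2 + x_2^2$ subject to $g(x) = 1 - x_1 - x_2 \leq 0$. The candidate $\bar{x} = (0.5, 0.5)$ with multiplier $\lambda = 1$ should satisfy stationarity, primal and dual feasibility, and complementary slackness, which the sketch below verifies directly.

```python
# Illustrative check (our own toy problem, not the book's example) of the KKT
# conditions for  min x1^2 + x2^2  s.t.  g(x) = 1 - x1 - x2 <= 0.
# Candidate: x* = (0.5, 0.5) with multiplier lambda* = 1.

x_star = (0.5, 0.5)
lam = 1.0

grad_f = (2 * x_star[0], 2 * x_star[1])      # gradient of the objective
grad_g = (-1.0, -1.0)                        # gradient of the constraint
g_val = 1 - x_star[0] - x_star[1]            # constraint value at x*

# Stationarity: grad f + lambda * grad g = 0
stationarity = [grad_f[k] + lam * grad_g[k] for k in range(2)]
assert all(abs(s) < 1e-12 for s in stationarity)
assert g_val <= 1e-12                        # primal feasibility
assert lam >= 0                              # dual feasibility
assert abs(lam * g_val) < 1e-12              # complementary slackness
print("KKT conditions verified at x* =", x_star)
```

Because the objective is strictly convex and the constraint is linear, the KKT conditions are also sufficient here, so $(0.5, 0.5)$ is the global minimum.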
Consider the sets $J^{+} = \{ j : \mu_j > 0 \}$ and $J^{-} = \{ j : \mu_j < 0 \}$. Then, if the following conditions hold:
$f$ is pseudo‐convex at $\bar{x}$ with respect to all other feasible points $x$,
$g_i$ for all active $i$ are quasi‐convex at $\bar{x}$ with respect to all other feasible points $x$,
$h_j$ for all $j \in J^{+}$ are quasi‐convex at $\bar{x}$ with respect to all other feasible points $x$,
$h_j$ for all $j \in J^{-}$ are quasi‐concave at $\bar{x}$ with respect to all other feasible points $x$,
then $\bar{x}$ is a global optimum of problem (1.10). If the aforementioned conditions hold only within a ball of radius $\varepsilon > 0$ around $\bar{x}$, then $\bar{x}$ is a local optimum of problem (1.10).
Consider the following problem:
$$\min_{x} \; f(x) \quad \text{s.t.} \quad h_j(x) = \theta_j, \; j = 1, \dots, q. \quad (1.12)$$
Let $\bar{x}$ be the global minimum of problem (1.12), and assume that the gradients of the equality constraints are linearly independent. In addition, assume that the corresponding Lagrange multiplier vector is $\bar{\mu}$. The vector $\theta$ is a perturbation vector. The solution of problem (1.12) is a function of the perturbation vector, along with the multiplier. Hence, the Lagrange function can be written as
$$L(x, \mu; \theta) = f(x) + \sum_{j=1}^{q} \mu_j \left( h_j(x) - \theta_j \right).$$
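A small worked instance of this setup, chosen by us for illustration (it is not the book's example), makes the parametric dependence explicit: for $\min \tfrac{1}{2}(x_1^2 + x_2^2)$ subject to $x_1 + x_2 = 1 + \theta$, solving the KKT system by hand gives $x_1^*(\theta) = x_2^*(\theta) = (1+\theta)/2$ and $\mu^*(\theta) = -(1+\theta)/2$, so both the optimizer and the multiplier are affine functions of the perturbation, and the optimal objective satisfies $\mathrm{d}f^*/\mathrm{d}\theta = -\mu^*(\theta)$. The sketch below verifies both claims numerically.

```python
# Illustrative sketch (our own toy instance of the setup in (1.12)):
#   min 0.5*(x1^2 + x2^2)   s.t.   x1 + x2 = 1 + theta.
# The KKT system gives x1* = x2* = (1+theta)/2 and mu* = -(1+theta)/2:
# optimizer and multiplier are explicit (here affine) functions of theta.

def parametric_solution(theta):
    x = (1 + theta) / 2.0
    mu = -(1 + theta) / 2.0
    return (x, x), mu

def objective(x):
    return 0.5 * (x[0] ** 2 + x[1] ** 2)

for theta in (-1.0, 0.0, 0.5, 2.0):
    x_star, mu = parametric_solution(theta)
    assert abs(x_star[0] + x_star[1] - (1 + theta)) < 1e-12  # feasibility
    # finite-difference check of the sensitivity d f*/d theta = -mu*
    eps = 1e-6
    f_plus = objective(parametric_solution(theta + eps)[0])
    f_minus = objective(parametric_solution(theta - eps)[0])
    assert abs((f_plus - f_minus) / (2 * eps) + mu) < 1e-6
print("parametric solution and sensitivity verified")
```

This is precisely the behavior that the basic sensitivity theorem formalizes and that multi-parametric programming exploits: the solution map $x^*(\theta)$, $\mu^*(\theta)$ remains valid over a whole region of the parameter space rather than at a single point.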
