Convex Optimization in Communications
TRANSCRIPT
Outline
1. Introduction
2. Historical development
3. Classification of optimization
4. Convex optimization
5. Subclasses of convex optimization
6. Advanced optimization methods
7. Applications
8. Conclusion
Introduction
Optimization is the process of finding the alternative with the lowest cost or the highest achievable performance under the given constraints.

Why is optimization necessary?
• Minimum effort
• Saves time
• Reduces cost and errors
• Efficient
Examples
Portfolio optimization
• objective: overall risk or return variance
• variables: amounts invested in different assets
• constraints: budget, max./min. investment per asset, minimum return

Device sizing in electronic circuits
• objective: minimize power consumption
• variables: device widths and lengths
• constraints: manufacturing limits, timing requirements, maximum area
Historical development
• George Bernard Dantzig: linear programming and the simplex method (1947)
• Harold William Kuhn: necessary and sufficient conditions for the optimal solution of programming problems
• Albert William Tucker: necessary and sufficient conditions for the optimal solution of programming problems; nonlinear programming
Classification of optimization
Optimization problems are classified here into two categories: convex and concave.
Convex optimization
Convex function: a function for which the line segment joining any two points on its graph lies on or above the graph.
Example: f(x) = x^2 is convex, since f'(x) = 2x and f''(x) = 2 > 0.
[Figure: graph of a convex function f(x) over an interval [x_a, x_b], with the chord lying above the curve and f''(x) ≥ 0]
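The defining chord inequality f(θx_a + (1−θ)x_b) ≤ θf(x_a) + (1−θ)f(x_b) can be checked numerically. A minimal sketch (the function name and sampling scheme are illustrative, not from the source):

```python
import numpy as np

def is_convex_on_samples(f, lo, hi, n=200, tol=1e-9):
    """Numerically test the chord inequality
    f(t*a + (1-t)*b) <= t*f(a) + (1-t)*f(b)
    on random point pairs in [lo, hi]."""
    rng = np.random.default_rng(0)
    a = rng.uniform(lo, hi, n)
    b = rng.uniform(lo, hi, n)
    t = rng.uniform(0.0, 1.0, n)
    lhs = f(t * a + (1 - t) * b)
    rhs = t * f(a) + (1 - t) * f(b)
    return bool(np.all(lhs <= rhs + tol))

print(is_convex_on_samples(lambda x: x**2, -5, 5))       # x^2 passes the test
print(is_convex_on_samples(lambda x: -8 * x**2, -5, 5))  # -8x^2 fails it
```

Such sampling can only refute convexity, not prove it; the second-derivative test above is the rigorous criterion.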
Convex set
A convex set is a set of points such that, for any two points A and B in the set, the line segment AB joining them lies entirely within the set.
[Figure: a convex set vs. a non-convex set]
Convex optimization problem
A convex optimization problem is one of the form

    minimize   f0(x)
    subject to fi(x) = 0,  i = 1, . . . , p
               gi(x) ≤ 0,  i = 1, . . . , m

x : optimization variable
f0 : objective function
fi, gi : constraint functions
For the problem to be convex, f0 and the gi must be convex functions and the equality constraints fi must be affine.
Constraints
Based on the constraints, minimization problems fall into three classes:
• Unconstrained minimization
• Equality constrained minimization
• Inequality constrained minimization
Unconstrained minimization: least squares

    minimize   ||Ax − b||_2^2

Solving least-squares problems:
• analytical solution: x* = (A^T A)^{-1} A^T b
• a mature technology

Using least squares:
• regression analysis, statistical estimation problems
• standard techniques: weighted least squares, regularization
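A minimal numerical sketch of the closed-form solution above, using NumPy (the matrix A and vector b are arbitrary illustrative data):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 3))   # tall matrix: 20 equations, 3 unknowns
b = rng.standard_normal(20)

# Analytical solution via the normal equations: x* = (A^T A)^{-1} A^T b
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# The numerically preferred route: a QR/SVD-based solver
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_normal, x_lstsq))
```

In practice the normal equations square the condition number of A, so library solvers like `lstsq` are preferred.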
Equality constrained minimization

    minimize   f(x)
    subject to g_i(x) = 0,  i = 1, 2, . . . , m

This problem can be solved by:
1. Direct substitution
2. Constrained variation
3. Lagrange multipliers
Lagrange multipliers
For instance, consider the optimization problem: minimize f(x, y) subject to g(x, y) = c. We introduce a new variable λ, called a Lagrange multiplier, and define the Lagrange function
    L(x, y, λ) = f(x, y) + λ(g(x, y) − c)
Steps to solve:
1. Find the partial derivatives of L with respect to x, y, and λ.
2. Set each partial derivative equal to zero: L_x = 0, L_y = 0, L_λ = 0.
3. Using L_x = 0 and L_y = 0, solve for x and y in terms of λ.
4. Substitute these solutions into L_λ = 0 so that it is in terms of λ only.
5. Solve for λ and use this value to find the optimal x and y.
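For a quadratic objective with a linear constraint, the stationarity conditions above form one linear system. A minimal worked sketch (the problem data is illustrative): minimize f(x, y) = x² + y² subject to x + y = 1.

```python
import numpy as np

# Stationarity of L(x, y, lam) = x^2 + y^2 + lam*(x + y - 1):
#   L_x   = 2x + lam    = 0
#   L_y   = 2y + lam    = 0
#   L_lam = x + y - 1   = 0
K = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0]])
rhs = np.array([0.0, 0.0, 1.0])

x, y, lam = np.linalg.solve(K, rhs)
print(x, y, lam)   # optimum at x = y = 0.5 with lam = -1
```

By symmetry the minimum splits the budget equally between x and y, which the linear solve confirms.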
Inequality constrained minimization
Introduce slack variables y_j^2, so that g_j(x) + y_j^2 = 0, j = 1, 2, . . . , m. The problem now involves the constraints
    G_j(x, Y) = g_j(x) + y_j^2 = 0,  j = 1, 2, . . . , m
where Y = {y_1, y_2, . . . , y_m}^T is the vector of slack variables. The transformed problem can be solved conveniently by the method of Lagrange multipliers; alternatively, the original problem can be solved directly using the KKT conditions.
Subclasses of convex optimization
• Least squares (LS)
• Linear programming (LP)
• Quadratic programming (QP)
• Second-order cone programming (SOCP)
• Semidefinite programming (SDP)
• Geometric programming
Linear programming

    minimize   c^T x
    subject to a_i^T x ≤ b_i,  i = 1, . . . , m

Solving linear programs:
• no analytical formula for the solution
• reliable and efficient algorithms and software

Using linear programming:
• not as easy to recognize as least-squares problems
• example: the Chebyshev approximation problem
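The Chebyshev approximation problem, minimize max_i |a_i^T x − b_i|, can be recast as an LP by introducing an epigraph variable t. A minimal sketch with illustrative random data (requires SciPy):

```python
import numpy as np
from scipy.optimize import linprog

# Chebyshev approximation: minimize max_i |a_i^T x - b_i|,
# recast as the LP  minimize t  s.t.  -t <= a_i^T x - b_i <= t.
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 4))
b = rng.standard_normal(30)
m, n = A.shape

c = np.zeros(n + 1)
c[-1] = 1.0                                  # objective: minimize t
A_ub = np.block([[ A, -np.ones((m, 1))],     #   a_i^T x - b_i <= t
                 [-A, -np.ones((m, 1))]])    # -(a_i^T x - b_i) <= t
b_ub = np.concatenate([b, -b])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * (n + 1))  # all variables free
x_cheb, t_opt = res.x[:n], res.x[-1]
print(res.success, t_opt)
```

Note that `linprog` bounds variables to [0, ∞) by default, so the free-variable bounds must be set explicitly.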
Quadratic programming
A quadratic programming problem is of the form

    minimize   (1/2) x^T Q x + c^T x
    subject to A x ≤ b

Linear programming is the special case with Q = 0, so QP generalizes LP.
Solution methods: interior point, augmented Lagrangian, conjugate gradient, and extensions of the simplex algorithm (active-set methods).
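For the equality-constrained case, minimize ½xᵀQx + cᵀx subject to Ax = b with Q positive definite, the KKT conditions reduce to a single linear system. A minimal sketch with illustrative random data:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 5, 2
M = rng.standard_normal((n, n))
Q = M @ M.T + n * np.eye(n)        # symmetric positive definite
c = rng.standard_normal(n)
A = rng.standard_normal((p, n))
b = rng.standard_normal(p)

# KKT system: [Q A^T; A 0] [x; nu] = [-c; b]
KKT = np.block([[Q, A.T],
                [A, np.zeros((p, p))]])
sol = np.linalg.solve(KKT, np.concatenate([-c, b]))
x_opt, nu = sol[:n], sol[n:]

print(np.allclose(A @ x_opt, b))   # constraints hold at the optimum
```

Inequality constraints require iterative methods (active-set or interior point) rather than one linear solve.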
Geometric programming
A geometric program (GP) is an optimization problem of the form

    minimize   f0(x)
    subject to fi(x) ≤ 1,  i = 1, 2, . . . , m
               gi(x) = 1,  i = 1, 2, . . . , p

where the fi are posynomials and the gi are monomials.

Applications:
• component sizing in IC design
• power control
• parameter estimation via logistic regression in statistics
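A GP becomes convex under the change of variables u = log x. A minimal sketch on the toy problem minimize 1/(xy) subject to x + y ≤ 1 (illustrative data, with optimum at x = y = 1/2; requires SciPy):

```python
import numpy as np
from scipy.optimize import minimize

# GP: minimize 1/(x*y)  s.t.  x + y <= 1,  x, y > 0.
# With u = log x, v = log y this is the convex problem
#   minimize -(u + v)   s.t.   log(exp(u) + exp(v)) <= 0
obj = lambda z: -(z[0] + z[1])
con = {"type": "ineq",   # SLSQP expects constraints in >= 0 form
       "fun": lambda z: -np.log(np.exp(z[0]) + np.exp(z[1]))}

res = minimize(obj, x0=[-1.0, -1.0], constraints=[con], method="SLSQP")
x, y = np.exp(res.x)
print(x, y)
```

The log-sum-exp constraint is convex in (u, v), so a local solver such as SLSQP finds the global optimum here.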
Second-order cone programming (SOCP)
A second-order cone program has the form

    minimize   f^T x
    subject to ||A_i x + b_i||_2 ≤ c_i^T x + d_i,  i = 1, . . . , m

with variable x ∈ R^n.

Applications:
• robust linear programming
• filter design
Semidefinite programming (SDP)
SDPs are a special case of cone programming. All linear programs can be expressed as SDPs, and via hierarchies of SDPs the solutions of polynomial optimization problems can be approximated. Semidefinite programming has been used in the optimization of complex systems, and SDPs can serve as sophisticated convex approximations of non-convex problems.
Convex optimization hierarchy
Least squares → Linear programming → Quadratic programming → Geometric programming → SOCP → SDP
Moving right along the hierarchy, the problem classes become more general; moving left, the solvers become more efficient.
Advanced optimization methods: interior point methods
Interior point methods are a class of algorithms that solve linear and nonlinear convex optimization problems.

Why were interior point methods developed?
• Khachiyan, 1979: ellipsoid method, polynomial running time
• Karmarkar, 1984: projective algorithm, polynomial running time
• Nesterov and Nemirovski, 1995: primal-dual algorithm, polynomial running time
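A minimal sketch of the log-barrier idea behind interior point methods, on the toy problem minimize x subject to x ≥ 1 (whose solution is x = 1). All step sizes and tolerances are illustrative:

```python
# Barrier method for: minimize x  s.t.  x >= 1.
# Centering problem: minimize  phi(x) = t*x - log(x - 1)  over x > 1,
# whose exact minimizer is x(t) = 1 + 1/t (the "central path").
def barrier_solve(t_init=1.0, mu=10.0, tol=1e-8):
    t, x = t_init, 2.0                 # strictly feasible starting point
    while 1.0 / t > tol:               # duality-gap-style stopping rule
        # Damped Newton's method on phi
        for _ in range(50):
            grad = t - 1.0 / (x - 1.0)
            hess = 1.0 / (x - 1.0) ** 2
            step = grad / hess
            while x - step <= 1.0:     # damp to stay strictly feasible
                step *= 0.5
            x -= step
            if abs(step) < 1e-12:
                break
        t *= mu                        # tighten the barrier
    return x

print(barrier_solve())   # approaches the optimum x = 1
```

Real interior point solvers use the same outer loop, with a multidimensional Newton step and a primal-dual formulation.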
Concave optimization
A concave (non-convex) optimization problem is any problem in which the objective or any of the constraints is non-convex. A function is concave if the line segment joining any two points on its graph lies entirely below or on the graph of f(x).
Example: f(x) = -8x^2 is concave, since f''(x) = -16 < 0.
[Figure: graph of a concave function f(x) over an interval [x_a, x_b], with the chord lying below the curve]
Convex optimization in wireless communications
1. Pulse shaping filter design
2. Transmit beamforming
3. Network resource allocation
4. MMSE precoder design for multi-access communication
5. Robust beamforming
6. Optimal linear decentralized estimation
Design of orthogonal pulse shapes for communications
Objective: find a waveform that minimizes the spectral occupation of the communication scheme.
Constraint: the filters must be self-orthogonal at translations by integer multiples of the symbol period T.
Reformulation: by recasting the design problem in terms of the autocorrelation sequence of the pulse-shaping filter, the translation-orthogonality constraints become linear and, hence, convex. The transformed (autocorrelation-domain) problem is a convex semidefinite program (SDP) whose globally optimal solution can be found efficiently using interior point methods.
A multiuser MIMO transmit beamformer based on the statistics of the signal-to-leakage ratio
Objectives: maximize the signal-to-leakage ratio (SLR) and minimize the outage probability
    P_out = Pr{ SLNR_i ≤ γ_0 }
where γ_0 is the target SLNR threshold.
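For single-stream users, the beamformer maximizing user i's SLNR is the principal generalized eigenvector of the pair (h_i h_iᴴ, Σ_{j≠i} h_j h_jᴴ + σ²I). A minimal sketch with illustrative random channels (requires SciPy; function names are not from the source):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(4)
n_tx, n_users, sigma2 = 4, 3, 0.1

# i.i.d. complex Gaussian channel vectors, one row per user
H = (rng.standard_normal((n_users, n_tx))
     + 1j * rng.standard_normal((n_users, n_tx))) / np.sqrt(2)

def slnr_beamformer(H, i, sigma2):
    """Unit-norm w maximizing |h_i^H w|^2 / (sum_{j!=i} |h_j^H w|^2 + sigma2)."""
    h_i = H[i]
    R_sig = np.outer(h_i, h_i.conj())
    R_leak = sum(np.outer(H[j], H[j].conj())
                 for j in range(len(H)) if j != i) + sigma2 * np.eye(H.shape[1])
    vals, vecs = eigh(R_sig, R_leak)   # generalized eigenproblem
    w = vecs[:, -1]                    # eigenvector of the largest eigenvalue
    return w / np.linalg.norm(w)

def slnr(H, i, w, sigma2):
    num = abs(H[i].conj() @ w) ** 2
    den = sum(abs(H[j].conj() @ w) ** 2 for j in range(len(H)) if j != i) + sigma2
    return num / den

w_opt = slnr_beamformer(H, 0, sigma2)
w_rand = rng.standard_normal(n_tx) + 1j * rng.standard_normal(n_tx)
w_rand /= np.linalg.norm(w_rand)
print(slnr(H, 0, w_opt, sigma2) >= slnr(H, 0, w_rand, sigma2))
```

Because R_sig is rank one, the optimizer is proportional to (R_leak)⁻¹h_i, which the generalized eigendecomposition recovers.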
Applications
• Engineering
• Managerial economics
• Finance
• Pharmaceutics
• Statistics
• Data mining
Conclusion
The convexity property can make optimization in some sense "easier" than the general case: for example, any local minimum must be a global minimum. With recent improvements in computing and in optimization theory, convex minimization is nearly as straightforward as linear programming. Many optimization problems can be reformulated as convex minimization problems.
References
[1] S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.
[2] Y. Ye, Interior Point Algorithms: Theory and Analysis, Wiley-Interscience Series in Discrete Mathematics and Optimization, John Wiley & Sons, 1997.
[3] K. Deb, Optimization for Engineering Design: Algorithms and Examples, PHI Pvt. Ltd., 1998.
[4] S. S. Rao, Engineering Optimization: Theory and Practice, New Age International (P) Ltd., 2001.
[5] T. N. Davidson, Z.-Q. Luo, and K. M. Wong, "Design of Orthogonal Pulse Shapes for Communications via Semidefinite Programming," IEEE Transactions on Signal Processing, vol. 48, no. 5, May 2000.
[6] E. Bjornson, M. Bengtsson, and B. Ottersten, "Optimal Multiuser Transmit Beamforming: A Difficult Problem with a Simple Solution Structure," IEEE Signal Processing Magazine, July 2014.
THANK YOU