Nonlinear Opt.: Quadratic Problems 2



Theory

In this section on nonlinear optimization we consider problems that differ from linear optimization problems only in their target function F(x): besides the linear term, it also contains squared terms x_j^2 and/or mixed products x_i x_j for some or all i ≠ j.

We describe solution approaches for quadratic maximization problems with a concave target function (and, correspondingly, minimization problems with a convex target function).

First we define the quadratic form; by this we mean a function of the form

x^{T} C x = \sum_{i=1}^{n} \sum_{j=1}^{n} c_{ij} x_i x_j

where x is a column vector of dimension n and C is a symmetric (n × n) matrix with real-valued coefficients.
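
As a small numerical illustration (not part of the original article), such a quadratic form can be evaluated with NumPy; the matrix C and the vector x below are arbitrary example values:

import numpy as np

C = np.array([[2.0, -2.0],
              [-2.0, 4.0]])   # symmetric (n x n) matrix with real-valued coefficients
x = np.array([3.0, 1.0])      # column vector of dimension n (example values)

value = x @ C @ x             # x^T C x = sum over i, j of c_ij * x_i * x_j
print(value)                  # 2*9 + (-2)*3 + (-2)*3 + 4*1 = 10.0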

So we can convert the quadratic part of the target function F(x) of any quadratic optimization problem as follows:

First we write down the quadratic part with real-valued coefficients d_{ij}:

\sum_{j=1}^{n} d_{jj} x_j^2 + \sum_{i<j} d_{ij} x_i x_j

If we symmetrize the term (setting c_{ij} = c_{ji} = d_{ij} for i < j and c_{jj} = 2 d_{jj}) and put the factor \frac{1}{2} in front, we can also write:

\frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} c_{ij} x_i x_j


If we sum up the coefficients c_{ij} into a symmetric (n × n) matrix C and the variables x_i into a real-valued column vector x, the right-hand side is one half of the product of the row vector x^{T} with the column vector Cx:

\frac{1}{2} x^{T} C x
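
The following sketch (an illustrative addition, using the coefficients of the later example as assumed input) shows this symmetrization step numerically: the coefficients d_{ij} are collected in the upper triangle of a matrix D, and C = D + Dᵀ is the symmetric matrix, so that one half of xᵀCx equals the original quadratic part:

import numpy as np

# Coefficients of a quadratic part d_11*x1^2 + d_12*x1*x2 + d_22*x2^2, collected in
# the upper triangle of D (illustrative values: d_11 = 1, d_12 = -2, d_22 = 2).
D = np.array([[1.0, -2.0],
              [0.0,  2.0]])
C = D + D.T                   # symmetric matrix [[2, -2], [-2, 4]]: c_jj = 2*d_jj, c_ij = c_ji = d_ij

x = np.array([3.0, 1.0])
print(x @ D @ x)              # quadratic part: 9 - 6 + 2 = 5.0
print(0.5 * x @ C @ x)        # 1/2 * x^T C x gives the same value, 5.0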

Starting from this second notation, we can frame every quadratic maximization problem with linear side conditions as follows:

Maximize F(x) = c^{T} x + \frac{1}{2} x^{T} C x


Restriction:

Ax \le b

x \ge 0


In quadratic minimization problems we use the target function

Minimize F(x) = c^{T} x + \frac{1}{2} x^{T} C x

with a convex (positive semidefinite) quadratic form x^{T} C x.


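As an illustrative sketch (not part of the original article) of how such a problem can be solved numerically, the following uses scipy.optimize as one possible solver; the data c, C, A and b are assumed example values (they correspond to the insulation example discussed below):

import numpy as np
from scipy.optimize import minimize

c = np.array([0.0, 0.0])                    # linear cost coefficients (assumed)
C = np.array([[2.0, -2.0], [-2.0, 4.0]])    # symmetric, positive definite -> convex costs
A = np.array([[2.0, 1.0]])                  # coefficients of the restriction A x <= b
b = np.array([40.0])

def F(x):
    # target function F(x) = c^T x + 1/2 * x^T C x
    return c @ x + 0.5 * x @ C @ x

res = minimize(
    F,
    x0=np.array([1.0, 1.0]),                                    # arbitrary feasible starting point
    method="SLSQP",
    bounds=[(0, None), (0, None)],                               # x >= 0
    constraints=[{"type": "ineq", "fun": lambda x: b - A @ x}],  # A x <= b  <=>  b - A x >= 0
)
print(res.x, res.fun)                       # approximately [0, 0] with minimal costs 0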

Example

There is a business that sells two types of insulation material. For this it uses two inputs: raw wood and recycled wood wool. The cost of recycling old furniture and building timber is higher than that of raw wood, but the recycled type has to contain twice as much recycled material as raw material. In total they can only produce 20 units of the mixed recycled insulation, while the pure wood insulation can be produced in unlimited quantity. The business wants to minimize its costs.

c(x_2)=-2x_1+2x_2


Total costs: c_t (x_1,x_2)= x_1^2 -2x_1x_2 +2x_2^2


Restriction:

x_1,x_2 \ge 0

2x_1+x_2\le 40


The example shows an optimization problem with linear restrictions, similar to linear optimization, but with a quadratic target function. In the following we show how to transform this function.

First we transform the cost function into the quadratic form:

c(x_1,x_2)= x_1^2 -2x_1x_2 +2x_2^2


\Rightarrow c(x_1,x_2)= (x_1 -2x_2)x_1+(2x_2)x_2


\Rightarrow c(x_1,x_2)= (x_1 -x_2)x_1+(-x_1+2x_2)x_2


\Rightarrow c(x_1,x_2)= \frac{1}{2} [(2x_1 -2x_2)x_1+(-2x_1+4x_2)x_2]


\Rightarrow C=\begin{pmatrix} 2&-2 \\ -2&4 \end{pmatrix}


c(x_1,x_2)=\begin{pmatrix} 0 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} +\frac{1}{2}\begin{pmatrix} x_1 & x_2 \end{pmatrix} \begin{pmatrix} 2&-2 \\ -2&4 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
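
As a quick plausibility check (an illustrative addition, not part of the original text), one can verify numerically that this matrix notation reproduces the original cost polynomial at a few sample points:

import numpy as np

C = np.array([[2.0, -2.0], [-2.0, 4.0]])

def cost_poly(x1, x2):
    # original cost function of the example
    return x1**2 - 2*x1*x2 + 2*x2**2

def cost_matrix(x1, x2):
    # matrix notation: (0 0) x + 1/2 * x^T C x
    x = np.array([x1, x2])
    return 0.5 * x @ C @ x

for x1, x2 in [(0.0, 0.0), (1.0, 2.0), (10.0, 5.0), (3.5, 7.25)]:
    assert np.isclose(cost_poly(x1, x2), cost_matrix(x1, x2))
print("matrix notation matches the cost polynomial at all sample points")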


Secondly, we must check whether the function is convex or concave, i.e. whether the quadratic form is positive or negative definite. It should be positive definite, so that the stationary point represents a global minimum of the costs.

To test the curvature of the function, we set up the Hessian matrix.

\Rightarrow H(x_1,x_2)= \begin{pmatrix} 2&-2 \\ -2&4 \end{pmatrix}


If all leading principal minors are greater than (or equal to) zero, then the matrix is positive definite (positive semidefinite) and the function is convex.


\begin{vmatrix} 2 \end{vmatrix} = 2 , \qquad \begin{vmatrix} 2&-2 \\ -2&4 \end{vmatrix} = 2 \cdot 4 - (-2)\cdot(-2) = 4


Both minors are greater than zero, and as a consequence of the convexity there must be a global minimum of costs. To calculate it, there are different methods, e.g. setting the gradient equal to zero and solving the resulting system of equations.
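
The following minimal sketch (an illustrative addition, assuming that the unconstrained stationary point also satisfies the restrictions, which it does here) computes the leading principal minors of the Hessian and then solves the gradient equation ∇c(x) = Hx = 0:

import numpy as np

H = np.array([[2.0, -2.0],
              [-2.0, 4.0]])                     # Hessian of the cost function

# leading principal minors: determinants of the upper-left 1x1 and 2x2 submatrices
minors = [np.linalg.det(H[:k, :k]) for k in (1, 2)]
print(minors)                                   # [2.0, 4.0] -> both > 0, H is positive definite

# gradient of c(x1, x2) = x1^2 - 2*x1*x2 + 2*x2^2 is (2*x1 - 2*x2, -2*x1 + 4*x2) = H @ x;
# setting it equal to zero and solving the linear system:
x_star = np.linalg.solve(H, np.zeros(2))
print(x_star)                                   # [0, 0]; feasible, since x >= 0 and 2*x1 + x2 <= 40 hold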


Sources

Domschke, Wolfgang, Einführung in Operations Research, Berlin 2005

Domschke, Wolfgang, Übungen und Fallbeispiele zum Operations Research, Berlin 2005



Authors: 369772,380974,375495