
CHAPTER 13 Optimization

    Exercises 13.1

1 (a) The derivative is $f'(x) = e^x - e^{-x}$, which is zero if and only if $x = 0$. Furthermore, $f'(x) < 0$ for $x < 0$ and $f'(x) > 0$ for $x > 0$. Therefore $f$ is unimodal on $(-\infty, \infty)$ and has an absolute minimum at $(x, y) = (0, 2)$.

1 (b) The expression $x^6$ is positive if and only if $x \neq 0$, and equals zero at $x = 0$. Therefore $(0, 0)$ is the absolute minimum. The derivative is $f'(x) = 6x^5$, which is negative for $x < 0$ and positive for $x > 0$, so $f$ is unimodal on $(-\infty, \infty)$.

1 (c) The derivative $f'(x) = 8x^3 + 1$ is equal to zero if and only if $x = -1/2$, is negative for $x < -1/2$, and is positive for $x > -1/2$. Therefore $(-1/2, -3/8)$ is the absolute minimum of the unimodal function.

1 (d) The derivative is $f'(x) = 1 - 1/x$, which equals zero if and only if $x = 1$, is negative for $0 < x < 1$, and is positive for $x > 1$. Thus $f$ is unimodal on $(0, \infty)$ and the absolute minimum occurs at $(1, 1)$.
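For reference, the objective functions themselves are not restated in this manual; the derivatives and minima above are consistent with the following choices, reconstructed by antidifferentiation (inferences, not quotations from the text):
\[
f_a(x) = e^x + e^{-x}, \qquad f_b(x) = x^6, \qquad f_c(x) = 2x^4 + x, \qquad f_d(x) = x - \ln x.
\]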

2 (a) (, 1)

2 (b) (1, 4)

2 (c) (0, 5)

2 (d) $(\ln 2,\ 2 - 2\ln 2)$

    Computer Problems 13.1

1 (a) The plot shows that the interval $[0, 1]$ contains a relative minimum. According to Theorem 13.2, the number $k$ of Golden Section Search steps needed for five correct decimal places satisfies $g^k(1 - 0)/2 < 0.5 \times 10^{-5}$, where $g = (\sqrt{5}-1)/2$; the smallest such $k$ is 24. The command

>> x=gss(inline('2*x^4+3*x^2-4*x+5'),0,1,24)

results in convergence to the minimum $x = 1/2$.

1 (b) From the plot below, there are two relative minima. The intervals $[-2.5, -1.5]$ and $[0.5, 1.5]$ each contain a minimum. The number of steps needed for each interval is 24, as in (a). The minima $x = -2$ and $x = 1$ are found by gss.
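For the record, the step count $k = 24$ used throughout this problem can be checked directly from the bound above (assuming, as the five-decimal answers suggest, an accuracy requirement of $0.5 \times 10^{-5}$):
\[
g^k \, \frac{1-0}{2} < 0.5 \times 10^{-5}
\quad\Longleftrightarrow\quad
k > \frac{\ln 10^{-5}}{\ln g} = \frac{-11.51}{-0.4812} \approx 23.9,
\]
so $k = 24$ is the smallest admissible number of steps.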


1 (c) Similar to (a). The interval $[0, 1]$ contains a relative minimum. Applying 24 steps of gss gives the approximation $x = 0.47033$.

1 (d) Similar to (a). The interval $[1, 2]$ contains a relative minimum. Applying 24 steps of gss provides the approximation $x = 1.43791$.
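Sauer's Program 13.1 (gss) is not reproduced in this manual. A minimal sketch consistent with the calls gss(f,a,b,k) above is given below; it follows the standard golden section update, so details may differ from the book's version.

function y = gss(f,a,b,k)
% Golden Section Search (sketch): k steps on [a,b]; returns the
% midpoint of the final interval as the estimate of the minimizer.
g = (sqrt(5)-1)/2;                    % golden ratio conjugate, approx 0.618
x1 = a + (1-g)*(b-a); f1 = f(x1);     % two interior evaluation points
x2 = a + g*(b-a);     f2 = f(x2);
for i = 1:k
    if f1 < f2                        % minimizer must lie in [a, x2]
        b = x2; x2 = x1; f2 = f1;     % old x1 becomes the new x2
        x1 = a + (1-g)*(b-a); f1 = f(x1);
    else                              % minimizer must lie in [x1, b]
        a = x1; x1 = x2; f1 = f2;     % old x2 becomes the new x1
        x2 = a + g*(b-a); f2 = f(x2);
    end
end
y = (a+b)/2;

Each step shrinks the bracketing interval by the factor g while reusing one of the two previous function evaluations, which is what makes the 24-step runs above cheap.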

[Plots for Computer Problem 1, panels (a)-(d): graphs of the four objective functions, showing the locations of the relative minima discussed above.]

2 (a) 1/2

2 (b) -2, 1

2 (c) 0.47033

2 (d) 1.43791

3 (a) The squared distance from the point $(x, 1/x)$ to $(2, 3)$ is $D(x) = (x-2)^2 + (1/x - 3)^2$. The derivative is
\[
D'(x) = 2x - 4 - \frac{2}{x^3} + \frac{6}{x^2}.
\]
Newton's Method applied to find a root of $D'(x)$ converges to $r = 0.358555$, corresponding to the nearest point $(0.358555, 2.788973)$ on the curve $y = 1/x$.
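A minimal Newton iteration for this root; the second derivative is entered directly, and the starting guess 0.4 is an assumption, chosen near the minimum since Newton's Method is only locally convergent here:

% Newton's Method on D'(x) = 0 for D(x) = (x-2)^2 + (1/x-3)^2
Dp  = @(x) 2*x - 4 - 2./x.^3 + 6./x.^2;   % D'(x)
Dpp = @(x) 2 + 6./x.^4 - 12./x.^3;        % D''(x)
x = 0.4;                                  % initial guess in [0, 1]
for i = 1:10
    x = x - Dp(x)/Dpp(x);                 % Newton step for the root of D'
end
x                                         % approx 0.358555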

3 (b) Applying Golden Section Search on the interval $[0, 1]$,

>> x=gss(inline('(x-2)^2+(1/x-3)^2'),0,1,30)


produces $x = 0.358555$, as in part (a).

4 (a) Newton's Method finds the maximum distance point to be (0.335281, 0.628079).

4 (b) Golden Section Search finds the same point as in part (a).

5 Program 13.3 can be applied as

x=neldermead(inline('exp(-x(1)^2*x(2)^2)+(x(1)-1)^2+(x(2)-1)^2'),[1;1],1,60)

to converge to (1.20881759, 1.20881759) within 8 decimal places.
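As a quick consistency check (not part of the assigned method): the objective is symmetric in x(1) and x(2), so the symmetric minimum can be confirmed by restricting to the diagonal $x_1 = x_2 = t$ and solving $g'(t) = 0$ with MATLAB's built-in fzero:

% On the diagonal, f reduces to g(t) = exp(-t^4) + 2*(t-1)^2
gp = @(t) -4*t.^3.*exp(-t.^4) + 4*(t-1);  % g'(t)
t = fzero(gp,1.2)                         % t = 1.20881759...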

6 (a) (1.132638, 0.465972), (0.465972, 1.132638)

6 (b) (0.67633357728, 0.67633357728)

7 Program 13.3 can be applied as

x=neldermead(inline('100*(x(2)-x(1)^2)^2+(x(1)-1)^2'),[0;0],1,100)

to converge to (1, 1).
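The same minimum can be cross-checked with MATLAB's built-in Nelder-Mead routine fminsearch, independent of the book's Program 13.3:

% Cross-check with the built-in Nelder-Mead implementation
f = @(x) 100*(x(2)-x(1)^2)^2 + (x(1)-1)^2;
x = fminsearch(f,[0;0])                   % also converges to (1, 1)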

    Exercises 13.2

1 Minimum is (1.2088176, 1.2088176). Different initial conditions will yield answers that differ by about $\epsilon_{\text{mach}}^{1/2}$.
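The exponent $1/2$ here (and in 3(b) below) comes from the standard flatness argument near a nondegenerate minimum; a sketch of the reasoning:
\[
f(x) \approx f(x^*) + \tfrac{1}{2} f''(x^*)(x - x^*)^2,
\]
so in floating point arithmetic $f(x)$ is indistinguishable from $f(x^*)$ whenever $\tfrac{1}{2} f''(x^*)(x - x^*)^2$ falls below $\epsilon_{\text{mach}}|f(x^*)|$, i.e. for all $x$ within roughly $\epsilon_{\text{mach}}^{1/2}$ of $x^*$. In double precision this limits a function-value-based minimizer to about 8 correct digits.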

2 (a) (1.132638, 0.465972), (0.465972, 1.132638)

2 (b) (0.6763, 0.6763)

3 (a) Newton's Method, when applied to the gradient of the Rosenbrock function $F(x_1, x_2) = 100(x_2 - x_1^2)^2 + (x_1 - 1)^2$, converges to the minimum $(x_1, x_2) = (1, 1)$. Newton's Method will be accurate to machine precision, since it is finding a simple root.

3 (b) Steepest Descent also converges to $(1, 1)$, but with only about 8 digits of accuracy in double precision, since the error is of size $\epsilon_{\text{mach}}^{1/2}$.
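A minimal sketch of the iteration in 3(a), with the gradient and Hessian of $F$ entered analytically; the starting guess (0, 0) is an assumption:

% Newton's Method on the gradient of the Rosenbrock function
grad = @(v) [-400*v(1)*(v(2)-v(1)^2) + 2*(v(1)-1);
              200*(v(2)-v(1)^2)];
hess = @(v) [1200*v(1)^2 - 400*v(2) + 2, -400*v(1);
             -400*v(1),                  200];
v = [0; 0];                        % initial guess
for i = 1:20
    v = v - hess(v)\grad(v);       % Newton step: solve H*s = grad F
end
v                                  % converges to (1, 1)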

4 (a) (1.132638, 0.465972), (0.465972, 1.132638)

4 (b) (0.6763, 0.6763)

5 (a) Implement Conjugate Gradient Search as on page 596, using Successive Parabolic Interpolation as the one-dimensional minimizer (a sketch of SPI follows this list). Using initial guess (1, -1), the method converges to the minimum (1.132638, 0.465972). Using initial guess (-1, 1), the method converges to the minimum (0.465972, 1.132638).

5 (b) Similar to (a). Conjugate Gradient Search with SPI converges, depending on the initial guess, to the two minima (0.6763, 0.6763) and (-0.6763, -0.6763).

6 (a) (1.29034, 0.41771)

6 (b) (-0.84545, -0.70095)
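Program 13.2 (Successive Parabolic Interpolation) is likewise not reproduced in this manual. A minimal sketch of the three-point parabola update used as the line minimizer above, with details possibly differing from the book's version:

function x = spi(f,r,s,t,k)
% Successive Parabolic Interpolation (sketch): fit a parabola through
% (r,fr), (s,fs), (t,ft); its vertex x replaces the oldest point.
fr = f(r); fs = f(s); ft = f(t);
for i = 1:k
    x = (r+s)/2 - (fs-fr)*(t-r)*(t-s) / ...
        (2*((s-r)*(ft-fs)-(fs-fr)*(t-s)));   % vertex of the parabola
    t = s; ft = fs;                          % shift: discard oldest point
    s = r; fs = fr;
    r = x; fr = f(x);
end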
