Time Discretization (It's about time…)

Sauro Succi

Post on 29-Jan-2016



Time Evolution

Initial and Boundary Conditions

Time Marching: Formal

Time Marching: Matrix Exponential

If L is banded with bandwidth b, then L*L is banded 2b+1, L*L*L is banded 3b+2, …: the exponential fills the whole matrix. IMPRACTICAL
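The band growth that makes the matrix exponential impractical can be checked directly; a minimal numpy sketch (the `bandwidth` helper is illustrative, not from the lecture, and measures the half-bandwidth):

```python
import numpy as np

def bandwidth(M, tol=1e-12):
    """Half-bandwidth: largest |i - j| with a nonzero entry."""
    idx = np.nonzero(np.abs(M) > tol)
    return int(np.max(np.abs(idx[0] - idx[1])))

N = 12
# Tridiagonal L (a 1D diffusion-type stencil): half-bandwidth 1
L = (np.diag(-2.0 * np.ones(N)) +
     np.diag(np.ones(N - 1), 1) +
     np.diag(np.ones(N - 1), -1))

print(bandwidth(L))          # 1
print(bandwidth(L @ L))      # 2: each power widens the band
print(bandwidth(L @ L @ L))  # 3: the exponential sums all powers -> dense
```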

Numerical Marching

On the lattice x_j=j*d:

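On the lattice, a spatial operator such as d^2/dx^2 turns into a finite-difference stencil; a minimal sketch (centered second difference, assumed as the example operator):

```python
import numpy as np

d = 0.1                      # lattice spacing
x = np.arange(10) * d        # x_j = j*d
f = np.sin(x)

# Centered second difference: f'' ~ (f[j+1] - 2 f[j] + f[j-1]) / d^2
fpp = (f[2:] - 2.0 * f[1:-1] + f[:-2]) / d**2

# Compare with the exact second derivative of sin, which is -sin
err = np.max(np.abs(fpp + np.sin(x[1:-1])))
print(err)                   # O(d^2) discretization error
```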

Space-Time Connectivity

Numerical Marching

Explicit

Crank-Nicolson

Fully Implicit

Unconditionally stable --> large dt

Non-local, matrix algebra, expensive

Time marching: Explicit/Implicit methods

Explicit: conditionally stable (|1 + L*dt| < 1) --> small dt

Matrix-free (local in spacetime): simple and fast
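The explicit/implicit stability trade-off can be demonstrated numerically; a minimal sketch for 1D diffusion (the `amplification` helper is illustrative, not from the lecture):

```python
import numpy as np

N, d = 32, 1.0
# 1D diffusion operator: second-difference stencil, Dirichlet boundaries
L = (np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), 1)
     + np.diag(np.ones(N - 1), -1)) / d**2

def amplification(dt, implicit):
    """Spectral radius of the one-step update matrix."""
    I = np.eye(N)
    G = np.linalg.inv(I - dt * L) if implicit else I + dt * L
    return np.max(np.abs(np.linalg.eigvals(G)))

# Explicit Euler: stable only for dt <= d^2/2 (here 0.5)
print(amplification(0.4, implicit=False) <= 1.0)  # True
print(amplification(0.6, implicit=False) <= 1.0)  # False
# Fully implicit: unconditionally stable, even for large dt
print(amplification(10.0, implicit=True) <= 1.0)  # True
```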

Euler fwd vs. Crank-Nicolson

[Stencil diagrams: rows of grid points X----------X----------X at successive time levels t, t+dt, t+2dt; Euler fwd connects level t+dt only to level t, Crank-Nicolson couples both levels]

Implicit=Non-local

3 spatial connections

5 spatial connections

2p+1 spatial connections

Implicit Time-Marching

Matrix problem


Numerical Marching

Explicit

Crank-Nicolson

Forward Euler

Stability (for stable physics)

Crank-Nicolson

Trapezoidal rule
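The trapezoidal (Crank-Nicolson) update can be sketched on a scalar test problem dx/dt = lam*x; a minimal illustration (values chosen here, not from the lecture):

```python
import numpy as np

# Crank-Nicolson (trapezoidal rule) for dx/dt = L x:
# (1 - dt/2 * lam) x_{n+1} = (1 + dt/2 * lam) x_n
lam = -2.0                   # scalar test problem
dt, steps = 0.1, 50
x = 1.0
for _ in range(steps):
    x = (1 + 0.5 * dt * lam) / (1 - 0.5 * dt * lam) * x

exact = np.exp(lam * dt * steps)
print(abs(x - exact))        # small: the trapezoidal rule is second-order accurate
```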

Matrix Algebra

Direct Methods

Iterative Methods

Algebraic Methods

Matrix Inversion

Direct Methods

1. Forward sweep: y

2. Backward sweep: x

Gaussian elimination

Zeroes inside the band are included (fill-in): sparsity is lost!
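For tridiagonal systems, the forward/backward sweeps reduce to the Thomas algorithm; a minimal sketch (the `thomas` helper is illustrative, not from the lecture):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal.
    1. Forward sweep eliminates the sub-diagonal; 2. backward sweep recovers x."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # 1. forward sweep
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # 2. backward sweep
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Check against a dense solve
n = 6
a = np.r_[0.0, np.ones(n - 1)]
b = np.full(n, -2.0)
c = np.r_[np.ones(n - 1), 0.0]
d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d)))  # True
```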

Multiple Dimensions

Memory is 1d: Addressing

Sequential index:

Physical vs Logical indexing

Bandwidth minimization (d>1)
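The physical-to-logical mapping for a 2D grid in 1D memory can be sketched as follows (row-major convention assumed for illustration):

```python
Nx, Ny = 4, 3

# Physical (i, j) on a 2D grid -> logical 1D memory address k
def to_k(i, j):
    return i + Nx * j          # row-major: x runs fastest

def to_ij(k):
    return k % Nx, k // Nx     # inverse mapping

assert to_k(2, 1) == 6
assert to_ij(6) == (2, 1)
# A 5-point stencil couples k to k +/- 1 and k +/- Nx:
# the matrix bandwidth is Nx, hence the d>1 bandwidth-minimization problem.
```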

Splitting

Relaxation

Gradient

Conjugate Gradient

Iterative: Methods

No matrix inversion: only matrix*vector products

Iterative: Jacobi splitting

Stopping criteria:
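The Jacobi splitting A = D + R, iterated until a stopping criterion is met, can be sketched as follows (the `jacobi` helper and test matrix are illustrative, not from the lecture):

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=10_000):
    """Jacobi splitting A = D + R: iterate x <- D^{-1} (b - R x)."""
    D = np.diag(A)
    R = A - np.diag(D)
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x) < tol:   # stopping criterion
            return x_new
        x = x_new
    return x

# Diagonally dominant test matrix -> Jacobi converges
A = np.array([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
print(np.allclose(jacobi(A, b), np.linalg.solve(A, b)))  # True
```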

Gauss-Seidel: aggressive Jacobi

Convergence criteria:
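The "aggressive Jacobi" idea is that each updated component is reused immediately within the same sweep; a minimal sketch (illustrative helper, not from the lecture):

```python
import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=10_000):
    """Gauss-Seidel: like Jacobi, but each updated component is used
    immediately within the same sweep ("aggressive Jacobi")."""
    n = len(b)
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old) < tol:    # convergence criterion
            return x
    return x

A = np.array([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
print(np.allclose(gauss_seidel(A, b), np.linalg.solve(A, b)))  # True
```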

Jacobi: does it converge?

Diagonally dominant:

Good matrix condition:

Optimum condition = 1; bad condition >> 1
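The condition number can be checked directly with numpy; a minimal illustration (example matrices chosen here, not from the lecture):

```python
import numpy as np

# kappa = 1 is optimal; kappa >> 1 means slow iterative convergence
well = np.eye(3)                              # perfectly conditioned
ill = np.array([[1.0, 0.0], [0.0, 1.0e-8]])   # nearly singular

print(np.linalg.cond(well))   # 1.0 (optimum)
print(np.linalg.cond(ill))    # ~1e8 (bad)
```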

Alternating Direction Implicit

Directional Splitting

A sequence of Ny one-dimensional problems of size Nx:
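The directional-splitting idea can be sketched for the implicit x half-step of 2D diffusion: one tridiagonal solve per row. This is only a sketch of that half-step (grid sizes, Dirichlet boundaries, and a dense solve in place of a tridiagonal one are assumptions for brevity; the y half-step is the same with rows and columns exchanged):

```python
import numpy as np

np.random.seed(0)
Nx, Ny, d, dt = 8, 5, 1.0, 0.1
u = np.random.rand(Ny, Nx)     # field on the 2D grid

# (I - dt/2 * Lx) applied along x: Lx is the 1D second-difference operator
Lx = (np.diag(-2.0 * np.ones(Nx)) + np.diag(np.ones(Nx - 1), 1)
      + np.diag(np.ones(Nx - 1), -1)) / d**2
A = np.eye(Nx) - 0.5 * dt * Lx

u_half = np.empty_like(u)
for j in range(Ny):            # Ny one-dimensional problems of size Nx
    u_half[j] = np.linalg.solve(A, u[j])
print(u_half.shape)            # (5, 8): same grid, x-direction now implicit
```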

Over/Under relaxation methods

Accelerated Relaxation

Conservative, oscillating convergence

Aggressive, monotonic convergence
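Over/under relaxation blends the Gauss-Seidel update with the old value through a parameter omega; a minimal SOR sketch (helper and test matrix are illustrative, not from the lecture):

```python
import numpy as np

def sor(A, b, omega, tol=1e-10, max_iter=10_000):
    """Successive over-relaxation: omega < 1 under-relaxes (conservative),
    omega > 1 over-relaxes (aggressive), omega = 1 is plain Gauss-Seidel."""
    n = len(b)
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old) < tol:
            return x
    return x

A = np.array([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
print(np.allclose(sor(A, b, omega=1.2), np.linalg.solve(A, b)))  # True
```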

Gradient methods

Relaxation methods

Fictitious time

Iteration loop

Convergence criterion:

Usually very slow!

Steepest descent: gradient

Residual = gradient of the energy

r is the gradient of E = (1/2)<x,Ax> - <b,x>

Steepest Descent

Convergence guaranteed only for SPD matrices [(x,Ax)>0]

Long-term slow because r-->0 as Ax-->b

Round-off sensitive, needle-sensitive
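Steepest descent follows the residual (minus the gradient of E) with an exact line search; a minimal sketch (helper and SPD test matrix are illustrative, not from the lecture):

```python
import numpy as np

def steepest_descent(A, b, tol=1e-10, max_iter=10_000):
    """Minimize E = 1/2 <x,Ax> - <b,x> for SPD A.
    The residual r = b - A x is minus the gradient of E."""
    x = np.zeros_like(b)
    for _ in range(max_iter):
        r = b - A @ x
        if np.linalg.norm(r) < tol:            # r -> 0 as A x -> b
            break
        alpha = (r @ r) / (r @ (A @ r))        # exact line search along r
        x = x + alpha * r
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])         # SPD
b = np.array([1.0, 1.0])
print(np.allclose(steepest_descent(A, b), np.linalg.solve(A, b)))  # True
```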

Conjugate Gradient

The steepest descent is not the shortest path to the minimum

g=r= Ax-b; <g,t>=0

c defined by <c,At> = 0

One shot on the local minimum at each plane x_k,x_(k+1)

[Figure: search directions t (tangent), g (gradient), c (conjugate) at the iterates x_k, x_(k+1), converging to x]

Conjugate Gradient

The steepest descent is not the shortest path to the minimum!

0. x0; p0 = r0 = b - A*x0

1. a0 = <r0,r0> / <p0,A*p0>

2. x1 = x0 + a0*p0; r1 = r0 - a0*A*p0; converged? Out

3. b0 = <r1,r1> / <r0,r0>; p1 = r1 + b0*p0: the next search direction lies between r1 and the old p0, chosen so that <p1,A*p0> = 0

4. Go to 1 with p1


{p0, A*p0, A^2*p0, …, A^N*p0} is a Krylov sequence: N-step convergence for SPD matrices
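The algorithm above can be sketched compactly in numpy (helper and random SPD test matrix are illustrative, not from the lecture):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """CG for SPD A: successive search directions are A-conjugate, so in
    exact arithmetic it converges in at most N steps (Krylov property)."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    for _ in range(len(b)):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p                   # next A-conjugate direction
        r = r_new
    return x

np.random.seed(0)
N = 8
M = np.random.rand(N, N)
A = M @ M.T + N * np.eye(N)                    # SPD test matrix
b = np.random.rand(N)
print(np.allclose(conjugate_gradient(A, b), np.linalg.solve(A, b)))  # True
```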

End of Lecture

Time Marching: Spectral

Hermitian Stable/Unstable

Example: diffusion

Operator Splitting (useful in d>1)

Non-commuting propagators

Trotter splitting
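For non-commuting propagators, e^{A+B} differs from e^A e^B, but the Trotter product (e^{A/n} e^{B/n})^n recovers it as n grows; a minimal sketch with 2x2 symmetric generators (the `expm_sym` helper and the choice of A, B are illustrative, not from the lecture):

```python
import numpy as np

def expm_sym(M):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.exp(w)) @ V.T

# Two non-commuting symmetric generators
A = np.array([[0.0, 1.0], [1.0, 0.0]])
B = np.array([[1.0, 0.0], [0.0, -1.0]])

exact = expm_sym(A + B)
n = 1000
step = expm_sym(A / n) @ expm_sym(B / n)       # one Trotter factor
trotter = np.linalg.matrix_power(step, n)

# e^{A+B} = lim (e^{A/n} e^{B/n})^n; the splitting error is O(1/n)
print(np.max(np.abs(trotter - exact)))          # small
```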
