
Page 1:

Extremal Problems of Information Combining

Alexei Ashikhmin

Information Combining: formulation of the problem

Mutual Information Function for the Single Parity Check Codes

More Extremal Problems of Information Combining

Solutions (with the help of Tchebysheff Systems) for the Single Parity Check Codes

Joint work with Yibo Jiang, Ralf Koetter, Andrew Singer

Page 2:

[Block diagram: Encoder -> three parallel Channels -> APP Decoder]

Information Transmission

Density function of the channel is not known

We only know the mutual information of the channel

Page 3:

Optimization Problem

We assume that the channel has binary input and that it is symmetric

Problem 1

Among all channel probability distributions with a given mutual information, determine the distribution that maximizes (minimizes) the mutual information at the output of the optimal decoder

Page 4:

[Diagram of the iterative decoder: input from the channel feeds variable node processing; an interleaver connects variable node processing and check node processing; messages pass from the variable nodes to the decoder of the single parity check code and back to the variable nodes]

Page 5:

This Problem Is Already Solved

1. I. Land, P. Hoeher, S. Huettinger, J. Huber, 2003

2. I. Sutskover, S. Shamai, J. Ziv, 2003

Page 6:


Repetition code: The Binary Erasure Channel (BEC) is the best

The Binary Symmetric Channel (BSC) is the worst

Single Parity Check Code (the dual of the repetition code):

The Binary Erasure Channel (BEC) is the worst

The Binary Symmetric Channel (BSC) is the best
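These dual statements can be verified numerically. The sketch below is my addition (not from the slides): it fixes the mutual information of a single observation at 0.5, constructs a BSC and a BEC with exactly that mutual information, and compares two-observation combining at a repetition (variable) node and at a single parity check (check) node.

```python
import math

def h2(p):
    # binary entropy in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_p_for_capacity(I, tol=1e-12):
    # crossover p in [0, 0.5] with 1 - h2(p) = I, by bisection
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if 1 - h2(mid) > I:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

I = 0.5
p = bsc_p_for_capacity(I)   # BSC with mutual information I
eps = 1 - I                 # BEC with mutual information I

# Repetition (variable node), two observations:
# BEC: erased only if both copies are erased -> 1 - eps^2
I_rep_bec = 1 - eps ** 2
# BSC: I(X; Y1, Y2) = H(Y1, Y2) - 2*h2(p) with uniform input
q = 1 - p
pa = (p * p + q * q) / 2    # P(y1 = y2 = 0) = P(y1 = y2 = 1)
pb = p * q                  # P(y1 != y2), each of the two patterns
H_joint = -2 * pa * math.log2(pa) - 2 * pb * math.log2(pb)
I_rep_bsc = H_joint - 2 * h2(p)

# Single parity check (check node), extrinsic from two observations:
# BEC: known only if both are known -> (1 - eps)^2
I_spc_bec = (1 - eps) ** 2
# BSC: the two error events XOR -> effective crossover 2p(1-p)
I_spc_bsc = 1 - h2(2 * p * (1 - p))

assert I_rep_bec > I_rep_bsc   # BEC is best at the repetition node
assert I_spc_bsc > I_spc_bec   # BSC is best at the check node
```

At the repetition node the BEC wins and at the check node the BSC wins, matching the slide.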

Page 7:

Our Goals

We would like to solve the optimization problem for the Single Parity Check Codes directly (without using duality)

Get some improvements

Page 8:

Soft Bits

We call T = tanh(L/2), where L is the channel log-likelihood ratio, the soft bit; it takes values in [-1, 1], and for a symmetric channel its distribution is determined by the distribution of |T| on [0, 1]

Page 9:


Page 10:

Binary symmetric channel with crossover p: the soft bit takes the values +-(1 - 2p)

Gaussian channel: T = tanh(L/2), where L ~ N(2/sigma^2, 4/sigma^2) given the transmitted bit

Page 11:

[Block diagram: Encoder of the single parity check code -> parallel channels -> Decoder of the single parity check code]

Results of E. Sharon, A. Ashikhmin, S. Litsyn:

Page 12:

Properties of the moments

Lemma

1. The moment sequence m_k = E[T^{2k}] is nonnegative and nonincreasing

2. The ratio sequence m_k / m_{k+1} is nonincreasing

Lemma

In the Binary Erasure Channel all moments are the same
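Concretely (an added illustration): the BEC soft bit equals 0 on an erasure and 1 otherwise, so every moment is 1 - eps, while the BSC soft bit has the constant magnitude 1 - 2p:

```python
# Soft-bit moments of the two extremal channels, conditioned on the
# transmitted bit: BEC -> T in {0, 1}; BSC -> T in {-(1-2p), 1-2p}.
eps, p = 0.3, 0.11
t = 1 - 2 * p

bec_moments = [(1 - eps) * 1 ** k + eps * 0 ** k for k in range(1, 7)]
bsc_moments = [(1 - p) * t ** k + p * (-t) ** k for k in range(1, 7)]

# BEC: all moments coincide (and equal the mutual information 1 - eps)
assert all(abs(m - (1 - eps)) < 1e-12 for m in bec_moments)
# BSC: E[T^(2j-1)] = E[T^(2j)] = (1 - 2p)^(2j)
assert all(abs(bsc_moments[2 * j - 2] - bsc_moments[2 * j - 1]) < 1e-12
           for j in range(1, 4))
```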

Page 13:

Problem 2

Among all T-consistent probability distributions on [0,1]

such that the mutual information is fixed,

determine the probability distribution that maximizes

(minimizes) the second moment

Page 14:

Solution of Problem 2

Theorem

Among all binary-input symmetric-output channel distributions with a fixed mutual information, the Binary Symmetric Channel maximizes, and the Binary Erasure Channel minimizes, the second moment of the soft bit

Proof: We use the theory of Tchebysheff Systems
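A numerical sanity check of the theorem (my sketch; the BIAWGN test channel and the quadrature grid are my choices, not from the slides): at mutual information 0.5, the second moment of the soft bit should be smallest for the BEC, largest for the BSC, with the Gaussian channel strictly in between.

```python
import numpy as np

def biawgn_stats(sigma, npts=200001):
    # BIAWGN LLR given x = +1: L ~ N(2/sigma^2, 4/sigma^2).
    # Returns (mutual information, E[tanh^2(L/2)]) by quadrature
    # on a wide uniform grid.
    mu, sd = 2 / sigma ** 2, 2 / sigma
    l = np.linspace(mu - 12 * sd, mu + 12 * sd, npts)
    dl = l[1] - l[0]
    pdf = np.exp(-((l - mu) / sd) ** 2 / 2) / (sd * np.sqrt(2 * np.pi))
    I = 1 - float(np.sum(pdf * np.logaddexp(0.0, -l)) * dl / np.log(2))
    m2 = float(np.sum(pdf * np.tanh(l / 2) ** 2) * dl)
    return I, m2

def h2(p):
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

target = 0.5

# sigma with I(sigma) = target (I decreases as sigma grows)
lo, hi = 0.2, 5.0
for _ in range(60):
    mid = (lo + hi) / 2
    if biawgn_stats(mid)[0] > target:
        lo = mid
    else:
        hi = mid
sigma = (lo + hi) / 2
m2_awgn = biawgn_stats(sigma)[1]

# BSC with the same mutual information: solve 1 - h2(p) = target
plo, phi = 1e-9, 0.5
for _ in range(100):
    pm = (plo + phi) / 2
    if 1 - h2(pm) > target:
        plo = pm
    else:
        phi = pm
p = (plo + phi) / 2
m2_bsc = (1 - 2 * p) ** 2   # BSC soft-bit magnitude is the constant 1 - 2p
m2_bec = target             # BEC: E[T^2] = 1 - eps = I

assert m2_bec < m2_awgn < m2_bsc   # BEC minimum, BSC maximum
```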

Page 15:

Lemma

The Binary Symmetric, the Binary Erasure, and an arbitrary channel with the same mutual information have the following layout of their soft-bit distribution functions:

Page 16:

Lemma

Let and

1)

2) if for and for

then

Page 17:

This is exactly our case: the functions in our problem satisfy the conditions of the previous lemma

Page 18:

Problem 1, on the extremum of the mutual information, and Problem 2, on the extremum of the second moment, are equivalent

Page 19:

Extrema of MMSE

It is known that the channel soft bit is the MMSE estimator for the channel input

Theorem. Among all binary-input symmetric-output channels with fixed mutual information, the Binary Symmetric Channel has the minimum MMSE and the Binary Erasure Channel has the maximum MMSE
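The reason the soft bit enters here (an added one-line derivation): for X in {-1, +1}, the soft bit T = tanh(L/2) equals E[X | Y], so by the tower property

```latex
\mathrm{MMSE}
  = \mathbb{E}\!\left[(X - \mathbb{E}[X \mid Y])^2\right]
  = \mathbb{E}[X^2] - \mathbb{E}\!\left[\mathbb{E}[X \mid Y]^2\right]
  = 1 - \mathbb{E}[T^2]
```

Hence a channel maximizing E[T^2] minimizes the MMSE and vice versa, which is why the second-moment theorem transfers directly to the MMSE.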


Page 20:

How good the bounds are

Page 21:

Problem 3

1)

2)

Among all T-consistent channels, find the one that maximizes (minimizes) the mutual information at the output of the decoder

[Block diagram: Encoder of the single parity check code -> parallel channels -> Decoder of the single parity check code]

Page 22:

Problem 4

Among all T-consistent probability distributions on [0,1]

such that

1)

2)

determine the probability distribution that maximizes

(minimizes) the fourth moment

Page 23:

Theorem The distribution with mass at , mass at

and mass at 0 maximizes

The distribution with mass at , mass at

and mass at 1 minimizes

Page 24:

Extremum densities

[Plots of the maximizing and the minimizing extremum densities]

Page 25:

Lemma

The channels with the minimum and the maximum fourth moment and an arbitrary channel with the same mutual information have the following layout of their distributions:

Page 26:

Problem 3, on the extremum of the mutual information, and Problem 4, on the extremum of the fourth moment, are equivalent

Page 27:

Assume that

and is the same as in AWGN channel with this

Page 28:

Tchebysheff Systems

Definition

A set of real continuous functions u_0, u_1, ..., u_n is called a Tchebysheff system (T-system) if for any real coefficients, not all zero, the linear combination a_0 u_0 + a_1 u_1 + ... + a_n u_n has at most n distinct roots on the interval
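The canonical example (added here for concreteness): the monomials 1, t, ..., t^n form a T-system on [0, 1], since a nontrivial linear combination is a polynomial of degree at most n and therefore has at most n distinct roots. A randomized spot check:

```python
import numpy as np

def max_distinct_roots_in_01(n, trials=500, seed=1):
    # Largest number of distinct roots in [0, 1] observed over random
    # nontrivial combinations a_0 + a_1 t + ... + a_n t^n.
    rng = np.random.default_rng(seed)
    worst = 0
    for _ in range(trials):
        coeffs = rng.standard_normal(n + 1)   # highest-degree first
        roots = np.roots(coeffs)
        real = roots[np.abs(roots.imag) < 1e-8].real
        in01 = np.unique(np.round(real[(real >= 0) & (real <= 1)], 10))
        worst = max(worst, len(in01))
    return worst

# T-system property: never more than n distinct roots
assert max_distinct_roots_in_01(4) <= 4
```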

Definition

A distribution is a nondecreasing, right-continuous function

The moment space, defined by

( is the set of valid distributions), is a closed convex cone.

For define

Page 29:

Problem For a given find

that maximizes (minimizes)

Page 30:

Theorem

If and are T-systems,

and then the extrema are attained uniquely with

distributions with finitely many mass points:

Lower principal representation

Upper principal representation

Page 31:

Soft Bits

We call T = tanh(L/2), where L is the channel log-likelihood ratio, the soft bit; it takes values in [-1, 1], and for a symmetric channel its distribution is determined by the distribution of |T| on [0, 1]

Lemma (Sharon, Ashikhmin, Litsyn)

If the channel is binary-input symmetric-output, then E[T^{2j-1}] = E[T^{2j}] for all j >= 1

Random variables with this property are called T-consistent
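A numerical check of T-consistency (my addition; it assumes the standard form of the property for binary-input symmetric-output channels, E[T^{2j-1}] = E[T^{2j}]), evaluated for the BIAWGN soft bit by quadrature:

```python
import numpy as np

def soft_bit_moment(k, sigma=1.0, npts=400001):
    # k-th moment of T = tanh(L/2) for a BIAWGN channel given x = +1:
    # L ~ N(2/sigma^2, 4/sigma^2); plain Riemann-sum quadrature.
    mu, sd = 2 / sigma ** 2, 2 / sigma
    l = np.linspace(mu - 12 * sd, mu + 12 * sd, npts)
    dl = l[1] - l[0]
    pdf = np.exp(-((l - mu) / sd) ** 2 / 2) / (sd * np.sqrt(2 * np.pi))
    return float(np.sum(pdf * np.tanh(l / 2) ** k) * dl)

# Consistency of a symmetric channel: odd and even moments pair up
for j in (1, 2, 3):
    odd, even = soft_bit_moment(2 * j - 1), soft_bit_moment(2 * j)
    assert abs(odd - even) < 1e-6
```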

Page 32:

Find extrema of

Under the constraints

Page 33:

Theorem

Systems and are T-systems on [0,1].

---------------------------------------------------------------------------------

The distribution that maximizes

has only one mass point at :

has probability mass at

and at

This is exactly the Binary Symmetric Channel