Information theory Multi-user information theory A.J. Han Vinck Essen, 2004


Page 1: Information theory Multi-user information theory A.J. Han Vinck Essen, 2004

Information theory

Multi-user information theory

A.J. Han Vinck, Essen, 2004

Page 2

Content

- Some examples of channels
- Additive coding for broadcasting
- Superposition coding for multi-access
- Coding for the two-way channel
- Coding for the switching channel
- Some more

Page 3

Goal of the lectures:

Introduction of some classical models: two-way, two-access, broadcast

Connected problems: formulation and calculation of capacity; development of coding strategies

Page 4

Time Sharing (TDMA)

[Figure: users 1, 2 and 3 take turns on the common channel; each time slot carries one user's message while the other users are idle]

Time sharing: easy to organize
- inefficient if not many users are active
- efficiency depends on the channel

Page 5

Two-way

[Figure: two terminals; terminal 1 sends X1 and observes Y1, terminal 2 sends X2 and observes Y2]

Terminals 1 and 2 communicate by observing Y1 and Y2:

R1 = I(X1;Y2|X2)
R2 = I(X2;Y1|X1)

Maximize (R1,R2) over any input distribution P(X1,X2), where

I(X1;Y2|X2) := H(X1|X2) - H(X1|X2,Y2)

Page 6

Note:

I(X1;Y2|X2) := H(X1|X2) - H(X1|X2,Y2)

H(X1|X2)    = minimum average # bits needed to specify X1 given X2
H(X1|X2,Y2) = minimum average # bits needed to specify X1 given X2 and the observation Y2

Difference = what we learned from the transmission over the channel
           = the reduction in average specification length of X1 given X2
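This reduction can be checked numerically. The sketch below (the helper `cond_entropy` is hypothetical, not from the lectures) computes I(X1;Y|X2) = H(X1|X2) - H(X1|X2,Y) for the AND channel treated next, with independent uniform inputs:

```python
from collections import defaultdict
from math import log2

def cond_entropy(joint, target, given):
    # H(target | given) from a joint pmf {outcome_tuple: prob};
    # `target` and `given` are index tuples into the outcome tuples.
    marg = defaultdict(float)   # p(given)
    pair = defaultdict(float)   # p(target, given)
    for outcome, p in joint.items():
        g = tuple(outcome[i] for i in given)
        t = tuple(outcome[i] for i in target)
        marg[g] += p
        pair[(t, g)] += p
    return -sum(p * log2(p / marg[g]) for (t, g), p in pair.items() if p > 0)

# Outcomes are (x1, x2, y) triples for Y = X1 AND X2, uniform independent inputs.
joint = {(x1, x2, x1 & x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

# I(X1; Y | X2) = H(X1 | X2) - H(X1 | X2, Y)
i_cond = cond_entropy(joint, target=(0,), given=(1,)) - \
         cond_entropy(joint, target=(0,), given=(1, 2))
print(i_cond)  # 0.5: only when X2 = 1 (probability ½) does Y reveal X1
```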

Page 7

Example: AND channel

Both users observe the same output Y = X1 AND X2:

        X1
         0   1
X2  0    0   0
    1    0   1

When user 1 sends X1 = 0, then Y = 0 whatever X2 is: he does not learn X2.
When he sends X1 = 1, then Y = X2: he knows X2. The same holds for user 2.

Page 8

A coding example

Each user encodes his bit into a codeword from {01, 10}. The table gives the output Y = X1 AND X2 (symbol by symbol):

          X1 = 01   X1 = 10
X2 = 01      01        00
X2 = 10      00        1

If y = 0, transmit the inverse (red); if the first output is y = 1, both inputs are known and only one symbol is needed.

Rate: 1/(2·3/4 + 1·¼) = 4/7 = 0.57 > ½ !!

Page 9

Another coding example

Each user now encodes one of three messages into a length-3 codeword:

X1 ∈ { 011, 010, 000 }
X2 ∈ { 010, 101, 100 }

Rate: log2(3)/(2·3/9 + 3·6/9) = 0.59 > ½ !!

(for 3 of the 9 codeword pairs the inputs are resolved after 2 symbols, for the other 6 after 3)

Page 10

Dependent inputs X1 and X2

P(X1=0, X2=0) = 0
P(X1=0, X2=1) = P(X1=1, X2=0) = p
P(X1=1, X2=1) = 1-2p

         X2=0   X2=1
X1=0      0      p
X1=1      p     1-2p

Then, P(X1=1) = P(X1=1, X2=0) + P(X1=1, X2=1) = 1-p.

R2 = R1 = I(X2;Y|X1) = I(X1;Y|X2) = H(Y|X1) = (1-p)·h(p/(1-p))

The maximum = 0.694

Page 11

Note:

P(Y=0|X1=1) = P(Y=0, X1=1)/P(X1=1) = p/(1-p)

P(Y=1|X1=1) = P(Y=1, X1=1)/P(X1=1) = (1-2p)/(1-p)
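The quoted maximum 0.694 can be reproduced by a brute-force search over p (a Python sketch, not part of the slides):

```python
from math import log2

def h(x):
    # binary entropy function h(x) = -x log2(x) - (1-x) log2(1-x)
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

# R(p) = H(Y|X1) = (1-p) * h(p/(1-p)) for the dependent-input distribution above
best = max((1 - p) * h(p / (1 - p)) for p in [i / 100000 for i in range(1, 50000)])
print(round(best, 3))  # 0.694, attained near p ≈ 0.276
```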

Page 12

A lower bound

Let X1 and X2 transmit independently with

P(X1 = 1) = 1 - P(X1 = 0) = a
P(X2 = 1) = 1 - P(X2 = 0) = a

Then: R1 = I(X1;Y|X2) = H(Y|X2) - H(Y|X1,X2) = a·h(a) = R2

(the channel is deterministic, so H(Y|X1,X2) = 0; given X2 = 1, Y = X1, so H(Y|X2) = a·h(a))

The maximum = 0.616 > 4/7
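Again a quick numerical check of the maximum of a·h(a) (a sketch; the slide quotes 0.616, the search gives ≈ 0.617 before rounding down):

```python
from math import log2

def h(x):
    # binary entropy function
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

# independent inputs: R1 = R2 = a * h(a); find the maximizing a numerically
grid = [i / 100000 for i in range(1, 100000)]
best_a = max(grid, key=lambda a: a * h(a))
print(round(best_a * h(best_a), 3), round(best_a, 2))  # 0.617 0.7
```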

Page 13

The upper (outer) bound

The inner bound (for independent transmission) is < the Shannon outer bound (X1 and X2 dependent).

For R1 = R2: inner rate = 0.616, outer rate = 0.694.

The exact capacity is unknown!

Page 14

Bounds

[Figure: the inner bound (independent inputs) lies inside the outer bound in the (R1, R2) plane; both curves run between the points (0,1) and (1,0)]

Page 15

Broadcast

[Figure: sender Z with receivers X and Y]

Z transmits information to X and the same information to Y:

R1 ≤ I(X;Z)
R2 ≤ I(Y;Z)
R1 + R2 ≤ I(Z; (X,Y)) = I(Z;X)

Page 16

Broadcast

[Figure: sender Z with receivers X and Y]

Z transmits information to X and different information to Y:

R1 ≤ I(X;Z)
R2 ≤ I(Y;Z)
R1 + R2 ≤ I(Z; (X,Y)) = I(Z;X) + I(Z;Y|X)

Page 17

Example: Blackwell broadcast channel

Z   X   Y
0   0   0
1   0   1
2   1   1

R1 ≤ I(X;Z) = H(X) - H(X|Z)
R2 ≤ I(Y;Z) = H(Y) - H(Y|Z)
R1 + R2 ≤ I(Z; (X,Y)) ≤ log2(3)

Page 18

Example

Z is transmitted in pairs. Rows: the pair seen by X; columns: the class seen by Y; entries: the transmitted pair Z1Z2.

          Y = 00/11   Y = 01/10
X = 00       00          10
X = 01       12          02
X = 10       21          20

I(Y;Z) = 1
I(X;Z) = log2(3)
Rsum = (1 + log2(3))/2 = 1.29 bit/tr.
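A small script can confirm that the six transmitted Z-pairs are distinguishable as claimed: receiver X separates three values per pair and receiver Y two classes (sketch, assuming the Blackwell mapping above):

```python
# Blackwell mapping: Z -> (X, Y)
bw = {0: (0, 0), 1: (0, 1), 2: (1, 1)}

# the six Z-pairs used in the code table
z_pairs = [(0, 0), (1, 0), (1, 2), (0, 2), (2, 1), (2, 0)]

x_views = [(bw[a][0], bw[b][0]) for a, b in z_pairs]
y_views = [(bw[a][1], bw[b][1]) for a, b in z_pairs]
# Y only distinguishes the classes {00,11} (parity 0) vs {01,10} (parity 1)
y_classes = [(ya + yb) % 2 for ya, yb in y_views]

print(len(set(x_views)))                    # 3 distinct X-pairs -> log2(3) bits for X
print(len(set(y_classes)))                  # 2 classes -> 1 bit for Y
print(len(set(zip(x_views, y_classes))))    # 6: the pair (X-view, Y-class) is unique
```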

Page 19

Two-access channel

[Figure: senders X1 and X2 share one channel to receiver Y]

X1 and X2 want to communicate with Y at the same time!

Obvious bound on the sum rate:

R1 + R2 ≤ H(Y) - H(Y|X1,X2) ≤ H(Y)

Page 20

Two-access models

Switching:          Two-adder:
x1  x2   y          x1  x2   y
0   0    ∅          0   0    0
0   1    ∅          0   1    1
1   0    0          1   0    1
1   1    1          1   1    2

Page 21

Two-adder (1)

Output table Y = X1 + X2:

        X1=0  X1=1
X2=0     0     1
X2=1     1     2

Capacity region:

R1 ≤ 1 from X1 to Y
R2 ≤ 1 from X2 to Y
R1 + R2 ≤ H(Y) ≤ 1.5

[Figure: the capacity region in the (R1, R2) plane is the pentagon with corner points (1, ½) and (½, 1); timesharing achieves only the line between (1,0) and (0,1)]
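The value H(Y) = 1.5 follows from independent uniform inputs, where P(Y) = (¼, ½, ¼); as a quick check (sketch):

```python
from itertools import product
from math import log2

# Y = X1 + X2 with independent uniform binary inputs
p = {}
for x1, x2 in product((0, 1), repeat=2):
    y = x1 + x2
    p[y] = p.get(y, 0) + 0.25

H = -sum(q * log2(q) for q in p.values())
print(H)  # 1.5, since P(Y) = (1/4, 1/2, 1/4)
```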

Page 22

Two-adder (2)

Coding strategy (ε-error):

User 1: transmit at rate R1 = 1 bit, i.e. P(X1 = 0) = ½

User 2: sees an erasure channel. With X1 uniform, X2 = 0 gives Y ∈ {0,1} and X2 = 1 gives Y ∈ {1,2}, each with probability ½; the output y = 1 is an erasure.

Max H(X2) - H(X2|Y) = ½

Hence: rate pair (R1, R2) = (1, ½)
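User 2's rate of ½ is the capacity of a binary erasure channel with erasure probability ½; a numerical check with X1 and X2 uniform (sketch):

```python
from math import log2

# User 2's channel when X1 is uniform: Y = X1 + X2.
# X2 = 0 -> Y in {0,1}; X2 = 1 -> Y in {1,2}; y = 1 is the erasure (prob 1/2).
p_y = {0: 0.25, 1: 0.5, 2: 0.25}

# I(X2;Y) = H(Y) - H(Y|X2); given X2, Y is a fair coin, so H(Y|X2) = 1
H_y = -sum(p * log2(p) for p in p_y.values())
print(H_y - 1.0)  # 0.5 = capacity of a BEC with erasure probability 1/2
```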

Page 23

Two-adder (3)

A simple asymmetric strategy: 0-error. User 1 sends one of 7 length-3 words, user 2 one of 2; the table entries are the outputs Y:

X2 \ X1   000  001  010  011  100  101  110
   000    000  001  010  011  100  101  110
   111    111  112  121  122  211  212  221

All 14 outputs are distinct, so both messages are decoded without error.

Efficiency = (1/3)·log2(14) = 1.27
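That the 14 outputs of this 0-error strategy are indeed distinct can be verified by enumeration (sketch):

```python
from itertools import product

code1 = ["000", "001", "010", "011", "100", "101", "110"]  # 7 words for user 1
code2 = ["000", "111"]                                     # 2 words for user 2

def add(a, b):
    # symbol-wise adder output
    return tuple(int(x) + int(y) for x, y in zip(a, b))

outputs = {add(a, b) for a, b in product(code1, code2)}
print(len(outputs))  # 14: every (message1, message2) pair gives a unique output
```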

Page 24

Two-adder with feedback (1)

Question: can we enlarge the capacity region?

Yes (Wolf)! Is this a surprise?

Capacity region characterization: Cover-Leung, modified by Willems

[Figure: the feedback capacity region extends beyond the no-feedback pentagon in the (R1, R2) plane]

Page 25

Two-adder with feedback (2)

Both users first send N uncoded bits (Y = X1 + X2 in every position). The positions with output y = 1 remain ambiguous: (X1,X2) = (0,1) or (1,0). On average there are N/2 of them, i.e. 2^(N/2) remaining uncertainty at the receiver.

Through the feedback each sender observes Y and therefore knows the other's input. The two senders then cooperate and resolve the N/2 ambiguous bits by joint output selection: with dependent inputs the adder output carries log2(3) bits per transmission, so this takes N/(2·log2(3)) extra transmissions.

Total efficiency: 2N / (N + N/(2·log2(3))) = 1.52!
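The efficiency figure follows from the accounting above: 2N message bits delivered in N + N/(2·log2 3) transmissions (sketch):

```python
from math import log2

# 2N message bits in N uncoded uses plus N/(2*log2(3)) cooperative uses
efficiency = 2 / (1 + 1 / (2 * log2(3)))
print(round(efficiency, 2))  # 1.52
```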

Page 26

Switching channel (1)

User 1 operates a switch: X1 = 0 passes nothing (output ∅), X1 = 1 passes X2:

        X1=0  X1=1
X2=0     ∅     0
X2=1     ∅     1

X1 ∈ {0,1}, X2 ∈ {0,1}, Y ∈ {∅, 0, 1}: tri-state logic

P(∅, pass info) = (1-a, a)

Rsum(max) = max over a of [a + h(a)] = log2(3)
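Maximizing a + h(a) numerically confirms the optimum a = 2/3 with value log2(3) (sketch):

```python
from math import log2

def h(x):
    # binary entropy function
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

grid = [i / 100000 for i in range(1, 100000)]
best_a = max(grid, key=lambda a: a + h(a))
print(round(best_a, 3), round(best_a + h(best_a), 3))  # a = 2/3, maximum = log2(3) ≈ 1.585
```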

Page 27

Simple coding example (2)

For n = 2:

code 1: { 11, 10, 01 }
code 2: { 00, 11 }

Every code-1 word contains a 1, so user 2's repeated bit passes through in at least one position, while user 1's word is recovered from the positions of the erasures ∅.

Sum rate: Rsum = (log2(3) + 1)/2 = 1.29 ≈ 1.3

Page 28

Switching channel with general coding (3)

Strategy: code 1 = the words with fewer than dmin zeros, e.g. (0 0 1 0 ... 0 1 0)
Code 2 = a linear code with k = n - dmin + 1, e.g. (0 1 1 0 ... 0 0 0)

Receive: (∅ 1 ... 0). The zeros of code 1 erase at most dmin - 1 positions of the code-2 word, and a code with minimum distance dmin corrects dmin - 1 erasures.

PERFORMANCE:

Rsum = [ log2( Σ_{i<dmin} C(n,i) ) + n - dmin + 1 ] / n  →  h(dmin/n) + 1 - dmin/n  →  C !!
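For finite n the sum rate of this strategy can be evaluated directly (sketch; the parameters n = 300, dmin = 100 are illustrative choices, not from the slides, and the code-2 dimension k = n - dmin + 1 is taken as on the slide):

```python
from math import comb, log2

def r_sum(n, dmin):
    # user 1: all words with at most dmin-1 zeros;
    # user 2: an (n, n-dmin+1) erasure-correcting code, assumed to exist
    words1 = sum(comb(n, i) for i in range(dmin))
    return (log2(words1) + (n - dmin + 1)) / n

print(round(r_sum(300, 100), 3))  # close to C = log2(3) ≈ 1.585, and growing with n
```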

Page 29

Extensions (2)

T-user ADDER: T inputs from {0,1} enter the adder +; the output is their sum in {0, 1, ..., T}.

Page 30

T-user ADDER + (1)

Input:
User 1: { 00, 11 }
User 2: { 10, 01 }
User 3: { 00, 10 }

Output: { 10, 01, 11, 20, 21, 31, 12, 22 }

All 8 sums are distinct, so all three messages are uniquely decodable.

Efficiency: 3 · ½ = 1.5
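Unique decodability of this 3-user code is easy to verify by enumerating all 2·2·2 input combinations (sketch):

```python
from itertools import product

users = [["00", "11"], ["10", "01"], ["00", "10"]]

def add(words):
    # component-wise sum of the three transmitted words
    return tuple(sum(int(w[i]) for w in words) for i in range(2))

sums = {add(choice) for choice in product(*users)}
print(len(sums))  # 8: all combinations decodable -> 3 bits in 2 uses = 1.5 bits/use
```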

Page 31

T-user ADDER + (2)

Input:
User 1: { 000, 111 }
User 2: { 110, 001 }
User 3: { 000, 001, 010, 100, 101, 011 }

Outputs: { 110, 111, 120, 210, 211, 121, etc. } — all 2·2·6 = 24 sums distinct

Efficiency = (2 + log2(6))/3 = 1.53 bits/tr.

Record: 1.551 by van Tilborg (1991)
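The same enumeration check for this code: all 2·2·6 = 24 sums must be distinct, giving log2(24)/3 = (2 + log2 6)/3 bits per transmission (sketch):

```python
from itertools import product
from math import log2

users = [["000", "111"],
         ["110", "001"],
         ["000", "001", "010", "100", "101", "011"]]

def add(words):
    # component-wise sum of the three transmitted words
    return tuple(sum(int(w[i]) for w in words) for i in range(3))

sums = {add(c) for c in product(*users)}
print(len(sums), round(log2(len(sums)) / 3, 3))  # 24 distinct sums, (2 + log2 6)/3 ≈ 1.528
```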