A Comprehensive View of Duality in Multiuser Source Coding and Channel Coding

S. Sandeep Pradhan, University of Michigan, Ann Arbor
Joint work with K. Ramchandran, University of California, Berkeley



Acknowledgements:

Jim Chou, Univ. of California

Phillip Chou, Microsoft Research

David Tse, Univ. of California

Pramod Viswanath, Univ. of Illinois

Michael Gastpar, Univ. of California

Prakash Ishwar, Univ. of California

Martin Vetterli, EPFL

Outline

• Motivation, related work and background
• Duality between source and channel coding: role of source distortion measure & channel cost measure
• Extension to the case of side information
• MIMO source coding and channel coding with one-sided collaboration
• Future work: extensions to multiuser joint source-channel coding
• Conclusions

Motivation

• Expanding applications of MIMO source and channel coding
• Explore a unifying thread to these diverse problems
• We consider SCSI and CCSI as functional duals
• We consider:
  1. Distributed source coding
  2. Broadcast channel coding
  3. Multiple description source coding
  4. Multiple access channel coding
(functional dual pairs: 1 with 2, and 3 with 4)

It all starts with Shannon

“There is a curious and provocative duality between the properties of a source with a distortion measure and those of a channel. This duality is enhanced if we consider channels in which there is a “cost” associated with the different input letters, and it is desired to find the capacity subject to the constraint that the expected cost not exceed a certain quantity…”

Related work (incomplete list)

• Duality between source coding and channel coding:
  • Shannon (1959)
  • Csiszar and Korner (textbook, 1981)
  • Cover & Thomas (textbook, 1991): covering vs. packing
  • Eyuboglu and Forney (1993): quantizing vs. modulation; boundary/granular gains vs. shaping/coding gains
  • Laroia, Farvardin & Tretter (1994): SVQ versus shell mapping

• Duality between source coding with side information (SCSI) and channel coding with side information (CCSI):
  • Chou, Pradhan & Ramchandran (1999)
  • Barron, Wornell and Chen (2000)
  • Su, Eggers & Girod (2000)
  • Cover and Chiang (2001)

Notation: source coding

[diagram: X → Encoder → Decoder → X̂]

Source alphabet X, distribution p(x); reconstruction alphabet X̂; distortion measure d(x, x̂): X × X̂ → R⁺; distortion constraint D: E d(X, X̂) ≤ D.
Encoder: X^L → {1, 2, ..., 2^(LR)}; decoder: {1, 2, ..., 2^(LR)} → X̂^L.

Minimum rate of representing X with distortion D is the rate-distortion function:

R(D) = min_{p(x̂|x): E d(X, X̂) ≤ D} I(X; X̂)

Notation: channel coding

[diagram: m → Encoder → X̂ → Channel → X → Decoder → m̂]

Input and output alphabets X̂, X; conditional distribution p(x|x̂); cost measure w(x̂): X̂ → R⁺; cost constraint W: E w(X̂) ≤ W.
Encoder: {1, 2, ..., 2^(LR)} → X̂^L; decoder: X^L → {1, 2, ..., 2^(LR)}.

Maximum rate of communication with cost W is the capacity-cost function:

C(W) = max_{p(x̂): E w(X̂) ≤ W} I(X; X̂)

Note: the source encoder and the channel decoder are mappings with the same domain and range; similarly, the channel encoder and the source decoder have the same domain and range.
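These two extremal quantities can be computed numerically for discrete alphabets with the Blahut-Arimoto iteration; below is a minimal sketch for the capacity side (without the cost constraint). The function name is mine, the channel is an illustrative binary symmetric channel, and rates are in nats; the channel matrix is assumed strictly positive.

```python
import numpy as np

def blahut_arimoto_capacity(P, iters=200):
    """Capacity (in nats) of a discrete memoryless channel.

    P[j, k] = p(x = k | x̂ = j); entries assumed strictly positive."""
    m = P.shape[0]
    p = np.full(m, 1.0 / m)                    # input distribution p(x̂)
    for _ in range(iters):
        q = p @ P                              # induced output distribution p(x)
        d = np.sum(P * np.log(P / q), axis=1)  # D(p(.|x̂) || q) per input letter
        p = p * np.exp(d)
        p /= p.sum()
    q = p @ P
    d = np.sum(P * np.log(P / q), axis=1)
    return float(p @ d)

# Binary symmetric channel with crossover 0.1: C = ln 2 - H(0.1) ≈ 0.3681 nats
bsc = np.array([[0.9, 0.1], [0.1, 0.9]])
C = blahut_arimoto_capacity(bsc)
```

The same alternating-minimization idea, run on p(x̂|x) instead of p(x̂), computes R(D); the symmetry of the two iterations is itself a hint of the duality this talk formalizes.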

Gastpar, Rimoldi & Vetterli ’00: To code or not to code?

[diagram: S → Encoder f(·) → X → Channel → Y → Decoder g(·) → Ŝ; source p(s), channel p(y|x)]

For a given pair p(s) and p(y|x), there exist a distortion measure d(s, ŝ) and a cost measure w(x) such that uncoded mappings at the encoder and decoder are optimal in terms of end-to-end achievable performance.

Bottom line: any source can be “matched” optimally to any channel if you are allowed to pick the distortion & cost measures for the source & channel.

Inspiration for cost function/distortion measure analysis.

Role of distortion measures (Fact 1):

[diagram: X ~ p(x) → quantizer p'(x̂|x) → X̂]

Given a source p(x), let p'(x̂|x) be some arbitrary quantizer. Then there exists a distortion measure d(x, x̂) such that

p'(x̂|x) = argmin_{p(x̂|x): E d(x, x̂) ≤ D, (x, x̂) ~ p(x) p(x̂|x)} I(X; X̂),

with d(x, x̂) = c log [ p'(x̂) / p'(x̂|x) ], where p'(x̂) is the output distribution induced by p(x) and p'(x̂|x), and c > 0.

Bottom line: any given quantizer is the optimal quantizer for any source, provided you are allowed to pick the distortion measure.

Role of cost measures (Fact 2):

[diagram: X̂ ~ p'(x̂) → channel p(x|x̂) → X]

Given a channel p(x|x̂), let p'(x̂) be some arbitrary input distribution. Then there exists a cost measure w(x̂) such that

p'(x̂) = argmax_{p(x̂): E w(X̂) ≤ W, x ~ p(x|x̂) p(x̂)} I(X; X̂),

with w(x̂) = c D( p(x|x̂) || p'(x) ), where p'(x) is the output distribution induced by p'(x̂) and D(·||·) is the Kullback-Leibler divergence.

Bottom line: any given input distribution is the optimal input for any channel, provided you are allowed to pick the cost measure.
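Fact 2 can be spot-checked numerically: with w(x̂) = D(p(·|x̂) || p'(·)) and budget W = E_{p'}[w], no other input meeting the budget can beat p'. A small brute-force sketch over a two-letter input simplex (channel and target input are arbitrary illustrative choices, c = 1, rates in nats):

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(3), size=2)        # arbitrary channel p(x|x̂), two input letters
p_star = np.array([0.3, 0.7])                # the input distribution we want to be optimal
q_star = p_star @ P                          # induced output distribution p'(x)

# Fact 2 cost measure (with c = 1): w(x̂) = D( p(.|x̂) || p'(.) )
w = np.sum(P * np.log(P / q_star), axis=1)
W = float(p_star @ w)                        # budget: expected cost under p_star

def mutual_info(p):
    q = p @ P
    return float(np.sum(p[:, None] * P * np.log(P / q)))

# Brute-force the simplex: no input meeting the cost budget beats p_star
best = max(
    mutual_info(np.array([a, 1.0 - a]))
    for a in np.linspace(0.0, 1.0, 2001)
    if np.dot([a, 1.0 - a], w) <= W + 1e-9
)
```

The check works because I(p) = E_p[w] - D(q_p || q*) for any input p, so every budget-feasible input has I(p) ≤ W, while p_star attains I(p_star) = W exactly.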

Now we are ready to characterize duality

Theorem 1a: For a given source coding problem with source p(x), source distortion measure d(x, x̂) and distortion constraint D, let the optimal quantizer be

p*(x̂|x) = argmin_{p(x̂|x): E d(x, x̂) ≤ D} I(X; X̂),

inducing (using Bayes’ rule) the distributions

p*(x̂) = Σ_x p(x) p*(x̂|x)   and   p*(x|x̂) = p(x) p*(x̂|x) / p*(x̂).

[diagram: p(x) → optimal quantizer p*(x̂|x) → p*(x̂)]

Duality between classical source and channel coding:

Then there exists a unique dual channel coding problem with channel p*(x|x̂), input alphabet X̂, output alphabet X, cost measure w(x̂) and cost constraint W, such that:

(i) R(D) = C(W);

(ii) p*(x̂) = argmax_{p(x̂): E w(X̂) ≤ W} I(X; X̂),

where w(x̂) = c1 D( p*(x|x̂) || p(x) ) and W = E_{p*(x̂)} w(X̂).

[diagrams: source coding runs p(x) → optimal quantizer p*(x̂|x) → p*(x̂); the dual channel coding problem runs p*(x̂) → channel p*(x|x̂) → p(x): REVERSAL OF ORDER]

Interpretation of functional duality

For a given source coding problem, we can associate a specific channel coding problem such that:

• both problems induce the same optimal joint distribution p*(x, x̂)
• the optimal encoder for one is functionally identical to the optimal decoder for the other, in the limit of large block length
• an appropriate channel cost measure is associated

Source coding: distortion measure is as important as the source distribution

Channel coding: cost measure is as important as the channel conditional distribution

Source coding with side information:

[diagram: X → Encoder → Decoder → X̂, with side information S available at the decoder]

• The encoder needs to compress the source X.
• The decoder has access to correlated side information S.
• Studied by Slepian-Wolf ’73, Wyner-Ziv ’76, Berger ’77.
• Applications: sensor networks, digital upgrade, diversity coding for packet networks.

Channel coding with side information:

[diagram: m → Encoder → X̂ → Channel → X → Decoder → m̂, with side information S available at the encoder]

• The encoder has access to some information S related to the statistical nature of the channel.
• The encoder wishes to communicate over this cost-constrained channel.
• Studied by Gelfand-Pinsker ’81, Costa ’83, Heegard-El Gamal ’85.
• Applications: watermarking, data hiding, precoding for known interference, multiantenna broadcast channels.

Duality (loose sense)

SCSI: side information at the decoder only; the source code is “partitioned” into a bank of channel codes.
CCSI: side information at the encoder only; the channel code is “partitioned” into a bank of source codes.

Source coding with side information at decoder (SCSI): (Wyner-Ziv ’76)

Conditional source p(x|s); side information S ~ p(s); context-dependent distortion measure d(x, x̂, s).
Encoder: f: X^L → {1, 2, ..., 2^(LR)}; decoder: g: {1, 2, ..., 2^(LR)} × S^L → X̂^L.

[diagram: X → Encoder → U → Decoder (with side information S) → X̂]

Rate-distortion function:

R*(D) = min_{p(u|x), p(x̂|u,s)} [ I(X; U) - I(S; U) ]

such that the chains S - X - U and X - (U,S) - X̂ hold and E d(X, X̂, S) ≤ D.

Intuition (natural Markov chains):
• S - X - U: side information S is not present at the encoder
• X - (U,S) - X̂: source X is not present at the decoder

Note: this completely determines the optimal joint distribution
p*(s, x, u, x̂) = p(s) p(x|s) p*(u|x) p*(x̂|u, s).

SCSI Gaussian example (reconstruction of X - S):

• Conditional source: X = S + V, p(v) ~ N(0, N)
• Side information: p(s) ~ N(0, Q)
• Distortion measure: d_S(x, x̂) = ((x - s) - x̂)²  (mean squared error in reconstructing x - s)
• Distortion constraint: E d_S(X, X̂) ≤ D

[diagram: encoder/test channel: X → p*(u|x) (U = X + q, q Gaussian) → U → p*(x̂|u, s) at the decoder, which combines U and S in an MMSE estimator to form X̂]
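For this jointly Gaussian setup the Wyner-Ziv rate-distortion function takes the familiar closed form R*(D) = (1/2) log(N/D) for D ≤ N, with no rate loss relative to having the side information at both ends; a one-line sketch (function name mine, rate in bits):

```python
import math

def wyner_ziv_rate_bits(N, D):
    """R*(D) = max(0, 0.5*log2(N/D)) bits for X = S + V, V ~ N(0, N),
    reconstructing X - S under MSE distortion constraint D."""
    return max(0.0, 0.5 * math.log2(N / D))
```

For example, wyner_ziv_rate_bits(1.0, 0.25) gives 1.0 bit: halving the standard deviation of the reconstruction error costs one bit, exactly as in conditional rate-distortion.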

Channel coding with side information at encoder (CCSI): (Gelfand-Pinsker ’81)

Conditional channel p(x|x̂, s); side information S ~ p(s); cost measure w(x̂, s).
Encoder: f: {1, 2, ..., 2^(LR)} × S^L → X̂^L; decoder: g: X^L → {1, 2, ..., 2^(LR)}.

[diagram: U → Encoder (with side information S) → X̂ → Channel p(x|x̂, s) → X → Decoder → U]

Capacity-cost function:

C*(W) = max_{p(u|s), p(x̂|u,s)} [ I(X; U) - I(S; U) ]

such that the chain U - (X̂,S) - X holds and E w(X̂, S) ≤ W.

Intuition (natural Markov chains):
• U - (X̂,S) - X: the channel does not care about U
• the encoder does not have access to the channel output X

Note: this completely determines the optimal joint distribution
p*(s, x, u, x̂) = p(s) p*(u|s) p*(x̂|u, s) p(x|x̂, s).

CCSI Gaussian example (known interference): (Costa ’83)

• Conditional channel: X = X̂ + S + Z, with Z Gaussian
• Side information: p(s) ~ N(0, Q)
• Cost measure: w_S(x̂) = x̂²  (power constraint on x̂), with E w_S(X̂) ≤ W

[diagram: U → Encoder p*(x̂|u, s) (an MMSE precoder using S) → X̂; the channel adds S and Z to give X; the Decoder recovers U]

[side-by-side diagrams: the SCSI encoder and test channel p*(u|x), p*(x̂|u, s) and the CCSI encoder, channel p(x|x̂, s) and decoder realize the same chain from U through X̂ to X, with S and Z entering identically in both]
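Costa’s “writing on dirty paper” conclusion is that the interference power Q drops out entirely: for Z ~ N(0, N) and power constraint P, the capacity is (1/2) log2(1 + P/N), the same as if S were absent. A minimal sketch (function names mine):

```python
import math

def dirty_paper_capacity_bits(P, N, Q=0.0):
    """Costa '83: for X = X̂ + S + Z with Z ~ N(0, N), E[X̂²] ≤ P, and
    interference S known at the encoder, capacity is 0.5*log2(1 + P/N)
    bits, independent of the interference power Q."""
    return 0.5 * math.log2(1.0 + P / N)

def costa_alpha(P, N):
    """Costa's MMSE scaling for the auxiliary variable U = X̂ + alpha*S."""
    return P / (P + N)
```

The scaling alpha = P/(P + N) is exactly the MMSE coefficient appearing in the precoder above, which is the functional mirror of the MMSE estimator in the Wyner-Ziv decoder.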

Theorem 2a:

Given p(x|s), p(s), d_S(x, x̂) and D:

Find the optimal p*(u|x) and p*(x̂|u, s) that minimize I(X; U) - I(S; U), inducing

p*(s, x, u, x̂) = p(s) p(x|s) p*(u|x) p*(x̂|u, s)

and, via Bayes’ rule, the test channel p*(x|x̂, s).

[diagram: X → Encoder p*(u|x) → U → p*(x̂|u, s) → X̂ → induced test channel p*(x|x̂, s) → X, with S at the decoder]

If the natural CCSI constraint U - (X̂,S) - X is satisfied, then there exists a dual CCSI problem with channel p*(x|x̂, s), side information p(s), cost measure w_S(x̂, s) and cost constraint W, such that:

(i) the rate-distortion bound equals the capacity-cost bound: R*(D) = C*(W);

(ii) p*(u|s) and p*(x̂|u, s), obtained from the induced joint distribution, achieve capacity-cost optimality;

(iii) w_S(x̂, s) = c1 D( p*(x|x̂, s) || p(x|s) ) and W = E_{p(s) p*(x̂|s)} [ w_S(X̂, S) ].

[diagram: the same blocks redrawn as the dual CCSI system: U → Encoder p*(x̂|u, s) → X̂ → Channel p*(x|x̂, s) → X → Decoder → U, with cost constraint W]

[diagram: SCSI and CCSI drawn side by side: the SCSI encoder X → U mirrors the CCSI decoder X → U, and the SCSI decoder (U, S) → X̂ mirrors the CCSI encoder (U, S) → X̂]

Markov chains and duality

SCSI imposes the chains S - X - U and X - (U,S) - X̂; CCSI imposes U - (X̂,S) - X. DUALITY: a single joint distribution p(s, x, u, x̂) satisfying all of these chains makes the two problems functional duals.

Duality implication: generalization of the Wyner-Ziv no-rate-loss case

CCSI (Cohen-Lapidoth 2000; Erez-Shamai-Zamir 2000): extension of Costa’s result for X = X̂ + S + Z to arbitrary S with no rate loss.

[diagram: U → Encoder → X̂ → channel adding S and Z → X → Decoder → U]

New result: Wyner-Ziv’s no-rate-loss result can be extended to an arbitrary source and side information as long as X = S + V, where V is Gaussian, for the MSE distortion measure.

[diagram: X → Encoder → U → Decoder (with S) → X̂]

Functional duality in MIMO source and channel coding with one-sided collaboration:

• For ease of illustration, we consider a 2-input, 2-output system
• We consider only the sum rate, and a single distortion/cost measure
• We consider functional duality in the distributional sense
• Future & ongoing work: duality in the coding sense

MIMO source coding with one-sided collaboration:

[diagram: X1 → Encoder-1 → M1 → Decoder-1 → X̂1; X2 → Encoder-2 → M2 → Decoder-2 → X̂2; equivalently, a test channel from (X1, X2) to (X̂1, X̂2)]

Either the encoders or the decoders (but not both) collaborate.

MIMO channel coding with one-sided collaboration:

[diagram: M1 → Encoder-1 → X̂1; M2 → Encoder-2 → X̂2; (X̂1, X̂2) → Channel → (X1, X2); X1 → Decoder-1 → M̂1; X2 → Decoder-2 → M̂2]

Either the encoders or the decoders (but not both) collaborate.

Distributed source coding

• Two correlated sources with given joint distribution p(x1, x2) and joint distortion measure d(x1, x2, x̂1, x̂2)
• Encoders DO NOT collaborate; decoders DO collaborate
• Problem: for a given joint distortion D, find the minimum sum rate R
• Achievable rate region: Berger ’77

[diagram: X1 → Encoder-1 → M1, X2 → Encoder-2 → M2; the collaborating decoders produce X̂1, X̂2; equivalently, a test channel from (X1, X2) to (X̂1, X̂2)]

Distributed source coding: achievable sum-rate region

R_DS(D) = min [ I(X1; U1) + I(X2; U2) - I(U1; U2) ]

such that the chains U1 - X1 - X2 - U2 and (X1, X2) - (U1, U2) - (X̂1, X̂2) hold and E[d] ≤ D.

Intuition:
1. The two sources cannot see each other.
2. The decoder cannot see the source.
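The sum-rate expression above can be evaluated mechanically for discrete distributions by building the joint p(x1, x2) p(u1|x1) p(u2|x2) implied by the chain U1 - X1 - X2 - U2; a sketch (function names mine, rates in bits):

```python
import numpy as np

def mutual_info_bits(pxy):
    """I(X;Y) in bits from a joint distribution matrix pxy[x, y]."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px * py)[mask])))

def berger_sum_rate_bits(pX, pU1gX1, pU2gX2):
    """I(X1;U1) + I(X2;U2) - I(U1;U2) under p(x1,x2) p(u1|x1) p(u2|x2)."""
    # joint[x1, x2, u1, u2] built from the chain U1 - X1 - X2 - U2
    joint = (pX[:, :, None, None]
             * pU1gX1[:, None, :, None]
             * pU2gX2[None, :, None, :])
    pX1U1 = joint.sum(axis=(1, 3))
    pX2U2 = joint.sum(axis=(0, 2))
    pU1U2 = joint.sum(axis=(0, 1))
    return (mutual_info_bits(pX1U1) + mutual_info_bits(pX2U2)
            - mutual_info_bits(pU1U2))

# Independent uniform bits, lossless U_i = X_i: 1 + 1 - 0 = 2 bits
pX = np.full((2, 2), 0.25)
eye = np.eye(2)
rate = berger_sum_rate_bits(pX, eye, eye)
```

Minimizing this expression over the test channels p(u1|x1), p(u2|x2) (and the reconstruction map) is what the achievable-region statement calls for.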

Broadcast channel coding

• Broadcast channel with a given conditional distribution p(x1, x2 | x̂1, x̂2) and joint cost measure w(x̂1, x̂2)
• Encoders DO collaborate; decoders DO NOT collaborate
• Problem: for a given joint cost W, find the maximum sum rate R
• Achievable rate region: Marton ’79

[diagram: M1, M2 → collaborating encoders → (X̂1, X̂2) → Channel → (X1, X2); Decoder-1 → M̂1, Decoder-2 → M̂2]

Broadcast channel coding: achievable sum-rate region

R_BC(W) = max [ I(X1; U1) + I(X2; U2) - I(U1; U2) ]

such that the chain (U1, U2) - (X̂1, X̂2) - (X1, X2) holds and E[w] ≤ W.

Intuition:
1. The channel only cares about its input.
2. The encoder does not have the channel output.

Duality (loose sense) between distributed source coding and broadcast channel coding

Distributed source coding: collaboration at the decoder only; uses Wyner-Ziv coding, where the source code is “partitioned” into a bank of channel codes.
Broadcast channel coding: collaboration at the encoder only; uses Gelfand-Pinsker coding, where the channel code is “partitioned” into a bank of source codes.

Theorem 3a: distributed source coding imposes the chains U1 - X1 - X2 - U2 and (X1, X2) - (U1, U2) - (X̂1, X̂2); broadcast channel coding imposes (U1, U2) - (X̂1, X̂2) - (X1, X2). DUALITY: a single joint distribution p(x1, x2, u1, u2, x̂1, x̂2) satisfying all of these chains makes the two problems functional duals.

Example: 2-in-2-out Gaussian linear channel (Caire, Shamai, Yu, Cioffi, Viswanath, Tse)

[diagram: (X̂1, X̂2) → H → add noise (N1, N2) → (X1, X2); sum-power cost w(x̂1, x̂2) = x̂1² + x̂2²]

• Marton’s sum rate is shown to be tight.
• Using Sato’s bound, the capacity of the broadcast channel depends only on the marginals.
• For the optimal input distribution, if we keep the variance of the noise the same and change its correlation, at one point (the worst-case noise) the chain U1 - X1 - X2 - U2 holds.

At this point we have duality!

Multiple access channel coding with independent message sets

[diagram: M1 → Encoder-1 → X̂1, M2 → Encoder-2 → X̂2 → Channel → (X1, X2) → collaborating decoders → M̂1, M̂2]

• Multiple access channel with a given conditional distribution p(x1, x2 | x̂1, x̂2) and joint cost measure w(x̂1, x̂2)
• Encoders DO NOT collaborate; decoders DO collaborate
• Problem: for a given joint cost W, find the maximum sum rate R
• Capacity-cost function (Ahlswede ’71):

C_MA(W) = max I(X1, X2 ; X̂1, X̂2)

such that X̂1 and X̂2 are independent and E[w] ≤ W.
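For the Gaussian case of this channel (X = H X̂ + N with Cov(N) = I and independent Gaussian inputs, as in the example that follows), the sum rate I(X1, X2; X̂1, X̂2) has a log-det form; a sketch (function name mine, rate in bits):

```python
import numpy as np

def gaussian_mac_sum_rate_bits(H, powers):
    """I(X1, X2; X̂1, X̂2) = 0.5*log2 det(I + H K H^T) bits, where
    K = diag(powers) is the covariance of independent Gaussian inputs
    and the additive noise has identity covariance."""
    H = np.asarray(H, dtype=float)
    K = np.diag(powers)
    G = np.eye(H.shape[0]) + H @ K @ H.T
    _sign, logdet = np.linalg.slogdet(G)   # natural-log determinant
    return 0.5 * logdet / np.log(2.0)

# Identity channel, unit powers: 0.5*log2 det(2I) = 1 bit per use
rate = gaussian_mac_sum_rate_bits(np.eye(2), [1.0, 1.0])
```

The sum-power constraint W = E[x̂1² + x̂2²] then becomes a constraint on trace(K), over which the expression is maximized.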

Multiple description source coding problem:

[diagram: X → Encoder → (M1, M2); Decoder-1 sees only M1 and produces X̂1, Decoder-2 sees only M2 and produces X̂2, Decoder-0 sees both and produces X̂0]

Another version, with essentially the same coding techniques, which is “amenable” to duality:

“Multiple description source coding with no excess sum rate”

[diagram: (X1, X2) → collaborating encoders → M1, M2 → Decoder-1 → X̂1, Decoder-2 → X̂2; equivalently, a test channel from (X1, X2) to (X̂1, X̂2)]

• Two correlated sources with given joint distribution p(x1, x2) and joint distortion measure d(x1, x2, x̂1, x̂2)
• Encoders DO collaborate; decoders DO NOT collaborate
• Problem: for a given joint distortion D, find the minimum sum rate R
• Rate-distortion region (Ahlswede ’85):

R_MD(D) = min I(X1, X2 ; X̂1, X̂2)

such that X̂1 and X̂2 are independent and E[d] ≤ D.

Duality (loose sense) between multiple description coding and the multiple access channel

MD coding with no excess sum rate: collaboration at the encoder only; uses a successive refinement strategy.
MAC with independent message sets: collaboration at the decoder only; uses a successive cancellation strategy.

Theorem 4a: For multiple description coding with no excess sum rate:

Given: source alphabets X1, X2; reconstruction alphabets X̂1, X̂2; joint distribution p(x1, x2); distortion measure d(x1, x2, x̂1, x̂2); distortion constraint D.

Find the optimal conditional distribution p*(x̂1, x̂2 | x1, x2), which induces p*(x̂1, x̂2) and p*(x1, x2 | x̂1, x̂2).

Then there exists a multiple access channel with input alphabets X̂1, X̂2, output alphabets X1, X2, channel distribution p*(x1, x2 | x̂1, x̂2) and joint cost measure w(x̂1, x̂2), such that:

1) The sum-rate-distortion bound equals the sum capacity-cost bound, R_MD(D) = C_MA(W):

min I(X1, X2 ; X̂1, X̂2) = max I(X1, X2 ; X̂1, X̂2).

2) p*(x̂1, x̂2) and p*(x1, x2 | x̂1, x̂2) achieve optimality for this MA channel coding problem.

3) The joint cost measure is

w(x̂1, x̂2) = c1 D( p*(x1, x2 | x̂1, x̂2) || p(x1, x2) ).

Similarly, for a given MA channel coding problem with independent message sets there exists a dual MD source coding problem with no excess sum rate.

Example: given a MA channel X = H X̂ + N, with N Gaussian, Cov(N) = I, joint cost measure w(x̂1, x̂2) = x̂1² + x̂2² (sum power), and cost constraint W = 2P.

[diagram: (X̂1, X̂2) → H → add (N1, N2) → (X1, X2)]

Sum-capacity optimization: C_MA is the maximum of (1/2) log det( I + H Cov(X̂) Hᵀ ) subject to the sum-power constraint, attained at Cov(X̂) = P I, which gives Cov(X) = P H Hᵀ + I.

[diagram: the channel followed by the decoder, redrawn as (X1, X2) → A → add (Z1, Z2) → (X̂1, X̂2)]

Dual MD coding problem:

Gaussian source X with Cov(X) = P H Hᵀ + I, and quadratic distortion

d(x, x̂) = (x - H x̂)ᵀ (x - H x̂).

[diagram: encoder/test channel: X → A → add (Z1, Z2) → (X̂1, X̂2), followed by H and added noise (N1, N2) to reproduce (X1, X2)]
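The covariance bookkeeping behind this dual pair (Cov(X̂) = P·I passing through X = H X̂ + N with Cov(N) = I gives Cov(X) = P H Hᵀ + I) is easy to spot-check by simulation; H and P below are illustrative values, not the slide’s:

```python
import numpy as np

rng = np.random.default_rng(1)
H = np.array([[1.0, 0.5], [0.5, 1.0]])   # illustrative channel matrix
P = 2.0                                   # per-input power

n = 200_000
Xhat = rng.normal(scale=np.sqrt(P), size=(n, 2))   # Cov(X̂) = P * I
Noise = rng.normal(size=(n, 2))                    # Cov(N) = I
X = Xhat @ H.T + Noise                             # X = H X̂ + N, sample-wise

cov_empirical = np.cov(X, rowvar=False)
cov_predicted = P * H @ H.T + np.eye(2)
```

The empirical covariance of X matches P H Hᵀ + I to within Monte Carlo error, which is exactly the source covariance the dual MD problem inherits.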

What is addressed in this work:

• Duality in empirical per-letter distributions
• Extension of the Wyner-Ziv no-rate-loss result to more arbitrary cases
• The underlying connection between four multiuser communication problems

What is left to be addressed:

• Duality of optimal source codes and channel codes
• Rate loss in dual problems
• Joint source-channel coding in dual problems

Conclusions

• Distributional relationship between MIMO source & channel coding

• Functional characterization: swappable encoder and decoder codebooks

• Highlighted the importance of source distortion and channel cost measures

• Cross-leveraging of advances in the applications of these fields