Capacity for Communication Channels. The material is presented at a workshop tutorial at the 43rd Conference on Decision and Control, Bahamas, December 2004.


Page 1: Capacity for Communication Channels

Capacity for Communication Channels

The material is presented at a workshop tutorial at the 43rd Conference on Decision and Control, Bahamas, December 2004.

Page 2: Capacity for Communication Channels

2

Overview

Importance of Uncertainty in Communications
Shannon's Definition of Capacity
Capacity of Additive Gaussian Channels
  Random Variable Case
  Random Process Case
Capacity of MIMO Gaussian Channels
Review of Maximin Capacity
Saddle Point Solutions

Page 3: Capacity for Communication Channels

3

Overview

Maximin Capacity Subject to Normed Uncertainties
  Uncertain Channel
  Uncertain Noise
  Uncertain Channel and Noise
Coding Theorem
Examples and Conclusions
References

Page 4: Capacity for Communication Channels

4

Importance of Communication Subject to Uncertainties

Page 5: Capacity for Communication Channels

5

Importance of Communication Subject to Uncertainties

Channel measurement errors
Network operating conditions
Channel modeling
Communication in presence of jamming
Sensor networks
Teleoperations

Page 6: Capacity for Communication Channels

6

Importance of Communication Subject to Uncertainties

Channel measurement errors
  Problem of allocation of power and bandwidth for channel sounding
  Allocation depends on the benefit that can be obtained from more accurate measurement (e.g., when the channel changes too rapidly)
  Feedback (allocation of bandwidth to provide accurate channel state information, robustness of feedback to channel error)
  Effect on new wireless systems (higher carrier frequencies, ultra-wide bandwidth)

Page 7: Capacity for Communication Channels

7

Importance of Communication Subject to Uncertainties

Network operating conditions
  Uncertainty can impair interference cancellation in multiple access systems
  Interplay between physical layer and higher layers (e.g., effect of adaptive coding on congestion)

Channel modeling
  Models used for developing communication schemes are simple, but the gap between the real channel and the model can be big

Page 8: Capacity for Communication Channels

8

Importance of Communication Subject to Uncertainties

Communication in presence of jamming
  The useful transmitted signal is accompanied by an adversary signal, making the communication channel uncertain

Sensor networks
Teleoperations

Page 9: Capacity for Communication Channels

9

Shannon’s Definition of Capacity

Page 10: Capacity for Communication Channels

10

Shannon’s Definition of Capacity

Model of communication system

(Block diagram: message w → encoder f → channel h with additive noise → decoder g → sink.)

An (M, n) code consists of an encoder f: {1, ..., M} → X^n and a decoder g: Y^n → {1, ..., M}; the message w is encoded as f(w), the channel produces h(f(w)), and the decoder outputs g(h(f(w))). The rate and error probabilities are

$$ R = \frac{\log M}{n}, \qquad \lambda_i = \Pr\{ g(h(f(i))) \ne i \mid i \text{ sent} \}, \quad i = 1, \dots, M, \qquad P_e^{(n)} = \frac{1}{M}\sum_{i=1}^{M} \lambda_i. $$

Page 11: Capacity for Communication Channels

11

Shannon’s Definition of Capacity

R is an achievable rate if there exists a sequence of (M = 2^{nR}, n) codes such that the probability of error tends to zero as n tends to infinity.

The operational capacity is the supremum of all achievable rates.

Page 12: Capacity for Communication Channels

12

Shannon’s Definition of Capacity

Discrete memoryless channel

The channel capacity depends on the channel transition matrix Q(y|x), which is assumed known:

$$ C = \max_{P(x)} I(P, Q), \qquad I(P, Q) = \sum_{x \in X}\sum_{y \in Y} P(x)\, Q(y|x) \log \frac{Q(y|x)}{\sum_{x' \in X} P(x')\, Q(y|x')}. $$
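The maximization C = max over P of I(P, Q) can be computed numerically with the standard Blahut-Arimoto iteration. A minimal sketch (the function name and the BSC sanity check are illustrative, not from the tutorial):

```python
import numpy as np

def blahut_arimoto(Q, n_iter=200):
    """Capacity C = max_P I(P, Q) of a DMC with transition matrix
    Q[x, y] = Q(y|x), via the Blahut-Arimoto iteration. Returns (C_bits, P*)."""
    m = Q.shape[0]
    p = np.full(m, 1.0 / m)            # start from the uniform input law
    for _ in range(n_iter):
        q = p @ Q                      # output law q(y) = sum_x p(x) Q(y|x)
        with np.errstate(divide="ignore", invalid="ignore"):
            kl = np.where(Q > 0, Q * np.log(Q / q), 0.0).sum(axis=1)
        p = p * np.exp(kl)             # multiplicative update
        p /= p.sum()
    q = p @ Q
    with np.errstate(divide="ignore", invalid="ignore"):
        kl = np.where(Q > 0, Q * np.log(Q / q), 0.0).sum(axis=1)
    C = float(p @ kl) / np.log(2)      # convert nats to bits
    return C, p

# sanity check on a BSC with crossover 0.1: C = 1 - h(0.1) (about 0.531 bits)
Q = np.array([[0.9, 0.1], [0.1, 0.9]])
C, p_opt = blahut_arimoto(Q)
```

For the BSC the uniform input is already optimal, so the iteration converges immediately; for asymmetric channels more iterations are needed.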

Page 13: Capacity for Communication Channels

13

Shannon’s Definition of Capacity

What if Q(y|x) is unknown? Example: compound BSC with input/output alphabets X = Y = {0, 1} and crossover probability θ ∈ Θ ⊂ [0, 1]:

$$ Q(y|x; \theta) = \begin{cases} \theta, & y \ne x, \\ 1 - \theta, & y = x. \end{cases} $$

(Transition diagram: 0 → 0 and 1 → 1 with probability 1 − θ; the bit is flipped with probability θ.)

What is the channel capacity?
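For the compound BSC the uniform input is simultaneously optimal for every crossover probability, so the maximin capacity reduces to the smallest individual capacity, min over θ of 1 − h(θ). A small illustrative sketch (function names are mine):

```python
import numpy as np

def h2(t):
    """Binary entropy in bits; h2(0) = h2(1) = 0."""
    t = np.asarray(t, dtype=float)
    out = np.zeros_like(t)
    mask = (t > 0) & (t < 1)
    tm = t[mask]
    out[mask] = -tm * np.log2(tm) - (1 - tm) * np.log2(1 - tm)
    return out

def compound_bsc_capacity(thetas):
    """Worst-case capacity over a set of crossover probabilities: since the
    uniform input is optimal for every BSC, C = min over theta of 1 - h2(theta)."""
    return float(np.min(1.0 - h2(np.asarray(thetas))))

# e.g. Theta = {0.05, 0.2}: the worst theta is the one nearest 1/2
C = compound_bsc_capacity([0.05, 0.2])   # equals 1 - h2(0.2)
```

The worst-case θ is always the element of Θ closest to 1/2, where the binary entropy peaks.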

Page 14: Capacity for Communication Channels

14

Shannon’s Definition of Capacity

Model of communication system with CSI

(Block diagram: source → encoder → channel with additive noise → decoder → sink; channel state information u, v is supplied to the encoder and the decoder.)

Page 15: Capacity for Communication Channels

15

Shannon’s Definition of Capacity

Examples: arbitrarily varying channels (AVC), arbitrarily varying finite state channels (AVFSC), and others. For a discrete memoryless AVC with state space S and states s_t ∈ S,

$$ Q(y^n \mid x^n; s^n) = \prod_t Q(y_t \mid x_t; s_t). $$

What is the channel capacity?

Page 16: Capacity for Communication Channels

16

Shannon’s Definition of Capacity

Additive Gaussian Channels: Random Variable Case

$$ y = x + n, \qquad n \sim N(0, \sigma^2), \qquad \mathrm{Var}(x) \le P, $$

$$ C = \max_{P(x):\, \mathrm{Var}(x) \le P} I(x; y) = \frac{1}{2}\log\Big(1 + \frac{P}{\sigma^2}\Big). $$

Page 17: Capacity for Communication Channels

17

Shannon’s Definition of Capacity

Random variable case derivation:

$$ \max_{P(x)} I(x; y) = H(y) - H(y|x) = H(y) - H(n) = \frac{1}{2}\log 2\pi e (P + \sigma^2) - \frac{1}{2}\log 2\pi e \sigma^2 = \frac{1}{2}\log\Big(1 + \frac{P}{\sigma^2}\Big). $$
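The closed form and the entropy-difference derivation can be checked against each other numerically; a small sketch (function names are illustrative):

```python
import math

def awgn_capacity(P, sigma2):
    """Capacity of y = x + n, n ~ N(0, sigma2), Var(x) <= P (bits per use)."""
    return 0.5 * math.log2(1.0 + P / sigma2)

def awgn_capacity_via_entropies(P, sigma2):
    """Same quantity from the derivation H(y) - H(n) with Gaussian entropies."""
    Hy = 0.5 * math.log2(2 * math.pi * math.e * (P + sigma2))
    Hn = 0.5 * math.log2(2 * math.pi * math.e * sigma2)
    return Hy - Hn

C1 = awgn_capacity(1.0, 0.25)   # P/sigma^2 = 4, so C1 = 0.5*log2(5)
```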

Page 18: Capacity for Communication Channels

18

Shannon’s Definition of Capacity

What is the capacity if the noise is unknown? Gaussian AVC:

$$ y_t = x_t + s_t + n_t, \qquad n_t \sim N(0, \sigma^2), $$

where the statistics of s_t are unknown.

Page 19: Capacity for Communication Channels

19

Shannon’s Definition of Capacity

Additive Gaussian Channels: Random Process Case

(Block diagram: input x passes through the channel filter H(f); noise n, shaped by the filter W(f), is added to produce the output y.)

Page 20: Capacity for Communication Channels

20

Shannon’s Definition of Capacity

Random process case derivation:

$$ C = \sup_{S_x \in A} \frac{1}{2}\int \log\Big(1 + \frac{S_x |H|^2}{S_n |W|^2}\Big)\, df, \qquad A = \Big\{ S_x;\; \int S_x \, df \le P \Big\}, $$

with Lagrangian

$$ J(S_x) = \frac{1}{2}\int \log\Big(1 + \frac{S_x |H|^2}{S_n |W|^2}\Big)\, df + \lambda\Big(P - \int S_x \, df\Big). $$

Page 21: Capacity for Communication Channels

21

Shannon’s Definition of Capacity

Necessary conditions. Differentiating the Lagrangian with respect to S_x gives, on the band where S_x* > 0,

$$ S_x^* = \nu^* - \frac{S_n |W|^2}{|H|^2} > 0, $$

and S_x* = 0 elsewhere, with the water level ν* fixed by the power constraint

$$ \int \Big(\nu^* - \frac{S_n |W|^2}{|H|^2}\Big)^+ df = P. $$

Page 22: Capacity for Communication Channels

22

Shannon’s Definition of Capacity

Capacity of the continuous-time additive Gaussian channel:

$$ C = \frac{1}{2}\int \log\Big(\frac{\nu^* |H|^2}{S_n |W|^2}\Big)\, df, \qquad \int \Big(\nu^* - \frac{S_n |W|^2}{|H|^2}\Big)^+ df = P. $$
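Numerically, this parametric pair is solved by bisecting on the water level ν* over a discrete frequency grid; a sketch (the function name and grid setup are my own, not from the tutorial):

```python
import numpy as np

def water_fill(N, df, P):
    """Water-filling on a frequency grid.
    N[k] is the noise-to-gain ratio S_n|W|^2/|H|^2 at grid point k, df the grid
    spacing, P the power budget. Bisects on the water level nu so that
    sum (nu - N)^+ df = P; returns (C in nats, S_x, nu)."""
    lo, hi = N.min(), N.max() + P / df
    for _ in range(200):                       # bisection on the water level
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - N, 0.0).sum() * df > P:
            hi = nu
        else:
            lo = nu
    nu = 0.5 * (lo + hi)
    Sx = np.maximum(nu - N, 0.0)               # optimal transmit psd
    C = 0.5 * np.sum(np.log(np.maximum(nu / N, 1.0))) * df
    return C, Sx, nu

# flat noise N = 0.5 over a band of total width 2 with P = 1:
# S_x is flat and C = (1/2) * 2 * log(1 + 1/(2*0.5)) = log 2 nats
N = np.full(2000, 0.5)
C, Sx, nu = water_fill(N, df=1e-3, P=1.0)
```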

Page 23: Capacity for Communication Channels

23

Shannon’s Definition of Capacity

Range of integration:

$$ \nu^* - \frac{S_n |W|^2}{|H|^2} \ge 0, \qquad \nu^* > 0. $$

What is the capacity if the frequency response of the channel H, or the psd of the noise S_n, belongs to a certain set?

Page 24: Capacity for Communication Channels

24

Shannon’s Definition of Capacity

Water-filling

(Figure: the psd S_n|W|²/|H|² plotted against frequency; power is poured up to the level ν* over the band [−B_f, B_f].)

Page 25: Capacity for Communication Channels

25

MIMO Channels

Page 26: Capacity for Communication Channels

26

MIMO Channels

Static MIMO Gaussian Channel

$$ y = Hx + n, \qquad x \in \mathbb{C}^t, \; y \in \mathbb{C}^r, \; H \in \mathbb{C}^{r \times t}, $$

where n is zero-mean complex Gaussian noise with independent real and imaginary parts of equal variance, E[nn*] = I_r, and the input satisfies trace(E[xx*]) ≤ P.

Page 27: Capacity for Communication Channels

27

MIMO Channels

Capacity of the static MIMO channel:

$$ C = \sum_i \big(\log(\nu \lambda_i)\big)^+, \qquad \sum_i \big(\nu - \lambda_i^{-1}\big)^+ = P, $$

where {λ_i} are the eigenvalues of HH* and ν is the Lagrange multiplier. What is the capacity if the matrix H belongs to a certain set?
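A sketch of this eigenvalue water-filling (names are illustrative), obtaining the λ_i from the singular values of H:

```python
import numpy as np

def mimo_capacity(H, P):
    """Capacity of y = Hx + n with E[nn*] = I and trace E[xx*] <= P,
    via water-filling over the eigenvalues of H H*."""
    lam = np.linalg.svd(H, compute_uv=False) ** 2   # eigenvalues of HH*
    lam = lam[lam > 1e-12]
    inv = 1.0 / lam
    lo, hi = 0.0, inv.max() + P                     # bisect on the water level
    for _ in range(100):
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - inv, 0.0).sum() > P:
            hi = nu
        else:
            lo = nu
    nu = 0.5 * (lo + hi)
    powers = np.maximum(nu - inv, 0.0)              # power per eigenmode
    C = np.sum(np.log2(1.0 + powers * lam))         # bits per channel use
    return C, powers

# identity 2x2 channel, P = 2: each eigenmode gets unit power, C = 2 bits
C, powers = mimo_capacity(np.eye(2), 2.0)
```

Note that log(νλ_i) = log(1 + (ν − 1/λ_i)λ_i) on the active modes, so this matches the slide's formula.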

Page 28: Capacity for Communication Channels

28

Review of Maximin Capacity

Page 29: Capacity for Communication Channels

29

Review of Minimax Capacity

Example: compound DMC

$$ C = \max_{P(x)} \inf_{\theta \in \Theta} I\big(P, Q(\cdot|\cdot;\theta)\big). $$

This result is due to Blackwell et al. [6]; see also Csiszar [8] and Wolfowitz [21]. Blachman [5] and Dobrushin [12] were the first to apply the game-theoretic approach, with mutual information as the pay-off function, in computing the channel capacity of discrete channels.

Page 30: Capacity for Communication Channels

30

Review of Minimax Capacity

Does a saddle point exist? For further references see Lapidoth and Narayan [18].

$$ C = \max_{P(x)} \inf_{\theta \in \Theta} I\big(P, Q(\cdot|\cdot;\theta)\big) = \inf_{\theta \in \Theta} \max_{P(x)} I\big(P, Q(\cdot|\cdot;\theta)\big), $$

$$ I\big(P, Q(\cdot|\cdot;\theta^*)\big) \le I\big(P^*, Q(\cdot|\cdot;\theta^*)\big) \le I\big(P^*, Q(\cdot|\cdot;\theta)\big). $$

Page 31: Capacity for Communication Channels

31

Review of Minimax Capacity

Gaussian AVC (GAVC) Channels
  Hughes and Narayan [16] determined the λ-capacity of the discrete-time GAVC under average and peak power constraints on the channel and the transmitted sequence, for random codes
  Hughes and Narayan [17] determined the capacity of the vector discrete-time GAVC, accompanied by a water-filling equation, and proved the saddle point
  Csiszar and Narayan [9] determined the capacity of the GAVC for deterministic codes
  Ahlswede [1] computed the channel capacity of the GAVC when the noise variance varies but does not exceed a certain bound

Page 32: Capacity for Communication Channels

32

Review of Minimax Capacity

Gaussian AVC (GAVC) Channels
  McEliece [22] was the first to apply the game-theoretic approach to continuous channels with mutual information as the pay-off function
  Basar and Wu [3] considered jamming of Gaussian channels with a mean-square error pay-off function
  Diggavi and Cover [23] used the game-theoretic approach to find the worst-case noise under a covariance matrix constraint on the noise
  Baker [2] computed the channel capacity of M-dimensional Gaussian continuous-time channels with energy constraints on the transmitted signal and the jammer
  Root and Varaiya [20] considered the capacity of the white-noise Gaussian continuous-time compound channel

Page 33: Capacity for Communication Channels

33

Review of Minimax Capacity

Gaussian AVC (GAVC) Channels
  Vishwanath et al. [24] computed the worst-case capacity of Gaussian MIMO channels by applying the duality principle

Page 34: Capacity for Communication Channels

34

Maximin Capacity Subject to Normed Uncertainties

Page 35: Capacity for Communication Channels

35

Motivation

We deal with one class of channels: continuous-time Gaussian compound channels
The choice of Gaussian noise does not present a limitation, because Gaussian noise is known to be the worst-case noise in terms of mutual information (see [15])
As opposed to the GAVC, which models the uncertainty through an additive (jamming) signal with unknown statistics (see slide 11), we model the uncertainty through the uncertainty set of all possible frequency responses of the channel (see [13], [20])

Page 36: Capacity for Communication Channels

36

Motivation

This approach is appealing because the uncertainty set is well defined in the normed linear spaces H∞ and L1
It enables the treatment of both parametric and non-parametric uncertainties
It is practical because the uncertainty set can be extracted from Nyquist or Bode plots
The obtained channel capacity formula explicitly depends on the size of the uncertainty set

Page 37: Capacity for Communication Channels

37

Motivation

It gives the optimal transmitted power in the form of water-filling that depends on the size of the uncertainty set
Our computation does not require a saddle point, because it relies on the work of Root and Varaiya [20] and Gallager [15]
It enables us to deal simultaneously with two types of uncertainty (frequency response of the channel and noise uncertainty), which, to the best of our knowledge, has not been done before

Page 38: Capacity for Communication Channels

38

Motivation

Our approach gives the solution to the jamming problem for continuous-time channels, accompanied by optimal transmitter and jammer strategies in terms of optimal PSDs. We show that the optimal PSD of the signal is proportional to the optimal PSD of the noise
Although we consider a problem similar to that of Baker's paper [2], his constraints on the signal and noise are in terms of energy, while we deal with power constraints

Page 39: Capacity for Communication Channels

39

Communication system model

Most models in use are probabilistic (compound channels and arbitrarily varying channels, with and without memory) and finite-dimensional
The present model employs unknown transfer functions; the transmitted signal is power-constrained with known PSD, and the noise is Gaussian with a PSD belonging to a certain set

Page 40: Capacity for Communication Channels

40

Communication system model

Model

(Block diagram: input x passes through the channel filter H(f); noise n, shaped by the filter W(f), is added to produce the output y.)

Page 41: Capacity for Communication Channels

41

Communication system model

Both H(f) and W(f) could be unknown. An unknown transfer function G(f) can be modeled using additive uncertainty,

$$ G(f) = G_{nom}(f) + \Delta_1(f) W_1(f), \qquad G, G_{nom}, \Delta_1 \in H^\infty, \quad \|\Delta_1\|_\infty \le 1, $$

where \( \|G\|_\infty = \sup_f |G(f)| \). Other models are possible, e.g. multiplicative:

$$ G(f) = G_{nom}(f)\big(1 + \Delta_1(f) W_1(f)\big). $$

Page 42: Capacity for Communication Channels

42

Communication system model

Uncertainty models: additive and multiplicative

(Block diagrams of the additive and multiplicative models: the nominal transfer function G_nom(f) combined with the perturbation Δ₁(f)W₁(f), in parallel and in cascade respectively.)

Page 43: Capacity for Communication Channels

43

Communication system model

Example

$$ G_{nom}(f) = \frac{\xi}{j2\pi f/\beta + 1}, \qquad \xi = \alpha/\beta, $$

$$ G(f) = \frac{\xi_p}{j2\pi f/\beta + 1}, \qquad \xi_p = \xi(1 + \delta\Delta), \quad 0 < \delta < 1. $$

(Nyquist plot: the uncertain DC gain sweeps the real axis between α/β(1 − δ) and α/β(1 + δ) around the nominal value α/β.)

Page 44: Capacity for Communication Channels

44

Communication system model

The uncertainty set is described by a ball in the frequency domain centered at G_nom(f) with radius W_1(f):

$$ |G(f) - G_{nom}(f)| \le \frac{\delta\xi}{|j2\pi f/\beta + 1|} = W_1(f). $$

Page 45: Capacity for Communication Channels

45

Channel capacity with uncertainty

Define four sets

{ ,; 222 WWWHWA nom ∆+=∈= ∞

}1,,, 222 ≤∆∈∈∆∈∞

∞∞∞ HWHHWnom

( ) ( ){ }∫ ≤= xxx PdffSfSA ;1

{ ,; 113 WHHHHA nom ∆+=∈= ∞

}1,,, 111 ≤∆∈∈∆∈∞

∞∞∞ HWHHHnom

Page 46: Capacity for Communication Channels

46

Channel capacity with uncertainty

The overall PSD of the noise is S_n(f)|W(f)|², and the noise uncertainty is modeled by the uncertainty of the filter

$$ W(f) = W_{nom}(f) + \Delta_2(f) W_2(f), $$

or by the set

$$ A_4 = \Big\{ S_n(f);\; \int S_n(f)\, df \le P_n \Big\}. $$

Page 47: Capacity for Communication Channels

47

Channel capacity with uncertainty

The mutual information rate is the pay-off function:

$$ I(x; y) = \frac{1}{2}\int \log\Big(1 + \frac{S_x |H|^2}{S_n |W|^2}\Big)\, df. $$

Page 48: Capacity for Communication Channels

48

Channel capacity with uncertainty I

Three problems can be defined.

Noise uncertainty:

$$ C_{NU} = \sup_{S_x \in A_1} \inf_{W \in A_2} \frac{1}{2}\int \log\Big(1 + \frac{S_x |H|^2}{S_n |W_{nom} + \Delta_2 W_2|^2}\Big)\, df. $$

Channel uncertainty:

$$ C_{CU} = \sup_{S_x \in A_1} \inf_{H \in A_3} \frac{1}{2}\int \log\Big(1 + \frac{S_x |H_{nom} + \Delta_1 W_1|^2}{S_n |W|^2}\Big)\, df. $$

Page 49: Capacity for Communication Channels

49

Channel capacity with uncertainty I

Channel-noise uncertainty:

$$ C_{CNU} = \sup_{S_x \in A_1} \inf_{W \in A_2} \inf_{H \in A_3} \frac{1}{2}\int \log\Big(1 + \frac{S_x |H_{nom} + \Delta_1 W_1|^2}{S_n |W_{nom} + \Delta_2 W_2|^2}\Big)\, df. $$

Page 50: Capacity for Communication Channels

50

Channel capacity with uncertainty I

Channel capacity with channel-noise uncertainty.

Theorem 1. Assume that

$$ \frac{\big(|H_{nom}| - |W_1|\big)^2}{S_n \big(|W_{nom}| + |W_2|\big)^2} $$

is bounded and integrable, and consider

$$ C_{CNU} = \sup_{S_x \in A_1} \inf_{W \in A_2} \inf_{H \in A_3} \frac{1}{2}\int \log\Big(1 + \frac{S_x |H_{nom} + \Delta_1 W_1|^2}{S_n |W_{nom} + \Delta_2 W_2|^2}\Big)\, df. $$

Page 51: Capacity for Communication Channels

51

Channel capacity with uncertainty I

The channel capacity is given parametrically by

$$ C_{CNU} = \frac{1}{2}\int \log\Big(\frac{\nu^* \big(|H_{nom}| - |W_1|\big)^2}{S_n \big(|W_{nom}| + |W_2|\big)^2}\Big)\, df, $$

$$ \int \Big(\nu^* - \frac{S_n \big(|W_{nom}| + |W_2|\big)^2}{\big(|H_{nom}| - |W_1|\big)^2}\Big)^+ df = P_x. $$

Page 52: Capacity for Communication Channels

52

Channel capacity with uncertainty I

Such that

$$ \nu^* - \frac{S_n \big(|W_{nom}| + |W_2|\big)^2}{\big(|H_{nom}| - |W_1|\big)^2} \ge 0, \qquad \nu^* > 0, $$

where ν* is related to the Lagrange multiplier and is obtained from the constraint equation. The infimum over the noise uncertainty is achieved at

$$ \Delta_2^*(f) = \exp\big( -j \arg(W_2(f)) + j \arg(W_{nom}(f)) \big), \qquad \|\Delta_2^*\|_\infty = 1. $$

Page 53: Capacity for Communication Channels

53

Channel capacity with uncertainty I

The mutual information rate after this minimization is given by

$$ \inf_{\|\Delta_2\|_\infty \le 1} \int \log\Big(1 + \frac{S_x |H|^2}{S_n |W_{nom} + \Delta_2 W_2|^2}\Big)\, df = \int \log\Big(1 + \frac{S_x |H|^2}{S_n \big(|W_{nom}| + |W_2|\big)^2}\Big)\, df. $$

Page 54: Capacity for Communication Channels

54

Channel capacity with uncertainty I

The infimum over the channel uncertainty is achieved at

$$ \Delta_1^*(f) = \exp\big( -j \arg(W_1(f)) + j \arg(H_{nom}(f)) + j\pi \big), \qquad \|\Delta_1^*\|_\infty = 1. $$

Page 55: Capacity for Communication Channels

55

Channel capacity with uncertainty I

The mutual information rate after the second minimization is given by

$$ \inf_{\|\Delta_1\|_\infty \le 1} \int \log\Big(1 + \frac{S_x |H_{nom} + \Delta_1 W_1|^2}{S_n \big(|W_{nom}| + |W_2|\big)^2}\Big)\, df = \int \log\Big(1 + \frac{S_x \big(|H_{nom}| - |W_1|\big)^2}{S_n \big(|W_{nom}| + |W_2|\big)^2}\Big)\, df. $$

Page 56: Capacity for Communication Channels

56

Channel capacity with uncertainty I

Maximization gives the water-filling equation

$$ S_x^* + \frac{S_n \big(|W_{nom}| + |W_2|\big)^2}{\big(|H_{nom}| - |W_1|\big)^2} = \nu^*. $$

Page 57: Capacity for Communication Channels

57

Channel capacity with uncertainty I

Water – filling

( )( )21

22

WH

WWSn

+

fBfBf−

psd

Page 58: Capacity for Communication Channels

58

Channel capacity with uncertainty I

Channel capacity with channel uncertainty only (W₂ = 0):

$$ C_{CU} = \frac{1}{2}\int \log\Big(\frac{\nu^* \big(|H_{nom}| - |W_1|\big)^2}{S_n |W_{nom}|^2}\Big)\, df, \qquad \int \Big(\nu^* - \frac{S_n |W_{nom}|^2}{\big(|H_{nom}| - |W_1|\big)^2}\Big)^+ df = P_x. $$
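The Theorem-1 formulas differ from classical water-filling only through the effective noise-to-gain ratio built from the uncertainty radii; a toy grid sketch (function name and numbers are mine, not the tutorial's):

```python
import numpy as np

def robust_waterfill(Hnom, W1, Wnom, W2, Sn, df, P):
    """Theorem-1-style robust water-filling on a frequency grid: the
    uncertainties enter only through the effective noise-to-gain ratio
        N(f) = Sn (|Wnom| + |W2|)^2 / (|Hnom| - |W1|)^2,
    after which ordinary water-filling applies. Returns (C in nats, S_x)."""
    gain = np.maximum(np.abs(Hnom) - np.abs(W1), 1e-12) ** 2
    N = Sn * (np.abs(Wnom) + np.abs(W2)) ** 2 / gain
    lo, hi = N.min(), N.min() + P / df         # bisection bracket for nu
    for _ in range(200):
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - N, 0.0).sum() * df > P:
            hi = nu
        else:
            lo = nu
    nu = 0.5 * (lo + hi)
    Sx = np.maximum(nu - N, 0.0)
    C = 0.5 * np.sum(np.log(np.maximum(nu / N, 1.0))) * df
    return C, Sx

# toy flat example: |Hnom| = 1, |W1| = 0.5, |Wnom| = 1, W2 = 0, Sn = 0.1
ones = np.ones(1000)
C, Sx = robust_waterfill(ones, 0.5 * ones, ones, np.zeros(1000), 0.1, 1e-3, 0.1)
```

As expected, shrinking the uncertainty radius W₁ to zero recovers a strictly larger (nominal) capacity.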

Page 59: Capacity for Communication Channels

59

Channel capacity with uncertainty I

Channel capacity with noise uncertainty only (W₁ = 0):

$$ C_{NU} = \frac{1}{2}\int \log\Big(\frac{\nu^* |H_{nom}|^2}{S_n \big(|W_{nom}| + |W_2|\big)^2}\Big)\, df, \qquad \int \Big(\nu^* - \frac{S_n \big(|W_{nom}| + |W_2|\big)^2}{|H_{nom}|^2}\Big)^+ df = P_x. $$

Page 60: Capacity for Communication Channels

60

Channel capacity with uncertainty II

Second approach.

Noise uncertainty:

$$ C_{NU} = \sup_{S_x \in A_1} \inf_{S_n \in A_4} \frac{1}{2}\int \log\Big(1 + \frac{S_x |H|^2}{S_n |W|^2}\Big)\, df. $$

Channel-noise uncertainty:

$$ C_{CNU} = \sup_{S_x \in A_1} \inf_{S_n \in A_4} \inf_{H \in A_3} \frac{1}{2}\int \log\Big(1 + \frac{S_x |H_{nom} + \Delta_1 W_1|^2}{S_n |W|^2}\Big)\, df. $$

Page 61: Capacity for Communication Channels

61

Channel capacity with uncertainty II

Channel capacity with channel-noise uncertainty.

Theorem 2. Assume that

$$ \frac{S_x |H|^2}{S_n |W|^2} $$

is bounded and integrable, define

$$ R = \frac{|H_{nom}| - |W_1|}{|W|}, $$

and consider

$$ C_{CNU} = \sup_{S_x \in A_1} \inf_{S_n \in A_4} \inf_{H \in A_3} \frac{1}{2}\int \log\Big(1 + \frac{S_x |H_{nom} + \Delta_1 W_1|^2}{S_n |W|^2}\Big)\, df. $$

Page 62: Capacity for Communication Channels

62

Channel capacity with uncertainty II

The channel capacity is given as

$$ C_{CNU} = \frac{1}{2}\int \log\Big(1 + \frac{\lambda_2^*}{\lambda_1^*} R^2\Big)\, df, \qquad R = \frac{|H_{nom}| - |W_1|}{|W|}, $$

where λ₁* > 0 and λ₂* > 0 are the Lagrange multipliers associated with the transmitter and noise power constraints. The stationarity conditions couple the optimal PSDs S_x* and S_n*, making S_n* proportional to S_x*.

Page 63: Capacity for Communication Channels

63

Channel capacity with uncertainty II

The integral constraints are satisfied with equality,

$$ \int S_x^* \, df = P_x, \qquad \int S_n^* \, df = P_n, $$

and the power spectral densities satisfy the water-filling relation

$$ S_n^* R^{-2} + S_x^* = \frac{1}{2\lambda_2^*}. $$

Page 64: Capacity for Communication Channels

64

Channel capacity with uncertainty II

The capacity for the noise uncertainty case is obtained by setting W₁ = 0.

Theorem 3. Assume that

$$ \frac{S_x |H|^2}{S_n |W|^2} $$

is bounded and integrable, and define the sets

$$ A_1 = \Big\{ S_x(f);\; \int S_x(f)\, df \le P_x \Big\}, \qquad A_4 = \Big\{ S_n(f);\; \int S_n(f)\, df \le P_n \Big\}. $$

Page 65: Capacity for Communication Channels

65

Channel capacity with uncertainty II

The lower value C⁻ of the pay-off function (mutual information rate, slide 40) is defined as

$$ C^- = C_{NU} = \sup_{S_x \in A_1} \inf_{S_n \in A_4} \frac{1}{2}\int \log\Big(1 + \frac{S_x |H|^2}{S_n |W|^2}\Big)\, df $$

and is given by Theorem 2. The upper value C⁺ is defined by

$$ C^+ = \inf_{S_n \in A_4} \sup_{S_x \in A_1} \frac{1}{2}\int \log\Big(1 + \frac{S_x |H|^2}{S_n |W|^2}\Big)\, df. $$

Page 66: Capacity for Communication Channels

66

Channel capacity with uncertainty II

The lower value C⁻ is equal to the upper value C⁺, implying that a saddle point exists
The optimal PSD of the noise is proportional to the optimal PSD of the transmitter, which can be interpreted as the players in the game trying to match each other

Page 67: Capacity for Communication Channels

67

Channel Coding Theorem

Define the frequency response of the equivalent channel

$$ G(f) = \Big( \frac{S_x |H|^2}{S_n |W|^2} \Big)^{1/2}, $$

with impulse response g(t), and ten sets:

$$ B_1 = \{ G;\; S_x \in A_1, \; W \in A_2, \; H \in A_3 \}, $$

Page 68: Capacity for Communication Channels

68

Channel Coding Theorem

$$ B_2 = \{ G;\; S_x \in A_1, \; H \in A_3, \; \Delta_2 = 0 \}, $$
$$ B_3 = \{ G;\; S_x \in A_1, \; W \in A_2, \; \Delta_1 = 0 \}, $$
$$ B_4 = \{ G;\; S_x \in A_1, \; S_n \in A_4, \; H \in A_3, \; \Delta_2 = 0 \}, $$
$$ B_5 = \{ G;\; S_x \in A_1, \; S_n \in A_4, \; \Delta_1 = 0, \; \Delta_2 = 0 \}, $$
$$ K_i = \{ g(t);\; G(f) \in B_i, \; g(t) \text{ satisfies } 1), 2), 3) \}, \quad i = 1, \dots, 5. $$

Page 69: Capacity for Communication Channels

69

Channel Coding Theorem

1) g(t) has finite duration δ
2) g(t) ∈ L₂
3) $$ \int_{-\infty}^{-A} |G(f)|^2\, df + \int_{A}^{+\infty} |G(f)|^2\, df \to 0 \quad \text{as } A \to +\infty $$

The sets K_i are conditionally compact.

Page 70: Capacity for Communication Channels

70

Channel Coding Theorem

A positive number R_i is called an attainable rate for the set of channels K_i if there exists a sequence of codes {(T_n, e^{T_n R_i}, ε_n)} such that ε_n → 0 uniformly over the set K_i as T_n → +∞.

Theorem 4. The operational capacities C_i (the suprema of all attainable rates R_i) for the sets of communication channels with uncertainties K_i, i = 1, ..., 5, are given by the corresponding formulas in Theorem 1 and Theorem 2.

Proof. Follows from [15] and [20] (see [11]).

Page 71: Capacity for Communication Channels

71

Example 1

Uncertain channel, white noise. Transfer function:

$$ H_{nom}(f) = \frac{\alpha/\beta}{j2\pi f/\beta + 1}, \qquad H(f) = \frac{\xi_p}{j2\pi f/\beta + 1}, \qquad \xi_p = \xi(1 + \delta\Delta_1), \quad 0 \le \delta < 1, \quad \xi = \alpha/\beta, $$

$$ |H(f) - H_{nom}(f)| \le \frac{\delta\xi}{|j2\pi f/\beta + 1|} = W_1(f). $$

Page 72: Capacity for Communication Channels

72

Example 1

Channel capacity. Water-filling against the effective noise-to-gain ratio N(ω) = (β² + ω²)/c, with ω = 2πf, two-sided noise psd N₀/2, and c = 2α²(1 − δ)²/N₀, gives

$$ S_x(f) = \begin{cases} \dfrac{\omega_B^2 - \omega^2}{c}, & |\omega| \le \omega_B, \\ 0, & \text{otherwise}, \end{cases} \qquad \omega_B = \Big(\frac{3\pi c P}{2}\Big)^{1/3}, $$

$$ C_{CU} = \frac{1}{\pi}\Big( \omega_B - \beta \tan^{-1}\frac{\omega_B}{\beta} \Big) \;\; \text{nats/s}. $$
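The closed-form water-filling for this first-order channel can be cross-checked by direct numerical integration on a frequency grid; a sketch with illustrative parameter values (α, β, δ, P, N₀ chosen in the range of the next slide):

```python
import math
import numpy as np

# Example-1 setup: first-order channel, white noise with two-sided psd N0/2.
# Effective noise-to-gain ratio N(w) = (beta^2 + w^2)/c, c = 2 alpha^2 (1-delta)^2 / N0.
alpha, beta, delta = 500.0, 1000.0, 0.1
N0, P = 1e-8, 1e-2
c = 2 * alpha**2 * (1 - delta)**2 / N0

wB = (3 * math.pi * c * P / 2) ** (1.0 / 3.0)            # band edge (rad/s)
C_closed = (wB - beta * math.atan(wB / beta)) / math.pi  # nats/s

# numeric check: integrate the water-filling solution over the active band
f = np.linspace(-wB / (2 * math.pi), wB / (2 * math.pi), 200001)
w = 2 * math.pi * f
nu = (beta**2 + wB**2) / c                               # water level
Nw = (beta**2 + w**2) / c
Sx = np.maximum(nu - Nw, 0.0)
df = f[1] - f[0]
P_num = Sx.sum() * df                                    # should recover P
C_num = 0.5 * np.sum(np.log(nu / Nw)) * df               # should match C_closed
```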

Page 73: Capacity for Communication Channels

73

Example 1

(Figure: C_CU versus the uncertainty size δ for P = 10⁻² W, N₀ = 10⁻⁸ W/Hz, β = 1000 rad/s, and α = 250, 500, 1000.)

Page 74: Capacity for Communication Channels

74

Example 2

Uncertain noise. Transfer function:

$$ H(f) = \frac{\omega_n^2}{(j2\pi f)^2 + 2\xi\omega_n (j2\pi f) + \omega_n^2}. $$

Noise uncertainty description:

$$ W(f) = \frac{\xi_p}{j2\pi f/\beta + 1}, \qquad \xi_p = \xi(1 + \delta\Delta_2), \quad 0 \le \delta < 1, \quad \xi = \alpha/\beta, \qquad W_2(f) = \frac{\delta\xi}{j2\pi f/\beta + 1}. $$

Page 75: Capacity for Communication Channels

75

Example 2

(Figure: capacity versus the uncertainty size δ for ω_n = 700, 1000, 1300 rad/s, with ξ = 0.01, α = 0.2, P = 1 W, β = 1000 rad/s.)

Page 76: Capacity for Communication Channels

76

Example 2

(Figure: optimal psd curves for δ = 0, 0.1, 0.2, with ξ = 0.01, α = 0.2, P = 1 W, β = 1000 rad/s, ω_n = 1300 rad/s.)

Page 77: Capacity for Communication Channels

77

Example 3

Uncertain channel, uncertain noise. The damping ratio ξ is uncertain:

$$ H(f) = \frac{\omega_n^2}{(j2\pi f)^2 + 2\xi\omega_n (j2\pi f) + \omega_n^2}. $$

The noise uncertainty is modeled as in Example 2.

Page 78: Capacity for Communication Channels

78

Example 3

Page 79: Capacity for Communication Channels

79

Example 3

Page 80: Capacity for Communication Channels

80

References

[1] Ahlswede, R., “The capacity of a channel with arbitrary varying Gaussian channel probability functions”, Trans. 6th Prague Conf. Information Theory, Statistical Decision Functions, and Random Processes, pp. 13-31, Sept. 1971.

[2] Baker, C. R., Chao, I.-F., “Information capacity of channels with partially unknown noise. I. Finite dimensional channels”, SIAM J. Appl. Math., vol. 56, no. 3, pp. 946-963, June 1996.

[3] Basar, T., “A complete characterization of minimax and maximin encoder-decoder policies for communication channels with incomplete statistical description”, IEEE Transactions on Information Theory, vol. 31, pp. 482-489, Jan. 1985.

Page 81: Capacity for Communication Channels

81

References

[4] Biglieri, E., Proakis, J., Shamai, S., “Fading channels: information-theoretic and communications aspects,” IEEE Transactions on Information Theory, vol. 44, no. 6, pp. 2619-2692, October, 1998.

[5] Blachman, N. M., “Communication as a game”, IRE Wescon 1957 Conference Record, vol. 2, pp. 61-66, 1957.

[6] Blackwell, D., Breiman, L., Thomasian, A. J., “The capacity of a class of channels”, Ann. Math. Stat., vol. 30, pp. 1229-1241, 1959.

[7] Charalambous, C. D., Denic, S. Z., Djouadi, S. M., “Robust capacity of white Gaussian noise channels with uncertainty”, accepted for the 43rd IEEE Conference on Decision and Control.

Page 82: Capacity for Communication Channels

82

References

[8] Csiszar, I., Korner, J., Information theory: Coding theorems for discrete memoryless systems. New York: Academic Press, 1981.

[9] Csiszar, I., Narayan P., “Capacity of the Gaussian arbitrary varying channels”, IEEE Transactions on Information Theory, vol. 37, no. 1, pp. 18-26, Jan., 1991.

[10] Denic, S. Z., Charalambous, C. D., Djouadi, S.M., “Capacity of Gaussian channels with noise uncertainty”, Proceedings of IEEE CCECE 2004, Canada.

[11] Denic, S.Z., Charalambous, C.D., Djouadi, S.M., “Robust capacity for additive colored Gaussian uncertain channels,” preprint.

Page 83: Capacity for Communication Channels

83

References

[12] Dobrushin, L. “Optimal information transmission through a channel with unknown parameters”, Radiotekhnika i Electronika, vol. 4, pp. 1951-1956, 1959.

[13] Doyle, J.C., Francis, B.A., Tannenbaum, A.R., Feedback control theory, New York: McMillan Publishing Company, 1992.

[14] Forys, L.J., Varaiya, P.P., “The ε-capacity of classes of unknown channels,” Information and control, vol. 44, pp. 376-406, 1969.

[15] Gallager, R. G., Information Theory and Reliable Communication. New York: Wiley, 1968.

Page 84: Capacity for Communication Channels

84

References

[16] Hughes, B., Narayan P., “Gaussian arbitrary varying channels”, IEEE Transactions on Information Theory, vol. 33, no. 2, pp. 267-284, Mar., 1987.

[17] Hughes, B., Narayan P., “The capacity of vector Gaussian arbitrary varying channel”, IEEE Transactions on Information Theory, vol. 34, no. 5, pp. 995-1003, Sep., 1988.

[18] Lapidoth, A., Narayan, P., “Reliable communication under channel uncertainty,” IEEE Transactions on Information Theory, vol. 44, no. 6, pp. 2148-2177, October, 1998.

[19] Medard, M., “Channel uncertainty in communications,” IEEE Information Theory Society Newsletters, vol. 53, no. 2, p. 1, pp. 10-12, June, 2003.

Page 85: Capacity for Communication Channels

85

References

[20] Root, W.L., Varaiya, P.P., “Capacity of classes of Gaussian channels,” SIAM J. Appl. Math., vol. 16, no. 6, pp. 1350-1353, November, 1968.

[21] Wolfowitz, J., Coding Theorems of Information Theory, Springer-Verlag, Berlin Heidelberg, 1978.

[22] McEliece, R. J., “Communications in the presence of jamming – an information theoretic approach”, in Secure Digital Communications, G. Longo, ed., Springer-Verlag, New York, 1983, pp. 127-166.

[23] Diggavi, S. N., Cover, T. M., “The worst additive noise under a covariance constraint”, IEEE Transactions on Information Theory, vol. 47, no. 7, pp. 3072-3081, November, 2001.

Page 86: Capacity for Communication Channels

86

References

[24] Vishwanath, S., Boyd, S., Goldsmith, A., “Worst-case capacity of Gaussian vector channels”, Proceedings of 2003 Canadian Workshop on Information Theory.

[25] Shannon, C. E., “A mathematical theory of communication”, Bell Sys. Tech. J., vol. 27, pp. 379-423, 623-656, July, Oct. 1948.