Information Rates for Two-Dimensional ISI Channels

Jiangxin Chen and Paul H. Siegel

Center for Magnetic Recording Research, University of California, San Diego

DIMACS Workshop, March 22-24, 2004



Outline

• Motivation: Two-dimensional recording

• Channel model

• Information rates

• Bounds on the Symmetric Information Rate (SIR)

• Upper bound

• Lower Bound

• Convergence

• Alternative upper bound

• Numerical results

• Conclusions


Two-Dimensional Channel Model

• Constrained input array

• Linear intersymbol interference

• Additive, i.i.d. Gaussian noise

y[i,j] = \sum_{k=0}^{l_1-1} \sum_{l=0}^{l_2-1} h[k,l] \, x[i-k, j-l] + n[i,j]

where x[i,j] is the input array, h[i,j] is the impulse response, and n[i,j] \sim \mathcal{N}(0, \sigma^2).
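As a concrete illustration, the channel model above can be simulated directly. The following Python sketch (ours, not from the talk) uses an explicit double sum over the taps so the correspondence to the equation is visible; the 2x2 averaging response from a later slide is used as an example.

```python
import numpy as np

def isi_channel_2d(x, h, sigma, rng=None):
    """y[i,j] = sum_{k,l} h[k,l] x[i-k, j-l] + n[i,j], with n[i,j]
    i.i.d. zero-mean Gaussian of variance sigma**2 ('full' 2-D convolution)."""
    rng = np.random.default_rng() if rng is None else rng
    m, n = x.shape
    p, q = h.shape
    y = np.zeros((m + p - 1, n + q - 1))
    for k in range(p):
        for l in range(q):
            # each tap h[k,l] shifts the input array by (k, l)
            y[k:k + m, l:l + n] += h[k, l] * x
    return y + sigma * rng.standard_normal(y.shape)

# Binary equiprobable inputs through the 2x2 averaging response:
h = 0.5 * np.ones((2, 2))
x = np.sign(np.random.default_rng(0).standard_normal((8, 8)))
y = isi_channel_2d(x, h, sigma=0.5)
```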


Two-Dimensional Processes

• Input process: X = \{X[i,j]\}

• Output process: Y = \{Y[i,j]\}

• Array Y_{i,j}^{i+m-1, j+n-1}:

upper left corner: Y[i,j]

lower right corner: Y[i+m-1, j+n-1]


Entropy Rates

• Output entropy rate:

H(Y) = \lim_{m,n \to \infty} \frac{1}{mn} H(Y_{1,1}^{m,n})

• Noise entropy rate:

H(N) = \frac{1}{2} \log(2\pi e \sigma^2)

• Conditional entropy rate:

H(Y|X) = \lim_{m,n \to \infty} \frac{1}{mn} H(Y_{1,1}^{m,n} \,|\, X_{1,1}^{m,n}) = H(N)
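The Gaussian noise entropy rate above is easy to evaluate numerically; here is a one-line Python check (our own, for reference):

```python
import math

def gaussian_entropy_rate(sigma2):
    """Differential entropy in nats of a N(0, sigma2) sample:
    (1/2) log(2 * pi * e * sigma2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

print(gaussian_entropy_rate(1.0))  # about 1.4189 nats per symbol
```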


Mutual Information Rates

• Mutual information rate:

I(X;Y) = H(Y) - H(Y|X) = H(Y) - H(N), where X = \{x[i,j]\}

• Capacity:

C = \max_{P_X} I(X;Y)

• Symmetric information rate (SIR): inputs are constrained to be independent, identically distributed, and equiprobable binary.


Capacity and SIR

• The capacity and SIR are useful measures of the achievable storage densities on the two-dimensional channel.

• They serve as performance benchmarks for channel coding and detection methods.

• So, it would be nice to be able to compute them.


Finding the Output Entropy Rate

• For the one-dimensional ISI channel model:

H(Y) = \lim_{n \to \infty} \frac{1}{n} H(Y_1^n)

and

H(Y_1^n) = E[-\log p(Y_1^n = y_1^n)]

where Y_1^n = (Y_1, Y_2, \ldots, Y_n).


Sample Entropy Rate

• If we simulate the channel N times, using inputs with specified (Markovian) statistics and generating output realizations

y^{(k)} = (y^{(k)}[1], y^{(k)}[2], \ldots, y^{(k)}[n]), \quad k = 1, 2, \ldots, N,

then

\frac{1}{N} \sum_{k=1}^{N} -\frac{1}{n} \log p(y^{(k)})

converges to \frac{1}{n} H(Y_1^n) with probability 1 as N \to \infty.


Computing Sample Entropy Rate

• The forward recursion of the sum-product (BCJR) algorithm can be used to calculate the probability of a sample realization of the channel output.

• In fact, we can write

-\frac{1}{n} \log p(y_1^n) = -\frac{1}{n} \sum_{i=1}^{n} \log p(y_i \,|\, y_1^{i-1})

where the quantity p(y_i \,|\, y_1^{i-1}) is precisely the normalization constant in the (normalized) forward recursion.
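To make this concrete, here is a short Python sketch of the normalized forward recursion for a one-dimensional binary-input channel with one symbol of memory; the dicode response h(D) = 1 - D is assumed purely for illustration. The running sum of the log normalizers yields -(1/n) log p(y_1^n).

```python
import numpy as np

def sample_entropy_rate(y, sigma, h=(1.0, -1.0)):
    """-(1/n) log p(y_1^n) for an i.i.d. equiprobable binary-input ISI
    channel with one symbol of memory, via the normalized BCJR forward
    recursion; each per-step normalizer equals p(y_i | y_1^{i-1})."""
    inputs = (-1.0, 1.0)
    alpha = np.array([0.5, 0.5])      # state = previous input symbol
    log_p = 0.0
    for yi in y:
        new_alpha = np.zeros(2)
        for s_prev, x_prev in enumerate(inputs):
            for s, x in enumerate(inputs):
                mean = h[0] * x + h[1] * x_prev
                pdf = np.exp(-(yi - mean) ** 2 / (2 * sigma ** 2)) / (
                    sigma * np.sqrt(2 * np.pi))
                new_alpha[s] += alpha[s_prev] * 0.5 * pdf
        norm = new_alpha.sum()        # normalization constant p(y_i | y_1^{i-1})
        log_p += np.log(norm)
        alpha = new_alpha / norm
    return -log_p / len(y)

# Estimate the output entropy rate (nats) of the dicode channel at sigma = 1:
rng = np.random.default_rng(1)
x = rng.choice([-1.0, 1.0], size=5001)
y = x[1:] - x[:-1] + rng.standard_normal(5000)
print(sample_entropy_rate(y, 1.0))
```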


Computing Entropy Rates

• The Shannon-McMillan-Breiman theorem implies

-\frac{1}{n} \log p(y_1^n) \xrightarrow{\text{a.s.}} H(Y)

as n \to \infty, where y_1^n is a single long sample realization of the channel output process.


SIR for Partial-Response Channels


Capacity Bounds for Dicode


Markovian Sufficiency

Remark: It can be shown that optimized Markovian input processes whose states are determined by their previous r symbols can asymptotically achieve the capacity of finite-state intersymbol interference channels with AWGN as the order r of the input process approaches \infty.

(J. Chen and P.H. Siegel, ISIT 2004)


Capacity and SIR in Two Dimensions

• In two dimensions, we could estimate H(Y) by calculating the sample entropy rate of a very large simulated output array.

• However, there is no counterpart of the BCJR algorithm in two dimensions to simplify the calculation.

• Instead, we use conditional entropies to derive upper and lower bounds on H(Y).


Array Ordering

• Permuted lexicographic ordering:

• Choose a vector k = (k_1, k_2), a permutation of (1, 2).

• Map each array index (t_1, t_2) to (t_{k_1}, t_{k_2}).

• Then (t_1, t_2) precedes (s_1, s_2) if t_{k_1} < s_{k_1}, or t_{k_1} = s_{k_1} and t_{k_2} < s_{k_2}.

• Therefore k = (1, 2) gives row-by-row ordering, and k = (2, 1) gives column-by-column ordering.
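The ordering amounts to an ordinary lexicographic comparison after permuting the coordinates; a small Python sketch (a hypothetical helper of ours, not from the slides):

```python
def precedes(t, s, k):
    """True if index t precedes index s under the permuted lexicographic
    ordering defined by k, a permutation of (1, 2)."""
    # Compare the permuted coordinate tuples; Python tuple comparison
    # is exactly lexicographic.
    return (t[k[0] - 1], t[k[1] - 1]) < (s[k[0] - 1], s[k[1] - 1])

# k = (1, 2): row-by-row; k = (2, 1): column-by-column.
print(precedes((0, 5), (1, 0), (1, 2)))  # True: row 0 comes before row 1
print(precedes((0, 5), (1, 0), (2, 1)))  # False: column 5 comes after column 0
```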


Two-Dimensional “Past”

• Let l = (l_1, l_2, l_3, l_4) be a non-negative vector.

• Define Past_{k,l}(Y[i,j]) to be the elements preceding Y[i,j] (with permutation k) inside the region

Y_{i-l_1,\, j-l_3}^{i+l_2,\, j+l_4}.


Examples of Past{Y[i,j]}


Conditional Entropies

• For a stationary two-dimensional random field Y on the integer lattice, the entropy rate satisfies

H(Y) = H(Y[i,j] \,|\, Past_{k,\infty}(Y[i,j])).

(The proof uses the entropy chain rule; see [5-6].)

• This extends to random fields on the hexagonal lattice, via the natural mapping to the integer lattice.


Upper Bound on H(Y)

• For a stationary two-dimensional random field Y,

H(Y) \le \min_{k,l} H_{k,l}^{U1}

where

H_{k,l}^{U1} = H(Y[i,j] \,|\, Past_{k,l}(Y[i,j])).


Two-Dimensional Boundary of Past{Y[i,j]}

• Define Strip_{k,l}(Y[i,j]) to be the boundary of Past_{k,l}(Y[i,j]).

• The exact expression for Strip_{k,l}(Y[i,j]) is messy, but the geometrical concept is simple.


Two-Dimensional Boundary of Past{Y[i,j]}


Lower Bound on H(Y)

• For a stationary two-dimensional hidden Markov field Y,

H(Y) \ge \max_{k,l} H_{k,l}^{L1}

where

H_{k,l}^{L1} = H(Y[i,j] \,|\, Past_{k,l}(Y[i,j]),\, X_{St_{k,l}(Y[i,j])})

and X_{St_{k,l}(Y[i,j])} is the "state information" for the strip Strip_{k,l}(Y[i,j]).


Sketch of Proof

• Upper bound:

Note that Past_{k,l}(Y[i,j]) \subseteq Past_{k,\infty}(Y[i,j]), and that conditioning reduces entropy.

• Lower bound:

Markov property of Y[i,j], given the "state information" X_{St_{k,l}(Y[i,j])}.


Convergence Properties

• The upper bound H_{k,l}^{U1} on the entropy rate is monotonically non-increasing as the size of the array defined by l = (l_1, l_2, l_3, l_4) increases.

• The lower bound H_{k,l}^{L1} on the entropy rate is monotonically non-decreasing as the size of the array defined by l = (l_1, l_2, l_3, l_4) increases.


Convergence Rate

• The upper bound H_{k,l}^{U1} and lower bound H_{k,l}^{L1} converge to the true entropy rate H(Y) at least as fast as O(1/l_{min}), where

l_{min} = \min(l_1, l_2, l_3) for column-by-column ordering k,

l_{min} = \min(l_1, l_3, l_4) for row-by-row ordering k.


Computing the SIR Bounds

• Estimate the two-dimensional conditional entropies over a small array.

• Calculate P(A,B) and P(B) to get H(A|B), for many realizations of the output array.

• For column-by-column ordering, treat each row Y_i as a variable and calculate the joint probability P(Y_1, Y_2, \ldots, Y_m) row-by-row using the BCJR forward recursion.
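The estimator in the second bullet amounts to averaging per-realization log-probabilities, H(A|B) ≈ mean(-log P(A,B)) - mean(-log P(B)). Below is a toy Python sanity check of this Monte Carlo identity on a discrete pair with known H(A|B); it is our own illustrative example, not the channel computation itself.

```python
import numpy as np

def conditional_entropy_mc(logp_joint, logp_marginal):
    """Monte Carlo estimate of H(A|B) = H(A,B) - H(B) from per-realization
    log-probabilities (e.g. produced by a BCJR forward recursion)."""
    return -np.mean(logp_joint) + np.mean(logp_marginal)

# Sanity check: B ~ Bernoulli(1/2), A = B xor E with E ~ Bernoulli(0.1),
# so H(A|B) = -0.1 log(0.1) - 0.9 log(0.9), about 0.325 nats.
rng = np.random.default_rng(0)
b = rng.integers(0, 2, 100_000)
e = rng.random(100_000) < 0.1
logp_b = np.full(b.shape, np.log(0.5))
logp_ab = np.log(0.5) + np.where(e, np.log(0.1), np.log(0.9))
print(conditional_entropy_mc(logp_ab, logp_b))
```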


2x2 Impulse Response

• "Worst-case" scenario with large ISI:

h_1[i,j] = \begin{bmatrix} 0.5 & 0.5 \\ 0.5 & 0.5 \end{bmatrix}

• Conditional entropies computed from 100,000 realizations.

• Upper bound: \min\{1,\; H_{(2,1),(7,7,3,0)}^{U1} - \frac{1}{2}\log(2\pi e \sigma^2)\}

• Lower bound: H_{(2,1),(7,7,3,0)}^{L1} - \frac{1}{2}\log(2\pi e \sigma^2)

(corresponds to the element in the middle of the last column)


Two-Dimensional “State”


SIR Bounds for 2x2 Channel


Computing the SIR Bounds

• The number of states for each variable increases exponentially with the number of columns in the array.

• This requires that the two-dimensional impulse response have a small support region.

• It is desirable to find other approaches to computing bounds that reduce the complexity, perhaps at the cost of weakening the resulting bounds.


Alternative Upper Bound

• The modified BCJR approach is limited to a small impulse response support region.

• Introduce an "auxiliary ISI channel" and bound

H(Y) \le H_{k,l}^{U2}

where

H_{k,l}^{U2} = -\int p(y[i,j], Past_{k,l}(y[i,j])) \log q(y[i,j] \,|\, Past_{k,l}(y[i,j])) \, dy

and q(y[i,j] \,|\, Past_{k,l}(y[i,j])) is an arbitrary conditional probability distribution.


Choosing the Auxiliary Channel

• Assume q(y[i,j] \,|\, Past_{k,l}(y[i,j])) is the conditional probability distribution of the output from an auxiliary ISI channel.

• A one-dimensional auxiliary channel permits a calculation based upon a larger number of columns in the output array.

• Conversion of the two-dimensional array into a one-dimensional sequence should "preserve" the statistical properties of the array.

• Pseudo-Peano-Hilbert space-filling curves can be used on a rectangular array to convert it to a sequence.
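One standard way to realize such a space-filling ordering is the classic Hilbert-curve index-to-coordinate map for n x n arrays with n a power of two. The sketch below is a generic textbook construction, not the specific pseudo-curve used in the talk; consecutive indices always land on adjacent array cells, which is the locality property the conversion relies on.

```python
def hilbert_order(n):
    """Visit order of an n x n array (n a power of 2) along a Hilbert
    curve: returns the list of (row, col) indices in curve order."""
    def d2xy(n, d):
        # Standard iterative conversion of curve index d to coordinates.
        x = y = 0
        t = d
        s = 1
        while s < n:
            rx = 1 & (t // 2)
            ry = 1 & (t ^ rx)
            if ry == 0:          # rotate the quadrant as needed
                if rx == 1:
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x
            x += s * rx
            y += s * ry
            t //= 4
            s *= 2
        return x, y
    return [d2xy(n, d) for d in range(n * n)]

print(hilbert_order(2))  # [(0, 0), (0, 1), (1, 1), (1, 0)]
```

Scanning the output array in this order, and feeding the resulting sequence to the one-dimensional auxiliary channel, gives the ordering used to evaluate the alternative upper bound.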


Pseudo-Peano-Hilbert Curve

[Figure: pseudo-Peano-Hilbert curve traversing Past_{k,l}(Y[i,j]) and Y[i,j]]


SIR Bounds for 2x2 Channel

[Plot: SIR bounds, with the alternative upper bounds indicated]


3x3 Impulse Response

• Two-DOS transfer function:

h_2[i,j] = \begin{bmatrix} 0 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 0 \end{bmatrix}

• Auxiliary one-dimensional ISI channel with memory length 4.

• Useful upper bound up to E_b/N_0 = 3 dB.


SIR Upper Bound for 3x3 Channel


Concluding Remarks

• Upper and lower bounds on the SIR of two-dimensional finite-state ISI channels were presented.

• Monte Carlo methods were used to compute the bounds for channels with small impulse response support region.

• Bounds can be extended to multi-dimensional ISI channels.

• Further work is required to develop computable, tighter bounds for general multi-dimensional ISI channels.


References

1. D. Arnold and H.-A. Loeliger, “On the information rate of binary-input channels with memory,” IEEE International Conference on Communications, Helsinki, Finland, June 2001, vol. 9, pp. 2692-2695.

2. H.D. Pfister, J.B. Soriaga, and P.H. Siegel, “On the achievable information rate of finite state ISI channels,” Proc. Globecom 2001, San Antonio, TX, November 2001, vol. 5, pp. 2992-2996.

3. V. Sharma and S.K. Singh, “Entropy and channel capacity in the regenerative setup with applications to Markov channels,” Proc. IEEE International Symposium on Information Theory, Washington, DC, June 2001, p. 283.

4. A. Kavcic, “On the capacity of Markov sources over noisy channels,” Proc. Globecom 2001, San Antonio, TX, November 2001, vol. 5, pp. 2997-3001.

5. D. Arnold, H.-A. Loeliger, and P.O. Vontobel, “Computation of information rates from finite-state source/channel models,” Proc. 40th Annual Allerton Conf. Commun., Control, and Computing, Monticello, IL, October 2002, pp. 457-466.


6. Y. Katznelson and B. Weiss, “Commuting measure-preserving transformations,” Israel J. Math., vol. 12, pp. 161-173, 1972.

7. D. Anastassiou and D.J. Sakrison, “Some results regarding the entropy rates of random fields,” IEEE Trans. Inform. Theory, vol. 28, no. 2, pp. 340-343, March 1982.