
An Invitation to Research in Information Theory

Andrew Thangaraj

Department of Electrical Engineering, Indian Institute of Technology Madras

Email: [email protected]

Joint work with Rahul Vaze (TIFR Mumbai), Radhakrishna Ganti, Arijit Mondal (IIT Madras)

Thanks to Gerhard Kramer, Georg Böcherer (TU Munich), Navin Kashyap (IISc Bangalore)

October 10, 2016


Outline of Talk

1. Information Theory for Communication Systems
   - Background, current status and research
   - Noisy runlength constrained channels
   - 1-bit quantized ISI channels
2. Online Algorithms for Basestation Allocation
   - Secretary problem
   - Basestation allocation and graph matching
   - Performance studies and comparison
3. Information Theory Research in India
   - Activities
   - IIT Madras


Communications, Information Theory and Coding Theory

[Block diagram: message → encoder → modem (bits to signal) → channel (impairments, noise) → modem (signal to bits) → decoder → message. The channel capacity is a function of SNR.]

Metrics: rate, signal-to-noise ratio (SNR), and Prob{decoded bits ≠ transmitted bits}.

- Information theory: model the channel and compute its capacity
  - If Rate < Capacity, Prob{decoded bits ≠ transmitted bits} can be made as small as desired
- Coding theory: encoders and decoders to approach capacity


Research in Information Theory

- Point-to-point communications: most capacity problems solved; performance close to capacity; channels with memory open
- Multiterminal communications: most problems are open; practical systems far from limits
- Emerging areas beyond communications: models and limits of learning; models and limits for storage


Runlength Constraint: No Consecutive 1s

Constrained binary sequence: a sequence of bits with no consecutive 1s.

- Extension to (d, k)-constrained sequences used in recording channels (Marcus et al. '98)
- Represented as a walk in a state diagram

[State diagram: two states s0 and s1 with edges labeled 0 and 1; a 1 moves from s0 to s1, a 0 returns to s0, and no 1 is allowed out of s1.]


How many such sequences?

A_N = number of length-N binary vectors with no consecutive 1s

- N = 2: 00, 10, 01, so A_2 = 3
- N = 3: 000, 001, 010, 100, 101, so A_3 = 5
- N = 4: 0000, 0001, 0010, 0100, 0101, 1000, 1001, 1010, so A_4 = 8

For large N,

A_N ≈ ((√5 + 1)/2)^N, so (1/N) log2 A_N → log2((√5 + 1)/2) ≈ 0.694.

What happens when such a sequence is transmitted across a channel?
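The count A_N satisfies the Fibonacci recurrence A_N = A_{N−1} + A_{N−2}: a valid string ends in 0 (append 0 to any valid length-(N−1) string) or in 01 (append 01 to any valid length-(N−2) string). A minimal Python sketch checking the table above and the growth rate:

```python
import math

def count_no_consecutive_ones(N):
    """Number of length-N binary strings with no consecutive 1s.

    A valid string ends in 0 (any valid length-(N-1) string plus '0')
    or in 01 (any valid length-(N-2) string plus '01'), giving the
    Fibonacci recurrence A_N = A_{N-1} + A_{N-2} with A_1 = 2, A_2 = 3.
    """
    a, b = 2, 3  # A_1, A_2
    if N == 1:
        return a
    for _ in range(N - 2):
        a, b = b, a + b
    return b

phi = (math.sqrt(5) + 1) / 2  # golden ratio: the growth rate
for N in [2, 3, 4, 100]:
    A = count_no_consecutive_ones(N)
    print(N, A, math.log2(A) / N)  # last column tends to log2(phi)
print("log2(phi) =", math.log2(phi))  # ~0.6942
```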



Some Prior Work

- Number of constrained sequences
  - C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., 1948.
- Lower bounds for the constrained BSC
  - E. Zehavi and J. K. Wolf, "On runlength codes," IEEE Trans. Inf. Theory, 1988.
- Lower and upper bounds
  - D. M. Arnold, H.-A. Loeliger, P. O. Vontobel, A. Kavcic, and W. Zeng, "Simulation-based computation of information rates for channels with memory," IEEE Trans. Inf. Theory, 2006.
  - P. Vontobel and D. Arnold, "An upper bound on the capacity of channels with memory and constraint input," IEEE ITW 2001.
  - Y. Li and G. Han, "Input-constrained erasure channels: Mutual information and capacity," IEEE ISIT 2014.
- Feedback capacity for the erasure channel (which serves as an upper bound)
  - O. Sabag, H. H. Permuter, and N. Kashyap, "The feedback capacity of the binary erasure channel with a no-consecutive-ones input constraint," IEEE Trans. Inf. Theory, Jan. 2016.


Binary Erasure Channel (BEC) Model

[Channel diagram for BEC(ε): input bits, output symbols in {0, 1, ?}; 0 → 0 and 1 → 1 with probability 1 − ε, and either input is erased to ? with probability ε.]

- Every bit gets erased independently with probability ε
- Capacity: 1 − ε bits per channel use (Elias '54)
- A simple model that captures the essence of the problem


Noisy Constrained Channels

Setup: X^N (no consecutive 1s) → BEC(ε) → Y^N

C(ε), the capacity of the constrained channel, is

C(ε) = lim_{N→∞} (1/N) max_{p_{X^N} : X^N ∈ X^N_{d,k}} I(X^N; Y^N),

where X^N_{d,k} denotes the set of (d, k)-constrained length-N sequences.

The capacity of constrained noisy channels has proven difficult to characterize or bound, even for simple channels.

[A. Thangaraj, "Dual Capacity Upper Bounds for Noisy Runlength Constrained Channels," presented at IEEE Information Theory Workshop 2016]

Dual bound:

C(ε) ≤ (1 − ε)² log2(1/β) + ε(1 − ε) log2(2 − β), where β ∈ (0, 1] solves β^{2(1−ε)} = 1 − β.
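The dual bound is easy to evaluate numerically: in β^{2(1−ε)} = 1 − β the left side increases in β while the right side decreases, so the root in (0, 1] is unique and bisection finds it. A small sketch, assuming only the formula on the slide:

```python
import math

def dual_bound_bec(eps, tol=1e-12):
    """Dual capacity upper bound for BEC(eps) with a no-consecutive-1s
    input constraint, using the formula from the slide."""
    lo, hi = 1e-12, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # Root of beta^(2(1-eps)) = 1 - beta: if the left side is still
        # below the right side, beta is too small.
        if mid ** (2 * (1 - eps)) < 1 - mid:
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2
    return (1 - eps) ** 2 * math.log2(1 / beta) + eps * (1 - eps) * math.log2(2 - beta)

for eps in [0.0, 0.2, 0.5, 0.8]:
    print(eps, round(dual_bound_bec(eps), 4))
```

As a sanity check, at ε = 0 the equation becomes β² = 1 − β, so β = (√5 − 1)/2 and the bound equals log2((√5 + 1)/2) ≈ 0.694, the noiseless constrained capacity from the counting slide.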



Plot: Binary Erasure Channel

[Plot: capacity bounds versus ε for the BEC with no-consecutive-1s input. Curves: dual bound (µ = 1), dual bound (µ = 2), feedback bound, achievable. Dual bound: this work; feedback bound: Sabag et al. '16; achievable: Arnold et al. '06.]


Remarks

- Main technique: the dual capacity upper bound
- The dual capacity bound is useful in scenarios where characterizing the exact capacity is difficult.
  - Related work: A. Thangaraj, G. Kramer, and G. Böcherer, "Capacity upper bounds for discrete-time amplitude-constrained AWGN channels," IEEE ISIT 2015, https://arxiv.org/abs/1511.08742
- In my work, the dual capacity bound was used for noisy constrained channels
  - (d, k)-constrained binary symmetric channels and binary-input Gaussian channels
- Future work
  - Dual bounds for other channels with memory, such as the intersymbol interference (ISI) channel
- Additional details: https://arxiv.org/abs/1609.00189


ISI Channel with 1-bit Quantized Output

Channel h = [h_0, h_1, ..., h_{L−1}] with continuous input X_n ∈ R:

Y_n = h_0 X_n + h_1 X_{n−1} + · · · + h_{L−1} X_{n−L+1}, R_n = Y_n + Z_n, Z_n ~ N(0, σ²),

and the receiver observes only the 1-bit quantized output Q(R_n), where Q(x) = +1 for x ≥ 0 and −1 otherwise.

- Continuous input, 1-bit quantized output (why this model?)
- Average power constraint: E[‖X‖²] ≤ NP
- Problem: how to signal?
  - ISI channel with continuous output: inner equalizer + outer code
  - ISI channel with 1-bit output: how to equalize?


Prior Work vs This Work

- Why ISI with continuous input and quantized output?
  - Millimeter-wave [Sun et al. '14], SERDES [Harwood et al. '07]
- Prior work
  - ISI with continuous input/output: [Hirt-Massey '88], [Shamai-Ozarow-Wyner '91], [Shamai-Laroia '96]
  - Non-ISI, quantized output: [Singh-Dabeer-Madhow '09]
  - ISI, quantizer optimized at the receiver: [Zeitler-Singer-Kramer '12]
  - Millimeter wave, 1-bit quantized output: [Mo-Heath '14]
- This work: equalizers (transmitter block precoding) for ISI channels with continuous input and 1-bit quantized output


Threshold Precoder and Noisefree Approximation

δ-threshold precoding for the 1-bit quantized ISI channel: choose inputs {X_n}, 0 ≤ n < N, satisfying the threshold constraint

|Y_n| = |Σ_{i=0}^{L−1} h_i X_{n−i}| ≥ δ.

Noisefree approximation of the threshold-precoded channel: replace the noisy quantized output Q(R_n) = Q(Y_n + Z_n) by S_n = Q(Y_n).

Approximation error: P(S_n ≠ Q(R_n)) ≤ Q(δ/σ), where Q(·) on the right denotes the Gaussian tail function.
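The approximation error bound is just a Gaussian tail: with |Y_n| ≥ δ, the quantized output disagrees with the noisefree sign only if the noise pushes R_n = Y_n + Z_n across zero, which happens with probability at most Q(δ/σ) = P(Z > δ) for Z ~ N(0, σ²). A quick Monte Carlo sanity check (δ and σ are arbitrary illustrative choices):

```python
import math
import random

def gaussian_q(x):
    """Gaussian tail function Q(x) = P(Z > x), Z ~ N(0, 1)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

delta, sigma = 1.0, 0.5
random.seed(1)
trials, errors = 200_000, 0
for _ in range(trials):
    # Worst case for the bound: noisefree output sits exactly at the threshold.
    y = delta if random.random() < 0.5 else -delta
    r = y + random.gauss(0.0, sigma)
    if (r >= 0) != (y >= 0):  # quantizer output differs from noisefree sign
        errors += 1
print("empirical:", errors / trials, "bound Q(delta/sigma):", gaussian_q(delta / sigma))
```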



Signaling Method: Precoder + Outer Code

Precoder + outer code for the noisy channel:

[Block diagram: message → outer encoder, rate R_out ≈ 1 − H2(Q(δ/σ)) → threshold precoder with E[‖X‖²] ≤ NP and |Y_n| ≥ δ, inner rate R_in = H(S^N)/N → channel h → adder with Z_n ~ N(0, σ²) → quantizer Q(·) → outer decoder.]

[R. Ganti, A. Thangaraj and A. Mondal, "Approximation of Capacity for ISI Channels with One-bit Output Quantization," IEEE ISIT 2015]
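Combining the rates: the precoded noisefree channel carries R_in = H(S^N)/N bits per use, and the outer code must clean up the residual sign flips, which occur with probability at most Q(δ/σ); treating those flips as a binary symmetric channel gives the outer rate R_out ≈ 1 − H2(Q(δ/σ)) on the slide. A sketch of how quickly R_out approaches 1 as δ/σ grows (the δ/σ values are illustrative):

```python
import math

def gaussian_q(x):
    """Gaussian tail function Q(x) = P(Z > x), Z ~ N(0, 1)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def h2(p):
    """Binary entropy function in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for ratio in [1.0, 2.0, 3.0, 4.0]:
    p = gaussian_q(ratio)  # residual flip probability
    print(f"delta/sigma = {ratio}: flip prob {p:.5f}, R_out ~ {1 - h2(p):.4f}")
```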


Numerical Example I: Channel [1, ε]

[Plot: bits per channel use versus normalized energy P/δ². Curves: Gibbs and ZF Markov, for ε = 0.2 and ε = 0.8.]


Numerical Example II: Channel [−0.3, 1, 0.6]

[Plot: bits per channel use versus normalized energy P/δ². Curves: Gibbs, ZF Markov.]


Remarks

- Signaling method for ISI channels with 1-bit output
  - Precoder + outer error control code
- Design of the precoder
  - Threshold precoding
  - Use of the noisefree approximation with Gibbs sampling: optimal
- Achievable schemes
  - Zero-forcing Markov: a good practical method
- Future
  - Several applications in 5G and mmWave
  - Extensions to MIMO


The Secretary Problem: “Online” Interview

Goal: select the candidate with the best score or weight.

- (C1) Interview one at a time
- (C2) Decide online: select or reject
- (C3) A rejected candidate cannot be called back
- (C4) No more interviews after selection

[Figure: n candidates interviewed in order; w_1, ..., w_{i−1} rejected (✗), w_i selected (✓), ending the interview.]

Central question: how good is online compared to offline?

An offline interview always hires the applicant with the maximum score. Can an online interview do the same?


Model for Candidates and Order of Interview

[Figure: n candidates interviewed in the order w_{i_1}, w_{i_2}, ..., w_{i_n}.]

- Fixed n and weights w_1, w_2, ..., w_n
- The order of interview is uniformly random: (i_1, i_2, ..., i_n) is a uniformly random permutation of {1, 2, ..., n}
- Offline (knows the permutation) vs online

In how many of the n! permutations can an online interview select the candidate with maximum weight?


Online Algorithm A(r) for the Secretary Problem

[Figure: the first r candidates are rejected; among the rest, the first candidate with weight ≥ T is selected.]

- Reject the first r candidates
- Find the best weight so far: T = max(w_{i_1}, w_{i_2}, ..., w_{i_r})
- From the subsequent candidates, select the first one with weight ≥ T
- If there is none, select the last candidate
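A(r) is a few lines of code, and its roughly 37% success rate at r = n/e is easy to reproduce empirically. A minimal simulation sketch (the weights are arbitrary distinct numbers, since only their relative order matters):

```python
import math
import random

def secretary_Ar(weights, r):
    """Run A(r): reject the first r candidates, then select the first
    one whose weight is >= the best of the first r; if none qualifies,
    select the last candidate. Returns the selected index."""
    threshold = max(weights[:r])
    for i in range(r, len(weights)):
        if weights[i] >= threshold:
            return i
    return len(weights) - 1

random.seed(0)
n = 100
r = round(n / math.e)
trials, wins = 100_000, 0
for _ in range(trials):
    w = list(range(n))  # distinct weights; only ranks matter
    random.shuffle(w)   # uniformly random interview order
    if w[secretary_Ar(w, r)] == n - 1:  # selected the maximum?
        wins += 1
print("success rate:", wins / trials, " 1/e =", 1 / math.e)
```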


Offline versus Online A(r)

- Offline algorithm
  - Knows the entire sequence of weights ahead of time: w_{i_1}, w_{i_2}, ..., w_{i_n}
  - Suppose the m-th candidate has the maximum weight w_{i_m}
  - Reject candidates 1 to m − 1, select candidate m
- Online algorithm A(r)
  - Surprise! In about 37% (≈ 1/e) of the cases, A(r) selects candidate m
  - Best choice for r: r = n/e
  - The fraction of successes is independent of n


Analysis of Online Algorithm A(r)

- Max-weight candidate among the first r positions: never selected
- Max-weight candidate at position r + 1: always selected
- Max-weight candidate at position r + j: selected exactly when the best of the first r + j − 1 candidates falls within the first r, which happens with probability r/(r + j − 1)


Probability Calculation

Pr{selecting the top applicant}
= Σ_{j=1}^{n−r} Pr{top applicant at position r + j} · Pr{best of the first r + j − 1 falls within the first r}
= Σ_{j=1}^{n−r} (1/n) · r/(r + j − 1)
= (r/n) Σ_{i=r}^{n−1} 1/i
≈ (r/n) log(n/r),

which is maximized by choosing r = n/e, giving 1/e ≈ 0.37.
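The exact finite-n expression (r/n) Σ_{i=r}^{n−1} 1/i can also be evaluated and maximized directly, confirming both the optimal r ≈ n/e and the limiting value 1/e. A short sketch:

```python
import math

def success_prob(n, r):
    """Exact P{A(r) selects the top applicant} = (r/n) * sum_{i=r}^{n-1} 1/i."""
    return (r / n) * sum(1.0 / i for i in range(r, n))

n = 100
best_r = max(range(1, n), key=lambda r: success_prob(n, r))
print("best r:", best_r, " n/e ~", n / math.e)
print("max success prob:", success_prob(n, best_r), " 1/e =", 1 / math.e)
```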


Basestation Allocation Problem

[Figure: bipartite graph of m = 4 basestations and n = 9 users; user i has the same weight w_i on its edges to all basestations.]

Basestation allocation: allot each of n users to one of m basestations; w_i is the weight of user i to all basestations.


Basestation Allocation Problem

1 2

3 4

basestation

user

1

w1 w1

w1 w1

2

w2w2

w2 w2

34

5

6

7

89

Basestation allocation

Allot n users to one out of m basestationswi : weight of user i to all basestations

Andrew Thangaraj (IIT Madras) Research in Information Theory October 10, 2016 28 / 44

Page 60: An Invitation to Research in Information Theorysilver.nitt.edu/~esgopi/Guestlecture/2016-10-10-NITTrichyResearchIT.pdf · An Invitation to Research in Information Theory Andrew Thangaraj

Time-sharing Utility at the Basestation

1 2

3 4

basestation (BS)

user

1

2

34

5

6

7

8 9

w1

w2

w1/2

w2/2

w1/3

w2/3

w7/3

w3/3w4/3

w5/3

w6/2

w8/2 w9

Mj : users allotted to BS j

dj = |Mj |

Time-sharing utility: Total rate from all basestations

Andrew Thangaraj (IIT Madras) Research in Information Theory October 10, 2016 29 / 44

Page 61: An Invitation to Research in Information Theorysilver.nitt.edu/~esgopi/Guestlecture/2016-10-10-NITTrichyResearchIT.pdf · An Invitation to Research in Information Theory Andrew Thangaraj

Time-sharing Utility at the Basestation

1 2

3 4

basestation (BS)

user

1

2

34

5

6

7

8 9

w1

w2

w1/2

w2/2

w1/3

w2/3

w7/3

w3/3w4/3

w5/3

w6/2

w8/2 w9

Mj : users allotted to BS j

dj = |Mj |

Time-sharing utility: Total rate from all basestations

Andrew Thangaraj (IIT Madras) Research in Information Theory October 10, 2016 29 / 44

Page 62: An Invitation to Research in Information Theorysilver.nitt.edu/~esgopi/Guestlecture/2016-10-10-NITTrichyResearchIT.pdf · An Invitation to Research in Information Theory Andrew Thangaraj

Time-sharing Utility at the Basestation

1 2

3 4

basestation (BS)

user

1

2

34

5

6

7

8 9

w1

w2

w1/2

w2/2

w1/3

w2/3

w7/3

w3/3w4/3

w5/3

w6/2

w8/2 w9

Mj : users allotted to BS j

dj = |Mj |

Time-sharing utility: Total rate from all basestations

Andrew Thangaraj (IIT Madras) Research in Information Theory October 10, 2016 29 / 44

Page 63: An Invitation to Research in Information Theorysilver.nitt.edu/~esgopi/Guestlecture/2016-10-10-NITTrichyResearchIT.pdf · An Invitation to Research in Information Theory Andrew Thangaraj

Time-sharing Utility at the Basestation

1 2

3 4

basestation (BS)

user

1

2

34

5

6

7

8 9

w1

w2

w1/2

w2/2

w1/3

w2/3

w7/3

w3/3w4/3

w5/3

w6/2

w8/2 w9

Mj : users allotted to BS j

dj = |Mj |

Time-sharing utility: Total rate from all basestations

Andrew Thangaraj (IIT Madras) Research in Information Theory October 10, 2016 29 / 44

Page 64: An Invitation to Research in Information Theorysilver.nitt.edu/~esgopi/Guestlecture/2016-10-10-NITTrichyResearchIT.pdf · An Invitation to Research in Information Theory Andrew Thangaraj

Time-sharing Utility at the Basestation

1 2

3 4

basestation (BS)

user

1

2

34

5

6

7

8 9

w1

w2

w1/2

w2/2

w1/3

w2/3

w7/3

w3/3w4/3

w5/3

w6/2

w8/2 w9

Mj : users allotted to BS j

dj = |Mj |

Time-sharing utility: Total rate from all basestations

Andrew Thangaraj (IIT Madras) Research in Information Theory October 10, 2016 29 / 44

Page 65: An Invitation to Research in Information Theorysilver.nitt.edu/~esgopi/Guestlecture/2016-10-10-NITTrichyResearchIT.pdf · An Invitation to Research in Information Theory Andrew Thangaraj

Time-sharing Utility at the Basestation

1 2

3 4

basestation (BS)

user

1

2

34

5

6

7

8 9

w1

w2

w1/2

w2/2

w1/3

w2/3

w7/3

w3/3w4/3

w5/3

w6/2

w8/2 w9

Mj : users allotted to BS j

dj = |Mj |

Time-sharing utility: Total rate from all basestations

w1 + w2 + w7

3+

w3 + w4 + w5

3+

w6 + w8

2+ w9

Andrew Thangaraj (IIT Madras) Research in Information Theory October 10, 2016 29 / 44

Page 66: An Invitation to Research in Information Theorysilver.nitt.edu/~esgopi/Guestlecture/2016-10-10-NITTrichyResearchIT.pdf · An Invitation to Research in Information Theory Andrew Thangaraj

Time-sharing Utility at the Basestation

1 2

3 4

basestation (BS)

user

1

2

34

5

6

7

8 9

w1

w2

w1/2

w2/2

w1/3

w2/3

w7/3

w3/3w4/3

w5/3

w6/2

w8/2 w9

Mj : users allotted to BS j

dj = |Mj |

Time-sharing utility: Total rate from all basestations(∑i∈M1

wi

d1

)+

(∑i∈M2

wi

d2

)+ · · ·

Andrew Thangaraj (IIT Madras) Research in Information Theory October 10, 2016 29 / 44


Basestation Allocation: Offline vs Online

Goal of allocation: find M_1, M_2, ... maximizing the time-sharing utility

(Σ_{i∈M_1} w_i)/d_1 + (Σ_{i∈M_2} w_i)/d_2 + · · ·

- Offline
  - The sequence of user weights {w_i} is known before arrival
  - What is the best offline algorithm?
- Online
  - The weight w_i of the i-th user is revealed at time i
  - User i must be allotted a basestation at time i
  - The allotment of user i cannot be changed later

Main question: fix n and weights w_1, w_2, ..., w_n. Users arrive in uniformly random order. How good are online algorithms compared to offline?


Optimal Offline Algorithm

How to maximize the time-sharing utility (Σ_{i∈M_1} w_i)/d_1 + (Σ_{i∈M_2} w_i)/d_2 + · · · ?

Intuition: allot high-weight users to basestations with low d.

Optimal offline allocation of n users to m basestations:

- Find the top m − 1 users
- Allot them individually to m − 1 basestations
- Assign the other n − m + 1 users to basestation m

Assuming w_1 > w_2 > · · · > w_n and m = 4,

Best utility = w_1 + w_2 + w_3 + (w_4 + · · · + w_n)/(n − 3).

More details and proof: http://arxiv.org/abs/1308.1212
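In code, the offline optimum is one sort: dedicate a basestation to each of the top m − 1 users and let everyone else share the last one. A minimal sketch (assuming n > m):

```python
def offline_optimal_utility(weights, m):
    """Time-sharing utility of the optimal offline allocation: the top
    m-1 users get dedicated basestations, and the remaining n-m+1
    users share basestation m."""
    w = sorted(weights, reverse=True)
    return sum(w[:m - 1]) + sum(w[m - 1:]) / len(w[m - 1:])

# Example with m = 4 and weights 9 > 8 > ... > 1:
print(offline_optimal_utility([9, 8, 7, 6, 5, 4, 3, 2, 1], 4))
# = 9 + 8 + 7 + (6+5+4+3+2+1)/6 = 27.5
```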


From Secretary Problem to Basestation Allocation

Optimal offline allocation for m = 2 basestations:

- Find the max-weight user
- Assign the max-weight user to basestation 1
- Assign the other n − 1 users to basestation 2

So for m = 2 the offline optimum amounts to identifying the max-weight user, and the secretary problem is exactly the task of doing this online.

Online algorithm for m = 2 basestations:

- Run the algorithm A(r) for the secretary problem to select the max-weight user for basestation 1 (change: do not stop after selection)
- Allot all rejected users to basestation 2


Online Algorithm Am(r) for m Basestations

Algorithm Am(r): extension of A(r) to find the top m − 1 users.

1. Allocate the first r test users to basestation m, and compute the (m − 1)-th best weight, denoted T, among the first r users.
2. For i > r, if user i's weight is ≤ T, allocate user i to basestation m.
3. For i > r, if user i's weight is > T, allocate user i to basestations 1 through m − 1 in round-robin fashion, and update T to the (m − 1)-th best weight seen so far.

How good is Am(r)? For 2 ≤ m ≤ 20, the optimized r ≈ 0.22n, and

Expected (Online utility / Offline utility) ≥ 0.46.
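A compact simulation of Am(r) against the offline optimum, following the slide's description. The weight distribution and parameters below are illustrative choices; with uniform weights the observed ratio sits comfortably above the worst-case ≥ 0.46 guarantee:

```python
import random

def utility(groups):
    """Time-sharing utility: sum over basestations of (sum of weights) / (# users)."""
    return sum(sum(g) / len(g) for g in groups if g)

def am_r(weights, m, r):
    """Online algorithm Am(r): the first r users go to basestation m;
    afterwards a user beating the (m-1)-th best weight T seen so far
    goes round-robin to basestations 1..m-1, everyone else to m."""
    groups = [[] for _ in range(m)]
    groups[m - 1].extend(weights[:r])
    top = sorted(weights[:r], reverse=True)[:m - 1]  # m-1 best test weights
    nxt = 0  # round-robin pointer over basestations 1..m-1
    for w in weights[r:]:
        if len(top) == m - 1 and w <= top[-1]:
            groups[m - 1].append(w)  # not above T: dump on basestation m
        else:
            groups[nxt].append(w)    # above T: round-robin allocation
            nxt = (nxt + 1) % (m - 1)
            top = sorted(top + [w], reverse=True)[:m - 1]  # update T
    return groups

def offline_optimal(weights, m):
    w = sorted(weights, reverse=True)
    return sum(w[:m - 1]) + sum(w[m - 1:]) / len(w[m - 1:])

random.seed(0)
n, m = 1000, 10
w = [random.random() for _ in range(n)]
ratios = []
for _ in range(200):
    random.shuffle(w)  # uniformly random arrival order
    ratios.append(utility(am_r(w, m, round(0.22 * n))) / offline_optimal(w, m))
print("mean online/offline utility ratio:", sum(ratios) / len(ratios))
```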


Sketch of Computations

- Basestation m: used for dumping "bad" users; basestations 1 through m − 1 get the "good" users
- Main computation: Pr{S = d}, where S is the number of good users,

P(S = d) = Σ_{r+1 ≤ i_1 < i_2 < · · · < i_d ≤ n} ∏_{i ∈ {i_1,...,i_d}} ((m − 1)/i) · ∏_{r+1 ≤ i ≤ n, i ∉ {i_1,...,i_d}} (1 − (m − 1)/i)

→ (r/n)^{m−1} (1/d!) ((m − 1) log_e(n/r))^d

- A fact about permutations that is useful
  - π: uniformly random permutation; b_i = |{j : 1 ≤ j ≤ i, π(j) ≥ π(i)}|
  - b_i ~ unif{1, 2, ..., i}, independent of b_{i′} for i′ ≠ i
  - More details in http://arxiv.org/abs/1308.1212


More Practical Scenario: Arbitrary Weights

[Figure: bipartite graph of 4 basestations and 9 users; user 6 has distinct weights w_61, w_62, w_63, w_64 to the four basestations.]

Arbitrary weights: each user has a possibly different weight to each basestation: w_ij, 1 ≤ j ≤ m.


Max Weight Matching Problem

- Bipartite graph G: edge e = (i, j) has weight w_e = w_ij
- Matching M in G: a set of non-intersecting edges
  - Weight of a matching = sum of the weights of its edges

[Figure: bipartite graph with 9 users and 4 basestations; a matching is highlighted.]

Max Weight Matching Problem: find the matching with maximum weight. Just as the secretary problem handles the equal-weights case, online max weight matching is used for basestation allocation in the arbitrary-weights case.


Arbitrary Weights: Online Algorithm

Idea: a randomized online allocation algorithm using an online bipartite graph matching algorithm AM.

HideAndSeek Algorithm

- Let j_0 ∈ {1, 2, ..., m} be chosen uniformly at random; G_{−j_0} denotes G with basestation j_0 deleted.
- Use the online algorithm AM to find the max-weight matching in the graph G_{−j_0}; denote this matching AM(G_{−j_0}).
- Allocate the users in the matching AM(G_{−j_0}) to the m − 1 basestations other than j_0, while all other users are allocated to basestation j_0 itself.
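A sketch of the wrapper logic only. The 8-competitive online matching algorithm AM from the paper is not reproduced here; `online_greedy_matching` below is a hypothetical stand-in showing how HideAndSeek delegates matched users to AM and routes everyone else to the hidden basestation j0:

```python
import random

def online_greedy_matching(arrivals, stations):
    """Hypothetical stand-in for the online matching algorithm AM: each
    arriving user takes its heaviest still-free basestation, if any.
    (Not the 8-competitive algorithm used in the actual analysis.)"""
    free = set(stations)
    match = {}
    for user, edge_weights in arrivals:  # edge_weights: {station: w_ij}
        candidates = [j for j in edge_weights if j in free]
        if candidates:
            j = max(candidates, key=lambda s: edge_weights[s])
            match[user] = j
            free.discard(j)
    return match

def hide_and_seek(arrivals, m):
    """HideAndSeek: hide a uniformly random basestation j0, run the
    online matching on the other m-1 basestations, and allocate every
    unmatched user to j0."""
    j0 = random.randrange(m)
    stations = [j for j in range(m) if j != j0]
    pruned = [(u, {j: w for j, w in ew.items() if j != j0}) for u, ew in arrivals]
    match = online_greedy_matching(pruned, stations)
    return {u: match.get(u, j0) for u, _ in arrivals}
```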


Arbitrary Weights: Online versus Offline

Theorem. With randomized input, the HideAndSeek algorithm has

Competitive ratio ≜ Expected (Offline utility / Online utility) ≤ 8m/(m − 1).

Proof. Since j_0 is chosen uniformly at random,

E[weight(AM(G_{−j_0}))] ≥ ((m − 1)/m) E[weight(AM(G))].

Since AM is within a factor 1/8 of the offline algorithm, the result follows.


Simulation Results: Secretary-problem-based Algorithm

[Plot: competitive ratio versus number of users (1000 to 10000) for m = 10 identical basestations. Curves: r = n/5, r = n/3, r = n/e, max-weight, max-weight with correlated users, K-sec with correlated users.]


Simulation Results: HideAndSeek versus Max-weight

[Plot: competitive ratio and upper bound versus number of users (1000 to 10000) for m = 10 basestations. Curves: HideAndSeek with correlated users, max-weight with correlated users, max-weight under the i.i.d. model, HideAndSeek under the i.i.d. model.]


Summary Remarks

- Online algorithms
  - Good candidates for practical use
  - Constant competitive ratio under random user arrivals
  - Robust to statistical misspecification
- Max-weight algorithms versus HideAndSeek
  - Max-weight association has an average-case competitive ratio of n (can be shown)
  - The competitive ratio of the HideAndSeek algorithm is 8m/(m − 1)
  - In practice, under correlated user weights, HideAndSeek outperforms max-weight
- More details: http://arxiv.org/abs/1308.1212
- Extension: K. Thekumparampil, A. Thangaraj and R. Vaze, "Combinatorial Resource Allocation Using Sub-modularity of Waterfilling," IEEE Transactions on Wireless Communications, vol. 15, no. 1, pp. 206-216, January 2016.


Conferences and Schools

- National Conference on Communications (NCC)
  - Premier annual communications conference in the country
  - NCC 2017 is at IIT Madras: http://www.ncc2017.org/, submission deadline Oct 31, 2016
- Joint Telematics Group / IEEE Information Theory Society Summer School
  - Held annually at IIT Madras, IISc Bangalore or IIT Bombay
  - 8-hour courses on two or three topics
  - 2016 edition: http://www.ece.iisc.ernet.in/~jtg/2016/
- Signal Processing and Communications Conference (SPCOM)
  - Held once in two years at IISc Bangalore
  - This year's conference website: http://ece.iisc.ernet.in/~spcom/2016/
- International Conference on Communication Systems and Networks (COMSNETS): http://www.comsnets.org/
- Premier international conferences in information theory
  - IEEE International Symposium on Information Theory
  - IEEE Information Theory Workshop


Opportunities at IIT Madras

- MS/PhD programmes
  - https://research.iitm.ac.in/
  - Deadline for Jan-May 2017: October 24
  - Final-year UG and PG students can apply
- Summer fellowships
  - https://sfp.iitm.ac.in/
- MoU between NITT and IITM
  - Facilitates student exchange and joint research
  - Top 10% of 3rd-year undergraduate students can apply for the direct PhD programme at the end of the 3rd year; if admitted, they can spend the final year at IIT Madras and then continue on to the PhD