Introduction LING 572 Fei Xia Week 1: 1/3/06

Page 1: Introduction LING 572 Fei Xia Week 1: 1/3/06. Outline Course overview Problems and methods Mathematical foundation –Probability theory –Information theory

Introduction

LING 572

Fei Xia

Week 1: 1/3/06

Page 2:

Outline

• Course overview

• Problems and methods

• Mathematical foundation
  – Probability theory
  – Information theory

Page 3:

Course overview

Page 4:

Course objective

• Focus on statistical methods that produce state-of-the-art results

• Questions for each algorithm:
  – How does the algorithm work: input, output, steps?
  – What kinds of tasks can it be applied to?
  – How much data is needed?
    • Labeled data
    • Unlabeled data

Page 5:

General info

• Course website:
  – Syllabus (incl. slides and papers): updated every week
  – Message board
  – ESubmit

• Office hour: W: 3-5pm.

• Prerequisites:
  – Ling570 and Ling571
  – Programming: C, C++, or Java; Perl is a plus
  – Introduction to probability and statistics

Page 6:

Expectations

• Reading:
  – Papers are online: who doesn't have access to a printer?
  – Reference book: Manning & Schutze (MS)
  – Finish reading before class. Bring your questions to class.

• Grade:
  – Homework (3): 30%
  – Project (6 parts): 60%
  – Class participation: 10%
  – No quizzes or exams

Page 7:

Assignments

Hw1: FSA and HMM

Hw2: DT, DL, and TBL.

Hw3: Boosting

No coding. Bring the finished assignments to class.

Page 8:

Project

P1: Method 1 (baseline): Trigram
P2: Method 2: TBL
P3: Method 3: MaxEnt
P4: Method 4: choose one of four tasks
P5: Presentation
P6: Final report

Methods 1-3 are supervised methods. Method 4 is one of: bagging, boosting, semi-supervised learning, or system combination.

P1 is an individual task; P2-P6 are group tasks. A group should have no more than three people.

Use ESubmit. You will need to use others' code and write your own code.

Page 9:

Summary of Ling570

• Overview: corpora, evaluation
• Tokenization
• Morphological analysis
• POS tagging
• Shallow parsing
• N-grams and smoothing
• WSD
• NE tagging
• HMM

Page 10:

Summary of Ling571

• Parsing

• Semantics

• Discourse

• Dialogue

• Natural language generation (NLG)

• Machine translation (MT)

Page 11:

570/571 vs. 572

• 572 focuses more on statistical approaches.

• 570/571 are organized by tasks; 572 is organized by learning methods.

• I assume that you know
  – The basics of each task: POS tagging, parsing, …
  – The basic concepts: PCFG, entropy, …
  – Some learning methods: HMM, FSA, …

Page 12:

An example

• 570/571:
  – POS tagging: HMM
  – Parsing: PCFG
  – MT: Model 1-4 training

• 572:
  – HMM: forward-backward algorithm
  – PCFG: inside-outside algorithm
  – MT: EM algorithm

All are special cases of the EM algorithm, one method of unsupervised learning.

Page 13:

Course layout

• Supervised methods
  – Decision tree
  – Decision list
  – Transformation-based learning (TBL)
  – Bagging
  – Boosting
  – Maximum Entropy (MaxEnt)

Page 14:

Course layout (cont)

• Semi-supervised methods
  – Self-training
  – Co-training

• Unsupervised methods
  – EM algorithm
    • Forward-backward algorithm
    • Inside-outside algorithm
    • EM for PM models

Page 15:

Outline

• Course overview

• Problems and methods

• Mathematical foundation
  – Probability theory
  – Information theory

Page 16:

Problems and methods

Page 17:

Types of ML problems

• Classification problem
• Estimation problem
• Clustering
• Discovery
• …

A learning method can be applied to one or more types of ML problems.

We will focus on the classification problem.

Page 18:

Classification problem

• Given a set of classes and data x, decide which class x belongs to.

• Labeled data:
  – {(xi, yi)} is a set of labeled data.
  – xi is a list of attribute values.
  – yi is a member of a pre-defined set of classes.

Page 19:

Examples of classification problem

• Disambiguation:
  – Document classification
  – POS tagging
  – WSD
  – PP attachment given a set of other phrases

• Segmentation:
  – Tokenization / word segmentation
  – NP chunking

Page 20:

Learning methods

• Modeling: represent the problem as a formula and decompose the formula into a function of parameters

• Training stage: estimate the parameters

• Test (decoding) stage: find the answer given the parameters

Page 21:

Modeling

• Joint vs. conditional models:
  – P(data, model)
  – P(model | data)
  – P(data | model)

• Decomposition:
  – Which variable conditions on which variable?
  – What independence assumptions are made?

Page 22:

An example of different modeling

P(A,B,C) = P(A) * P(B|A) * P(C|A,B) = P(A) * P(B|A) * P(C|B)   (if C is independent of A given B)

P(A,B,C) = P(B) * P(C|B) * P(A|B,C) = P(B) * P(C|B) * P(A|B)   (if A is independent of C given B)

Page 23:

Training

• Objective functions:
  – Maximize likelihood
  – Minimize error rate
  – Maximum entropy
  – …

• Supervised, semi-supervised, unsupervised:
  – Ex: Maximize likelihood
    • Supervised: simple counting
    • Unsupervised: EM

λ_ML = argmax_λ P(data | λ)

Page 24:

Decoding

• DP algorithms:
  – CYK for PCFG
  – Viterbi for HMM
  – …

• Pruning:
  – TopN: keep the top N hypotheses at each node
  – Beam: keep hypotheses whose weights >= beam * max_weight
  – Threshold: keep hypotheses whose weights >= threshold
  – …
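The three pruning strategies can be sketched as follows; the hypothesis list and cutoff values are hypothetical, chosen only to illustrate the difference:

```python
# A hypothetical list of (label, weight) hypothesis pairs at one node.
hyps = [("a", 0.50), ("b", 0.30), ("c", 0.15), ("d", 0.05)]

def prune_topn(hyps, n):
    # TopN: keep the n highest-weighted hypotheses.
    return sorted(hyps, key=lambda h: h[1], reverse=True)[:n]

def prune_beam(hyps, beam):
    # Beam: keep hypotheses with weight >= beam * max_weight.
    max_w = max(w for _, w in hyps)
    return [(h, w) for h, w in hyps if w >= beam * max_w]

def prune_threshold(hyps, t):
    # Threshold: keep hypotheses with weight >= t.
    return [(h, w) for h, w in hyps if w >= t]

print(len(prune_topn(hyps, 2)))         # 2
print(len(prune_beam(hyps, 0.5)))       # 2 (cutoff is 0.5 * 0.50 = 0.25)
print(len(prune_threshold(hyps, 0.1)))  # 3
```

Note that beam pruning adapts its cutoff to the best hypothesis at each node, while threshold pruning uses a fixed absolute cutoff.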

Page 25:

Outline

• Course overview

• Problems and methods

• Mathematical foundation
  – Probability theory
  – Information theory

Page 26:

Probability Theory

Page 27:

Probability theory

• Sample space, event, event space

• Random variable and random vector

• Conditional probability, joint probability, marginal probability (prior)

Page 28:

Sample space, event, event space

• Sample space (Ω): a collection of basic outcomes.
  – Ex: toss a coin twice: {HH, HT, TH, TT}

• Event: an event is a subset of Ω.
  – Ex: {HT, TH}

• Event space (2^Ω): the set of all possible events.

Page 29:

Random variable

• The outcome of an experiment need not be a number.

• We often want to represent outcomes as numbers.

• A random variable is a function that associates a unique numerical value with every outcome of an experiment.

• A random variable is a function X: Ω → R.
• Ex: toss a coin once: X(H)=1, X(T)=0

Page 30:

Two types of random variable

• Discrete random variable: X takes on only a countable number of distinct values.
  – Ex: Toss a coin 10 times. X is the number of tails that are noted.

• Continuous random variable: X takes on an uncountable number of possible values.
  – Ex: X is the lifetime (in hours) of a light bulb.

Page 31:

Probability function

• The probability function of a discrete random variable X gives the probability p(xi) that the random variable equals xi; that is, p(xi) = P(X = xi).

0 <= p(xi) <= 1

Σ_i p(xi) = 1

Page 32:

Random vector

• A random vector is a finite-dimensional vector of random variables: X = [X1, …, Xn].

• P(x) = P(x1, x2, …, xn) = P(X1=x1, …, Xn=xn)

• Ex: P(w1, …, wn, t1, …, tn)

Page 33:

Three types of probability

• Joint prob: P(x,y)= prob of x and y happening together

• Conditional prob: P(x|y) = prob of x given a specific value of y

• Marginal prob: P(x) = prob of x, summed over all possible values of y

Page 34:

Common equations

P(B|A) = P(A,B) / P(A)

P(A,B) = P(A) * P(B|A) = P(B) * P(A|B)

P(A) = Σ_B P(A,B)
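These identities (conditional, product, and marginal rules) can be checked on a small toy joint distribution; the events and values below are invented purely for illustration:

```python
# Toy joint distribution over (A, B); values invented for illustration.
joint = {("rain", "wet"): 0.3, ("rain", "dry"): 0.1,
         ("sun", "wet"): 0.1, ("sun", "dry"): 0.5}

def p_a(a):
    # Marginal rule: P(A) = sum over B of P(A, B)
    return sum(p for (x, _), p in joint.items() if x == a)

def p_b_given_a(b, a):
    # Conditional rule: P(B|A) = P(A, B) / P(A)
    return joint[(a, b)] / p_a(a)

# Product rule: P(A, B) = P(A) * P(B|A)
lhs = joint[("rain", "wet")]
rhs = p_a("rain") * p_b_given_a("wet", "rain")
print(abs(lhs - rhs) < 1e-12)  # True
```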

Page 35:

More general cases

P(A1, A2, …, An) = P(A1) * Π_{i=2}^{n} P(Ai | A1, …, Ai-1)

P(A1) = Σ_{A2, …, An} P(A1, A2, …, An)
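The chain rule can be checked numerically on a toy joint distribution over three binary variables (values invented for illustration):

```python
# Toy joint distribution over three binary variables (values invented).
joint = {(0, 0, 0): 0.10, (0, 0, 1): 0.05, (0, 1, 0): 0.15, (0, 1, 1): 0.20,
         (1, 0, 0): 0.05, (1, 0, 1): 0.10, (1, 1, 0): 0.05, (1, 1, 1): 0.30}

def marg(prefix):
    # Marginal prob of a fixed prefix (a1, ..., ak): sum over the rest.
    return sum(p for abc, p in joint.items() if abc[:len(prefix)] == prefix)

# Chain rule: P(a1,a2,a3) = P(a1) * P(a2|a1) * P(a3|a1,a2)
a = (1, 1, 0)
lhs = joint[a]
rhs = marg(a[:1]) * (marg(a[:2]) / marg(a[:1])) * (joint[a] / marg(a[:2]))
print(abs(lhs - rhs) < 1e-12)  # True
```

The product telescopes: each conditional's denominator cancels the previous factor, which is exactly why the chain rule holds for any joint distribution.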

Page 36:

Information Theory

Page 37:

Information theory

• It is the use of probability theory to quantify and measure “information”.

• Basic concepts:
  – Entropy
  – Joint entropy and conditional entropy
  – Cross entropy and relative entropy
  – Mutual information and perplexity

Page 38:

Entropy

• Entropy is a measure of the uncertainty associated with a distribution.

• It gives a lower bound on the number of bits it takes to transmit messages.

• An example:
  – Display the results of horse races.
  – Goal: minimize the number of bits to encode the results.

H(X) = -Σ_x p(x) log p(x)

Page 39:

An example

• Uniform distribution: pi=1/8.

• Non-uniform distribution: (1/2,1/4,1/8, 1/16, 1/64, 1/64, 1/64, 1/64)

H(X) = -8 * (1/8 * log2(1/8)) = 3 bits

H(X) = -(1/2 log 1/2 + 1/4 log 1/4 + 1/8 log 1/8 + 1/16 log 1/16 + 4 * 1/64 log 1/64) = 2 bits

A corresponding code: (0, 10, 110, 1110, 111100, 111101, 111110, 111111)
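The two entropy computations above can be reproduced in a few lines of Python:

```python
from math import log2

def entropy(probs):
    # H(X) = -sum over x of p(x) * log2 p(x); terms with p(x)=0 contribute 0.
    return -sum(p * log2(p) for p in probs if p > 0)

uniform = [1/8] * 8
skewed = [1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]

print(entropy(uniform))  # 3.0
print(entropy(skewed))   # 2.0
```

With the skewed distribution, a good variable-length code (like the one listed above) needs only 2 bits per result on average instead of 3.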

Page 40:

Entropy of a language

• The entropy of a language L:

  H(L) = -lim_{n→∞} (1/n) Σ_{x_1n} p(x_1n) log p(x_1n)

• If we make certain assumptions that the language is "nice", then the entropy can be calculated as:

  H(L) = -lim_{n→∞} (1/n) log p(x_1n)

Page 41:

Joint and conditional entropy

• Joint entropy:

  H(X,Y) = -Σ_x Σ_y p(x,y) log p(x,y)

• Conditional entropy:

  H(Y|X) = -Σ_x Σ_y p(x,y) log p(y|x) = H(X,Y) - H(X)
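The chain-rule identity H(X,Y) = H(X) + H(Y|X) can be checked numerically on a toy joint distribution (values invented for illustration):

```python
from math import log2

# Toy joint distribution over (X, Y); values invented for illustration.
joint = {("rain", "wet"): 0.3, ("rain", "dry"): 0.1,
         ("sun", "wet"): 0.1, ("sun", "dry"): 0.5}

# Marginal distribution of X.
px = {}
for (x, _), p in joint.items():
    px[x] = px.get(x, 0.0) + p

H_xy = -sum(p * log2(p) for p in joint.values())  # H(X,Y)
H_x = -sum(p * log2(p) for p in px.values())      # H(X)
H_y_given_x = -sum(p * log2(p / px[x])            # H(Y|X)
                   for (x, _), p in joint.items())

print(abs(H_xy - (H_x + H_y_given_x)) < 1e-9)  # True
```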

Page 42:

Cross Entropy

• Entropy:

• Cross Entropy:

• Cross entropy is a distance measure between p(x) and q(x): p(x) is the true probability; q(x) is our estimate of p(x).

H(X) = -Σ_x p(x) log p(x)

H_c(X) = -Σ_x p(x) log q(x)

H_c(X) >= H(X)
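A quick numerical check that cross entropy upper-bounds entropy; the true distribution p and estimate q below are invented for illustration:

```python
from math import log2

p = [0.5, 0.25, 0.125, 0.125]  # "true" distribution (invented)
q = [0.25, 0.25, 0.25, 0.25]   # our estimate of p

H = -sum(pi * log2(pi) for pi in p)               # entropy of p
Hc = -sum(pi * log2(qi) for pi, qi in zip(p, q))  # cross entropy of p under q

print(H, Hc)  # 1.75 2.0
```

The gap Hc - H is exactly the KL distance between p and q, so it vanishes only when q matches p.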

Page 43:

Cross entropy of a language

• The cross entropy of a language L:

• If we make certain assumptions that the language is “nice”, then the cross entropy can be calculated as:

H(L,q) = -lim_{n→∞} (1/n) Σ_{x_1n} p(x_1n) log q(x_1n)

H(L,q) = -lim_{n→∞} (1/n) log q(x_1n)

Page 44:

Relative Entropy

• Also called Kullback-Leibler distance:

• Another distance measure between prob functions p and q.

• KL distance is asymmetric (not a true distance):

KL(p||q) = Σ_x p(x) log2 (p(x) / q(x)) = H_c(X) - H(X)

KL(p||q) ≠ KL(q||p)
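The asymmetry can be seen numerically; the two distributions below are invented for illustration:

```python
from math import log2

def kl(p, q):
    # KL(p||q) = sum over x of p(x) * log2(p(x)/q(x)); 0*log(0/q) taken as 0.
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.9, 0.1]
q = [0.5, 0.5]

# Both directions are non-negative, but they generally differ.
print(kl(p, q) >= 0, kl(q, p) >= 0, kl(p, q) != kl(q, p))  # True True True
```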

Page 45:

Relative entropy is non-negative

Using log z <= z - 1 for z > 0:

-KL(p||q) = Σ_x p(x) log (q(x) / p(x))
          <= Σ_x p(x) ((q(x) / p(x)) - 1)
          = Σ_x q(x) - Σ_x p(x)
          = 0

Hence KL(p||q) >= 0.

Page 46:

Mutual information

• It measures how much is in common between X and Y:

• I(X;Y)=KL(p(x,y)||p(x)p(y))

I(X;Y) = Σ_x Σ_y p(x,y) log ( p(x,y) / (p(x) p(y)) )
       = H(X) + H(Y) - H(X,Y)
       = I(Y;X)
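Both formulas for I(X;Y) can be checked on a toy joint distribution (values invented for illustration):

```python
from math import log2

# Toy joint distribution over (X, Y); values invented for illustration.
joint = {("rain", "wet"): 0.3, ("rain", "dry"): 0.1,
         ("sun", "wet"): 0.1, ("sun", "dry"): 0.5}

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Definition: I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x)p(y)) )
mi = sum(p * log2(p / (px[x] * py[y])) for (x, y), p in joint.items())

# Alternative form: I(X;Y) = H(X) + H(Y) - H(X,Y)
H = lambda probs: -sum(p * log2(p) for p in probs)
mi2 = H(px.values()) + H(py.values()) - H(joint.values())

print(mi > 0, abs(mi - mi2) < 1e-9)  # True True
```

Here mi > 0 because X and Y are dependent; it would be exactly 0 if the joint factored as p(x)p(y).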

Page 47:

Perplexity

• Perplexity is 2^H.

• Perplexity is the weighted average number of choices a random variable has to make.

Page 48:

Summary

• Course overview

• Problems and methods

• Mathematical foundation
  – Probability theory
  – Information theory

M&S Ch2

Page 49:

Next time

• FSA

• HMM: M&S Ch 9.1 and 9.2