
Page 1

CSE 5331/7331 F'07 © Prentice Hall

CSE 5331/7331
Fall 2007

Machine Learning

Margaret H. Dunham
Department of Computer Science and Engineering
Southern Methodist University

Some slides extracted from Data Mining, Introductory and Advanced Topics, Prentice Hall, 2002. Other slides from CS 545 at Colorado State University, Chuck Anderson.

Page 2

Table of Contents

Introduction (Chuck Anderson)
Statistical Machine Learning Examples
– Estimation
– EM
– Bayes Theorem
Decision Tree Learning
Neural Network Learning

Page 3

The slides in this introductory section are from

CS545: Machine Learning
By Chuck Anderson
Department of Computer Science
Colorado State University
Fall 2006

Page 4

What is Machine Learning?

Statistics ≈ the science of inference from data

Machine learning ≈ multivariate statistics + computational statistics

Multivariate statistics ≈ prediction of values of a function assumed to underlie a multivariate dataset

Computational statistics ≈ computational methods for statistical problems (aka statistical computation) + statistical methods which happen to be computationally intensive

Data Mining ≈ exploratory data analysis, particularly with massive/complex datasets

Page 5

Kinds of Learning

Learning algorithms are often categorized according to the amount of information provided:

Least Information:
– Unsupervised learning is the most exploratory. Requires only samples of inputs; must find regularities.

More Information:
– Reinforcement learning is the most recent. Requires samples of inputs, actions, and rewards or punishments.

Most Information:
– Supervised learning is the most common. Requires samples of inputs and desired outputs.

Page 6

Examples of Algorithms

Supervised learning
– Regression
  » multivariate regression
  » neural networks and kernel methods
– Classification
  » linear and quadratic discriminant analysis
  » k-nearest neighbors
  » neural networks and kernel methods

Reinforcement learning
– multivariate regression
– neural networks

Unsupervised learning
– principal components analysis
– k-means clustering
– self-organizing networks

Pages 7–11

(These slides contained figures only; the figures are not captured in this transcript.)

Table of ContentsTable of Contents

Introduction (Chuck Anderson)Introduction (Chuck Anderson) Statistical Machine Learning ExamplesStatistical Machine Learning Examples

– EstimationEstimation– EMEM– Bayes TheoremBayes Theorem

Decision Tree LearningDecision Tree Learning Neural Network LearningNeural Network Learning

Page 12

Point Estimation

Point Estimate: estimate a population parameter.
May be made by calculating the parameter for a sample.
May be used to predict a value for missing data.

Ex:
– R contains 100 employees
– 99 have salary information
– Mean salary of these is $50,000
– Use $50,000 as the value of the remaining employee's salary.

Is this a good idea?
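The salary example can be sketched in a few lines of Python. The specific numbers here are hypothetical, chosen so the sample mean comes out to $50,000:

```python
# Point estimate of a mean, used to impute a missing value
# (hypothetical salaries for 5 of 6 employees; numbers chosen so the mean is $50,000).
salaries = [48_000, 52_000, 50_000, 47_000, 53_000]

# Point estimate of the population mean: the sample mean of the known values.
mean_estimate = sum(salaries) / len(salaries)   # 50000.0

# Use the estimate as the value of the remaining employee's salary.
imputed = salaries + [mean_estimate]
```

Whether this is a good idea depends on the data: mean imputation ignores skew and understates variability, which is exactly the question the slide raises.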

Page 13

Estimation Error

Bias: difference between the expected value and the actual value.

Mean Squared Error (MSE): expected value of the squared difference between the estimate and the actual value:

MSE(θ̂) = E[(θ̂ − θ)²]

Why square?

Root Mean Square Error (RMSE): the square root of the MSE.
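These error measures can be approximated by simulation. This sketch assumes Gaussian data with a known true mean and studies the sample-mean estimator (everything here is illustrative, not from the slides):

```python
import random

random.seed(0)  # reproducible simulation

# Estimator under study: the sample mean of n = 5 draws from N(theta, sigma^2).
theta, sigma, n = 10.0, 2.0, 5

estimates = []
for _ in range(10_000):
    sample = [random.gauss(theta, sigma) for _ in range(n)]
    estimates.append(sum(sample) / n)

expected = sum(estimates) / len(estimates)
bias = expected - theta                                          # near 0: the sample mean is unbiased
mse = sum((e - theta) ** 2 for e in estimates) / len(estimates)  # E[(estimate - theta)^2]
rmse = mse ** 0.5                                                # near sigma / sqrt(n) ≈ 0.894
```

Squaring penalizes large errors heavily and makes the error measure differentiable, which is part of the answer to "Why square?".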

Page 14

Jackknife Estimate

Jackknife Estimate: an estimate of a parameter obtained by omitting one value at a time from the set of observed values.

Ex: estimate of the mean for X = {x1, …, xn}: omitting xi gives θ̂(i) = (Σj≠i xj) / (n − 1).
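A minimal sketch of the jackknife for the mean, with hypothetical data values:

```python
# Jackknife estimates of the mean for X = {x1, ..., xn}: each estimate omits one value.
X = [4.0, 7.0, 1.0, 9.0, 5.0]   # hypothetical observations
n = len(X)
total = sum(X)

# Leave-one-out estimates: (sum of the other n-1 values) / (n - 1).
loo_means = [(total - x) / (n - 1) for x in X]

# Averaging the n leave-one-out estimates gives the jackknife estimate;
# for the mean it coincides with the plain sample mean (5.2 here).
jackknife_mean = sum(loo_means) / n
```

The spread of the leave-one-out estimates is what makes the jackknife useful for assessing bias and variance of an estimator, not just its value.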

Page 15

Maximum Likelihood Estimate (MLE)

Obtain parameter estimates that maximize the probability that the sample data occurs for the specific model.

The joint probability of observing the sample data is obtained by multiplying the individual probabilities. Likelihood function:

L(Θ | x1, …, xn) = ∏i f(xi | Θ)

Maximize L.

Page 16

MLE Example

Coin tossed five times: {H,H,H,H,T}

Assuming a fair coin with H and T equally likely, the likelihood of this sequence is:

L = (1/2)^5 = 0.03125

However, if the probability of an H is 0.8, then:

L = (0.8)^4 (0.2) = 0.08192

Page 17

MLE Example (cont'd)

General likelihood formula (xi = 1 for H, 0 for T):

L(p | x1, …, xn) = p^(Σ xi) (1 − p)^(n − Σ xi)

The estimate for p is then 4/5 = 0.8.
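The coin-toss likelihood can be checked numerically. This sketch searches a grid of p values rather than using calculus, and recovers the MLE p = 0.8:

```python
# Likelihood of {H,H,H,H,T} as a function of p = P(H): L(p) = p^4 * (1 - p).
def likelihood(p, heads=4, tails=1):
    return p ** heads * (1 - p) ** tails

fair = likelihood(0.5)     # 0.03125
biased = likelihood(0.8)   # ~0.08192, higher than the fair-coin likelihood

# Grid search for the maximizer; the MLE is heads / (heads + tails) = 0.8.
grid = [i / 1000 for i in range(1001)]
p_mle = max(grid, key=likelihood)
```

In practice one maximizes the log-likelihood analytically; the grid search just makes the "maximize L" step concrete.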

Page 18

Expectation-Maximization (EM)

Solves estimation problems with incomplete data.
Obtain initial estimates for the parameters.
Iteratively use the estimates for the missing data and continue until convergence.
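A toy sketch of this iterate-until-convergence idea for incomplete data (hypothetical values; the full EM algorithm is more general, but the loop structure is the same):

```python
# EM-style iteration on a toy problem: estimate the mean of 6 values
# when one value is missing (values are hypothetical).
observed = [45.0, 50.0, 55.0, 48.0, 52.0]  # 5 observed values
n = 6                                       # total count, including 1 missing value

mu = 0.0  # deliberately poor initial estimate
for _ in range(50):
    filled = observed + [mu]   # E-step: fill the missing value with the current estimate
    mu = sum(filled) / n       # M-step: re-estimate the parameter from the completed data

# Converges to sum(observed) / (n - 1) = 250 / 5 = 50.
```

Each pass uses the current parameter estimate to complete the data, then re-estimates the parameter from the completed data, exactly the E/M alternation described above.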

Page 19

EM Example

Page 20

EM Algorithm

Page 21

Bayes Theorem

Posterior Probability: P(h1 | xi)
Prior Probability: P(h1)

Bayes Theorem:

P(hj | xi) = P(xi | hj) P(hj) / P(xi)

Assign probabilities of hypotheses given a data value.

Page 22

Bayes Theorem Example

Credit authorizations (hypotheses): h1 = authorize purchase, h2 = authorize after further identification, h3 = do not authorize, h4 = do not authorize but contact police.

Assign twelve data values for all combinations of credit and income:

Income:      1    2    3    4
Excellent   x1   x2   x3   x4
Good        x5   x6   x7   x8
Bad         x9   x10  x11  x12

From training data: P(h1) = 60%; P(h2) = 20%; P(h3) = 10%; P(h4) = 10%.

Page 23

Bayes Example (cont'd)

Training Data:

ID  Income  Credit     Class  xi
 1    4     Excellent   h1    x4
 2    3     Good        h1    x7
 3    2     Excellent   h1    x2
 4    3     Good        h1    x7
 5    4     Good        h1    x8
 6    2     Excellent   h1    x2
 7    3     Bad         h2    x11
 8    2     Bad         h2    x10
 9    3     Bad         h3    x11
10    1     Bad         h4    x9

Page 24

Bayes Example (cont'd)

Calculate P(xi | hj) and P(xi).

Ex: P(x7|h1) = 2/6; P(x4|h1) = 1/6; P(x2|h1) = 2/6; P(x8|h1) = 1/6; P(xi|h1) = 0 for all other xi.

Predict the class for x4:
– Calculate P(hj | x4) for all hj.
– Place x4 in the class with the largest value.
– Ex:
  » P(h1|x4) = (P(x4|h1) P(h1)) / P(x4) = (1/6)(0.6)/0.1 = 1.
  » x4 is in class h1.
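The posterior computation can be reproduced directly from the training table (the (value, class) pairs below are transcribed from the slides):

```python
from collections import Counter

# Training pairs (data value, class) transcribed from the slides' table.
train = [("x4", "h1"), ("x7", "h1"), ("x2", "h1"), ("x7", "h1"), ("x8", "h1"),
         ("x2", "h1"), ("x11", "h2"), ("x10", "h2"), ("x11", "h3"), ("x9", "h4")]

# Priors from the slides.
prior = {"h1": 0.6, "h2": 0.2, "h3": 0.1, "h4": 0.1}

per_class = Counter(h for _, h in train)   # count of tuples in each class
joint = Counter(train)                     # count of each (x, h) pair

def likelihood(x, h):
    # P(x | h) estimated from the training data, e.g. P(x4|h1) = 1/6.
    return joint[(x, h)] / per_class[h]

p_x4 = sum(1 for x, _ in train if x == "x4") / len(train)  # P(x4) = 1/10

# Posterior P(h | x4) via Bayes theorem; classify x4 into the largest.
posterior = {h: likelihood("x4", h) * prior[h] / p_x4 for h in prior}
best = max(posterior, key=posterior.get)   # h1, with posterior (1/6)(0.6)/0.1 = 1
```

The other three hypotheses have zero likelihood for x4 in this training set, so h1 wins outright.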

Page 25

Table of Contents

Introduction (Chuck Anderson)
Statistical Machine Learning Examples
– Estimation
– EM
– Bayes Theorem
Decision Tree Learning
Neural Network Learning

Page 26

Twenty Questions Game

Page 27

Decision Trees

Decision Tree (DT):
– A tree where the root and each internal node is labeled with a question.
– The arcs represent each possible answer to the associated question.
– Each leaf node represents a prediction of a solution to the problem.

Popular technique for classification; a leaf node indicates the class to which the corresponding tuple belongs.

Page 28

Decision Tree Example

Page 29

Decision Trees

How do you build a good DT?
What is a good DT?
Ans: Supervised Learning

Page 30

Comparing DTs

Balanced vs. Deep (tree diagrams in the original slides)

Page 31

Decision Tree Induction is often based on Information Theory

So…

Page 32

Information

Page 33

DT Induction

When all the marbles in the bowl are mixed up, little information is given.

When the marbles in the bowl are all from one class and those in the other two classes are on either side, more information is given.

Use this approach with DT Induction!

Page 34

Information/Entropy

Given probabilities p1, p2, …, ps whose sum is 1, Entropy is defined as:

H(p1, …, ps) = Σi pi log(1/pi)

Entropy measures the amount of randomness or surprise or uncertainty.

Goal in classification:
– no surprise
– entropy = 0
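A small sketch of the entropy definition, using base-2 logarithms so entropy is measured in bits:

```python
import math

# Entropy H = sum_i p_i * log2(1 / p_i); terms with p_i = 0 contribute 0.
def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([1.0]))            # 0.0   -> one class only: no surprise
print(entropy([0.5, 0.5]))       # 1.0   -> one full bit of uncertainty
print(entropy([1/3, 1/3, 1/3]))  # ~1.585 -> maximal for three classes
```

DT induction algorithms such as ID3 pick the split that most reduces this quantity, driving each leaf toward the zero-entropy "no surprise" goal.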

Page 35

Table of Contents

Introduction (Chuck Anderson)
Statistical Machine Learning Examples
– Estimation
– EM
– Bayes Theorem
Decision Tree Learning
Neural Network Learning

Page 36

Neural Networks

Based on the observed functioning of the human brain (Artificial Neural Networks, ANN).
Our view of neural networks is very simplistic: we view a neural network (NN) from a graphical viewpoint.
Alternatively, a NN may be viewed from the perspective of matrices.
Used in pattern recognition, speech recognition, computer vision, and classification.

Page 37

Neural Networks

Neural Network (NN): a directed graph F = <V, A> with vertices V = {1, 2, …, n} and arcs A = {<i,j> | 1 <= i, j <= n}, with the following restrictions:
– V is partitioned into a set of input nodes VI, hidden nodes VH, and output nodes VO.
– The vertices are also partitioned into layers.
– Any arc <i,j> must have node i in layer h−1 and node j in layer h.
– Arc <i,j> is labeled with a numeric value wij.
– Node i is labeled with a function fi.
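A minimal sketch of this graph view with assumed weights: two input nodes, two hidden nodes, one output node, and a sigmoid as each node's function fi (all numbers here are illustrative, not from the slides):

```python
import math

# Layered NN per the graph definition: 2 input nodes, 2 hidden nodes, 1 output node.
def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

# w[i][j]: label on arc <i, j> from node i in layer h-1 to node j in layer h.
w_input_hidden = [[0.5, -0.3],
                  [0.8,  0.2]]
w_hidden_output = [[1.0],
                   [-1.5]]

def layer(inputs, weights):
    # Node j sums w_ij * input_i over its incoming arcs, then applies its function.
    n_out = len(weights[0])
    return [sigmoid(sum(inputs[i] * weights[i][j] for i in range(len(inputs))))
            for j in range(n_out)]

x = [1.0, 0.5]                           # values at the input nodes
hidden = layer(x, w_input_hidden)        # hidden-node outputs
output = layer(hidden, w_hidden_output)  # output-node value, in (0, 1)
```

The nested lists are exactly the "matrix perspective" the previous slide mentions: each layer's arcs form a weight matrix, and propagation is a matrix-vector product followed by the node functions.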

Page 38

Neural Network Example

Page 39

NN Node

Page 40

NN Activation Functions

Functions associated with nodes in the graph.
Output may be in the range [−1,1] or [0,1].
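Common choices with these output ranges can be sketched as follows (the slide does not name specific functions; these are standard examples):

```python
import math

# Standard activation-function choices and their output ranges.
def sigmoid(s):   # logistic function: output in (0, 1)
    return 1.0 / (1.0 + math.exp(-s))

def tanh(s):      # hyperbolic tangent: output in (-1, 1)
    return math.tanh(s)

def step(s):      # threshold function: output in {0, 1}
    return 1.0 if s >= 0 else 0.0
```

The sigmoid and step functions give the [0,1]-style range; tanh gives the [-1,1]-style range.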

Page 41

NN Activation Functions

Page 42

NN Learning

Propagate input values through the graph.
Compare the output to the desired output.
Adjust the weights in the graph accordingly.
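These three steps can be sketched for a single sigmoid node trained by gradient descent, a simplified stand-in for full backpropagation over a multi-layer graph (weights, inputs, and learning rate below are hypothetical):

```python
import math

# Delta-rule training of a single sigmoid node.
def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

w = [0.1, -0.2]      # initial weights (hypothetical)
x = [1.0, 0.5]       # input values
target = 1.0         # desired output
lr = 0.5             # learning rate

for _ in range(500):
    y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))  # 1) propagate input through node
    error = target - y                                 # 2) compare output to desired output
    # 3) adjust each weight along the gradient of the squared error
    w = [wi + lr * error * y * (1 - y) * xi for wi, xi in zip(w, x)]

y_final = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))  # approaches the target of 1.0
```

In a multi-layer network the same adjust-by-gradient step is applied layer by layer, with the error propagated backward through the graph.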