CS 188 SECTION 12 These slides are on Piazza! Search for “Daylen’s slides”

Page 1:

CS 188 SECTION 12

These slides are on Piazza! Search for “Daylen’s slides”

Page 2:

CORRECTION: LAPLACE SMOOTHING

Laplace Smoothing

▪ Laplace’s estimate (extended): pretend you saw every outcome k extra times:

P_LAP,k(x) = (count(x) + k) / (N + k|X|), where |X| is the number of values X can take on

▪ What’s Laplace with k = 0? k is the strength of the prior

▪ Laplace for conditionals: smooth each condition independently:

P_LAP,k(x | y) = (count(x, y) + k) / (count(y) + k|X|)
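A minimal Python sketch of the smoothed estimate above, assuming observations are given as a list of samples plus an explicit domain (the function and variable names are mine, not from the course code):

from collections import Counter

def laplace_estimate(samples, domain, k=1):
    # P_LAP,k(x) = (count(x) + k) / (N + k * |X|), computed for every x in the domain
    counts = Counter(samples)
    n = len(samples)
    return {x: (counts[x] + k) / (n + k * len(domain)) for x in domain}

# k = 0 recovers the empirical (maximum-likelihood) estimate; larger k pulls toward uniform.
print(laplace_estimate(['r', 'r', 'b'], domain=['r', 'b'], k=1))    # {'r': 0.6, 'b': 0.4}
print(laplace_estimate(['r', 'r', 'b'], domain=['r', 'b'], k=100))  # roughly {'r': 0.5, 'b': 0.5}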

Page 3:

CALCULUS REVIEW SECTIONS

➤ Session 1: today, 6-7:30 pm, Soda 405; single-variable calculus

➤ Session 2: today, 7:30-9 pm, Soda 405; identical content to Session 1

➤ Session 3: tomorrow, 6-7:30 pm, Soda 380; multivariable calculus

➤ Session 4: tomorrow, 7:30-9 pm, Soda 380; identical content to Session 3

Page 4:

UPCOMING DEADLINES

➤ Project 5 due today @ 5pm

➤ HW 6 due Wednesday @ 11:59

➤ Project 6 due Sunday @ 5pm

➤ Final Exam next Thursday

Page 5:

BINARY PERCEPTRONS

Linear Classifiers

▪ Inputs are feature values
▪ Each feature has a weight
▪ Sum is the activation:

activation_w(x) = Σ_i w_i · f_i(x)

▪ If the activation is:
  ▪ Positive, output +1
  ▪ Negative, output -1
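As a concrete sketch of this decision rule (plain-Python illustration, not the project’s classifier API):

def perceptron_activation(weights, features):
    # Activation is the weighted sum of the feature values: sum_i w_i * f_i
    return sum(w * f for w, f in zip(weights, features))

def perceptron_classify(weights, features):
    # Output +1 if the activation is positive, -1 otherwise
    return 1 if perceptron_activation(weights, features) > 0 else -1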

Page 6:

BINARY PERCEPTRONS

Learning: Binary Perceptron

▪ Start with weights = 0
▪ For each training instance:
  ▪ Classify with current weights
  ▪ If correct (i.e., y = y*), no change!
  ▪ If wrong: adjust the weight vector by adding or subtracting the feature vector. Subtract if y* is -1.
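A minimal training-loop sketch of the update above; it reuses perceptron_classify from the previous sketch and assumes each example is a (feature list, label) pair with labels in {+1, -1}:

def train_binary_perceptron(examples, num_features, passes=10):
    weights = [0.0] * num_features
    for _ in range(passes):
        for features, y_star in examples:
            y = perceptron_classify(weights, features)
            if y != y_star:  # correct predictions (y == y*) leave the weights unchanged
                # add the feature vector if y* = +1, subtract it if y* = -1
                for i in range(num_features):
                    weights[i] += y_star * features[i]
    return weights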

Page 7:

MULTICLASS PERCEPTRONS

Multiclass Decision Rule

▪ If we have multiple classes:
  ▪ A weight vector for each class: w_y
  ▪ Score (activation) of a class y: score(y) = w_y · f(x)
  ▪ Prediction: highest score wins

Binary = multiclass where the negative class has weight zero

Learning: Multiclass Perceptron

▪ Start with all weights = 0
▪ Pick up training examples one by one
▪ Predict with current weights
▪ If correct, no change!
▪ If wrong: lower the score of the wrong answer, raise the score of the right answer
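A small sketch of the multiclass rule and update, keeping one weight vector per class (the dict-of-lists representation is my choice for illustration):

def multiclass_scores(weights_by_class, features):
    # score of class y is the dot product w_y . f(x)
    return {y: sum(w_i * f_i for w_i, f_i in zip(w, features))
            for y, w in weights_by_class.items()}

def multiclass_predict(weights_by_class, features):
    scores = multiclass_scores(weights_by_class, features)
    return max(scores, key=scores.get)  # highest score wins

def multiclass_update(weights_by_class, features, y_star):
    y = multiclass_predict(weights_by_class, features)
    if y != y_star:  # lower the wrong answer's score, raise the right answer's score
        for i, f_i in enumerate(features):
            weights_by_class[y][i] -= f_i
            weights_by_class[y_star][i] += f_i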


Page 8:

OTHER CLASSIFIERS DISCUSSED

➤ Support Vector Machines

Support Vector Machines

▪ Maximizing the margin: good according to intuition, theory, practice
▪ Only support vectors matter; other training examples are ignorable
▪ Support vector machines (SVMs) find the separator with max margin
▪ Basically, SVMs are MIRA where you optimize over all examples at once

[Figures: MIRA separator vs. SVM max-margin separator]
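For intuition only, here is a toy subgradient step on the hinge-loss (soft-margin) objective; this is my illustration of “optimize over all examples at once,” not the MIRA/SVM formulation used in the projects:

def svm_subgradient_step(w, examples, C=1.0, lr=0.01):
    # One step on  ||w||^2 / 2  +  C * sum_i max(0, 1 - y_i * (w . f_i))
    grad = list(w)  # gradient of the regularizer ||w||^2 / 2 is w itself
    for features, y in examples:
        margin = y * sum(w_i * f_i for w_i, f_i in zip(w, features))
        if margin < 1:  # only examples inside or violating the margin contribute
            for i, f_i in enumerate(features):
                grad[i] -= C * y * f_i
    return [w_i - lr * g_i for w_i, g_i in zip(w, grad)]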


Page 9:

OTHER CLASSIFIERS DISCUSSED

➤ Nearest Neighbors

Parametric / Non-Parametric

▪ Parametric models:
  ▪ Fixed set of parameters
  ▪ More data means better settings

▪ Non-parametric models:
  ▪ Complexity of the classifier increases with data
  ▪ Better in the limit, often worse in the non-limit

▪ (K)NN is non-parametric

[Figure: truth vs. nearest-neighbor fits with 2, 10, 100, and 10000 examples]
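A minimal k-nearest-neighbors sketch showing the non-parametric flavor: the “model” is just the stored training set, so its complexity grows with the data (Euclidean distance and k = 3 are my assumptions):

import math
from collections import Counter

def knn_classify(train, query, k=3):
    # train is a list of (feature_list, label) pairs; all of it is kept around at prediction time
    by_distance = sorted(train, key=lambda ex: math.dist(ex[0], query))
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]  # majority vote among the k closest neighbors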

Page 10:

WORKSHEET