What is Pattern Recognition? Recognizing the Fish!
Post on 21-Dec-2015
TRANSCRIPT
![Page 1: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/1.jpg)
What is Pattern Recognition
Recognizing the fish!
1
![Page 2: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/2.jpg)
WHAT IS A PATTERN?
• Structures regulated by rules
• Goal: represent empirical knowledge in mathematical forms
• The mathematics of perception
• Needed tools: algebra, probability theory, graph theory
![Page 3: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/3.jpg)
All flows! (Heraclitus)
• It is only the invariance, the permanent facts, that enable us to find the meaning in a world of flux.
• We can only perceive variances.
• Our aim is to find the invariant laws of our varying observations.
Pattern Recognition
![Page 4: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/4.jpg)
Learning vs. Recognition
Learning: find the mathematical model of a class, using data.
Recognition: use that model to recognize the unknown object.
4
![Page 5: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/5.jpg)
Two Schools of Thought
5
• Statistical Pattern Recognition:
Express the image class by a random variable corresponding to class features, and model the class by finding its probability density function.
• Structural Pattern Recognition:
Express the image class by a set of picture primitives. Model the class by finding the relationships among the primitives, using grammars or graphs.
![Page 6: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/6.jpg)
1. STATISTICAL PR: ASSUMPTIONS
SOURCE: hypotheses, classes, objects
CHANNEL: noisy
OBSERVATION: multiple sensors, variations
![Page 7: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/7.jpg)
Probability Theory: Apples and Oranges
![Page 8: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/8.jpg)
Probability Theory
Marginal Probability
Conditional Probability
Joint Probability
[Figure: the urn example. Red urn (x = x1) with c1 samples; blue urn (x = x2) with c2 samples; each sample is either an orange (y = y1) or an apple (y = y2), with the counts per urn shown in the figure.]
![Page 9: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/9.jpg)
Probability Theory
Sum Rule
Product Rule
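The equations on this slide did not survive extraction; in the notation of the surrounding slides, the standard forms of the two rules are:

```latex
\text{Sum rule:}\qquad p(X) = \sum_{Y} p(X, Y)

\text{Product rule:}\qquad p(X, Y) = p(Y \mid X)\, p(X)
```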
![Page 10: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/10.jpg)
The Rules of Probability
Sum Rule
Product Rule
![Page 11: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/11.jpg)
Bayes’ Theorem
posterior ∝ likelihood × prior
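A minimal sketch of Bayes' theorem on the urn example above. The prior and likelihood numbers are made up for illustration; they are not from the slides.

```python
# Hypothetical numbers: prior probability of picking each urn,
# and the probability of drawing an orange from each urn.
p_urn = {"red": 0.4, "blue": 0.6}                  # prior p(x)
p_orange_given_urn = {"red": 0.75, "blue": 0.25}   # likelihood p(y = orange | x)

# Evidence p(y = orange): sum rule applied over the product rule.
p_orange = sum(p_orange_given_urn[u] * p_urn[u] for u in p_urn)

# Bayes' theorem: posterior = likelihood * prior / evidence.
posterior = {u: p_orange_given_urn[u] * p_urn[u] / p_orange for u in p_urn}

print(posterior["red"], posterior["blue"])
```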
![Page 12: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/12.jpg)
Probability Densities
![Page 13: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/13.jpg)
Expectations
Conditional Expectation (discrete)
Approximate Expectation (discrete and continuous)
![Page 14: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/14.jpg)
Variances and Covariances
![Page 15: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/15.jpg)
1. BASIC CONCEPTS
15
• A class is a set of objects having some important properties in common
• A feature extractor is a program that inputs the data (image) and extracts features that can be used in classification.
• A classifier is a program that inputs the feature vector and assigns it to one of a set of designated classes or to the “reject” class.
With what kinds of classes do you work?
![Page 16: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/16.jpg)
Feature Vector Representation
X = [x1, x2, …, xn], each xj a real number
xj may be an object measurement
xj may be a count of object parts
Example object representation: [#holes, #strokes, moments, …]
16
![Page 17: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/17.jpg)
Possible Features for Character Recognition
17
![Page 18: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/18.jpg)
Modeling the source
Functions f(x, K) perform some computation on feature vector x
Knowledge K from training or programming is used
Final stage determines class
18
![Page 19: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/19.jpg)
Two Schools of Thought
19
• Statistical Pattern Recognition:
Express the image class by a random variable corresponding to class features, and model the class by finding its probability density function.
• Structural Pattern Recognition:
Express the image class by a set of picture primitives. Model the class by finding the relationships among the primitives, using grammars or graphs.
![Page 20: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/20.jpg)
Classifiers often used in CV
20
• Nearest mean classifier
• Nearest neighbor classifier
• Bayesian classifiers
• Decision tree classifiers
• Artificial neural net classifiers
• Support vector machines
![Page 21: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/21.jpg)
1. Classification using nearest class mean
Compute the Euclidean distance between feature vector X and the mean of each class.
Choose closest class, if close enough (reject otherwise)
21
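The two steps above can be sketched as follows; the class means, the test point, and the reject threshold are made-up illustration values.

```python
import numpy as np

# Nearest-class-mean classifier with a reject option.
def nearest_mean_classify(x, class_means, reject_threshold):
    # Euclidean distance from feature vector x to each class mean.
    dists = {c: np.linalg.norm(x - m) for c, m in class_means.items()}
    best = min(dists, key=dists.get)
    # Choose the closest class only if it is close enough; reject otherwise.
    return best if dists[best] <= reject_threshold else "reject"

means = {"class1": np.array([0.0, 0.0]), "class2": np.array([4.0, 4.0])}
print(nearest_mean_classify(np.array([0.5, 0.2]), means, reject_threshold=2.0))  # class1
```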
![Page 22: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/22.jpg)
Nearest mean may yield poor results when a class has complex structure.
Class 2 has two modes; where is its mean?
But if the modes are detected, two subclass mean vectors can be used.
22
![Page 23: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/23.jpg)
Scaling Coordinates by the Covariance Matrix
Define the Mahalanobis distance between a vector x and a class center xc:

d²(x, xc) = (x − xc)ᵀ C⁻¹ (x − xc)

where the covariance matrix is

C = (1/N) Σi (xi − xc)(xi − xc)ᵀ

23
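A short sketch of the distance just defined; the 2-D sample set is made up for illustration.

```python
import numpy as np

# Toy sample set; xc is the class center, C the covariance matrix (1/N convention).
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0], [2.0, 1.0]])
xc = X.mean(axis=0)
C = (X - xc).T @ (X - xc) / len(X)

def mahalanobis(x, xc, C):
    # d^2 = (x - xc)^T C^-1 (x - xc); return d.
    d = x - xc
    return float(np.sqrt(d @ np.linalg.inv(C) @ d))

print(mahalanobis(np.array([3.0, 3.0]), xc, C))
```

Unlike the plain Euclidean distance, this scales each direction by the spread of the class along it.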
![Page 24: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/24.jpg)
2. Nearest Neighbor Classification: the k-nearest-neighbor rule
• Goal: classify x by assigning it the label most frequently represented among the k nearest samples, i.e. by a voting scheme
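The voting rule can be sketched in a few lines; the toy samples and labels are illustration values.

```python
import numpy as np
from collections import Counter

# k-nearest-neighbor rule: majority vote among the k closest samples.
def knn_classify(x, samples, labels, k=3):
    dists = np.linalg.norm(samples - x, axis=1)   # distance to every training sample
    nearest = np.argsort(dists)[:k]               # indices of the k closest
    votes = Counter(labels[i] for i in nearest)   # vote over their labels
    return votes.most_common(1)[0][0]

samples = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
labels = ["a", "a", "a", "b", "b", "b"]
print(knn_classify(np.array([0.4, 0.4]), samples, labels, k=3))  # a
```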
![Page 25: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/25.jpg)
Pattern Classification, Chapter 4 (Part 2) 25
![Page 26: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/26.jpg)
26
![Page 27: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/27.jpg)
Bayes Decision Making (Thomas Bayes, 1701-1761)
27
Goal: given the training data, compute max_i P(wi | x)
![Page 28: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/28.jpg)
Bayesian decision-making
28
![Page 29: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/29.jpg)
Bayes
29
![Page 30: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/30.jpg)
Normal distribution
Zero mean and unit standard deviation
The table enables us to fit histograms and represent them simply
A new observation of variable x can then be translated into a probability
Stockman CSE803 Fall 2008 30
![Page 31: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/31.jpg)
Parametric Models can be used
31
![Page 32: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/32.jpg)
Cherries with bruises: intensities at about 750 nanometers wavelength. Some overlap is caused by the cherry surface turning away.
32
![Page 33: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/33.jpg)
Information Theory: Claude Shannon (1916-2001)
Goal: Find the amount of information carried by a specific value of a r.v.
Need something intuitive.
![Page 34: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/34.jpg)
Information Theory: C. Shannon
Information: giving form or shape to the mind
Assumptions: a source and a receiver
• Information is the quality of a message
• It may be a truth or a lie
• If the amount of information in the received message increases, the message is more accurate
• A common alphabet is needed to communicate a message
![Page 35: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/35.jpg)
Quantification of Information
Given a random variable X with distribution p(x), what is the amount of information carried by receiving an outcome x?
Self-information:

h(x) = -log p(x)

Low probability means surprise, which means high information. Base e: nats. Base 2: bits.
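The definition above is a one-liner; the example probabilities are made up.

```python
import math

# Self-information h(x) = -log2 p(x), measured in bits.
def self_information_bits(p):
    return -math.log2(p)

print(self_information_bits(0.5))    # 1.0 bit: a fair coin flip
print(self_information_bits(0.125))  # 3.0 bits: a rarer outcome carries more information
```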
![Page 36: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/36.jpg)
Entropy: Average of self information needed to specify the state of a random variable.
36
Given a set of training vectors S, if there are c classes,

Entropy(S) = - Σ(i=1 to c) pi log2(pi)

where pi is the proportion of category i examples in S.

If all examples belong to the same category, the entropy is 0.
If the examples are equally mixed (1/c examples of each class), the entropy is at its maximum, log2(c) (which is 1.0 for c = 2).
e.g. for c = 2: -0.5 log2(0.5) - 0.5 log2(0.5) = -0.5(-1) - 0.5(-1) = 1
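The formula above, with the two boundary cases from the slide; the toy label lists are illustration values.

```python
import math
from collections import Counter

# Entropy of a label set, in bits: -sum_i p_i log2(p_i).
def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

print(entropy(["a", "a", "a", "a"]))  # 0.0: all examples in one class
print(entropy(["a", "a", "b", "b"]))  # 1.0: equally mixed, c = 2
```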
![Page 37: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/37.jpg)
Entropy:
Why does entropy measure information?
• It makes sense intuitively
• "Nobody knows what entropy really is, so in any discussion you will always have an advantage." (von Neumann)
![Page 38: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/38.jpg)
Entropy
![Page 39: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/39.jpg)
Entropy
![Page 40: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/40.jpg)
2. Structural Techniques
Goal: represent the classes by graphs
40
![Page 41: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/41.jpg)
Training the system: given the training data,
Step 1: Process the image
– Remove noise
– Binarize
– Find the skeleton
– Normalize the size
Step 2: Extract features
– Side, lake, bay
Step 3: Represent the character by a graph
41
![Page 42: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/42.jpg)
3. Represent the character by a graph:G(N, A)
42
![Page 43: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/43.jpg)
Training: represent each class by a graph.
Recognition: use graph similarities to assign a label.
43
![Page 44: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/44.jpg)
Decision Trees:
44
[Figure: a decision tree for character recognition. Internal nodes test features such as #holes, #strokes, L/W ratio, and best axis direction; the leaves are the characters - / 1 x w 0 A 8 B.]
![Page 45: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/45.jpg)
Binary decision tree
45
![Page 46: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/46.jpg)
Entropy-Based Automatic Decision Tree Construction
46
Node 1: what feature should be used? What values?
Training set S: x1 = (f11, f12, …, f1m), x2 = (f21, f22, …, f2m), …, xn = (fn1, fn2, …, fnm)
Quinlan suggested information gain in his ID3 system, and later the gain ratio, both based on entropy.
![Page 47: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/47.jpg)
Information Gain
47
The information gain of an attribute A is the expected reduction in entropy caused by partitioning on this attribute:

Gain(S, A) = Entropy(S) - Σ(v ∈ Values(A)) (|Sv| / |S|) Entropy(Sv)

where Sv is the subset of S for which attribute A has value v.
Choose the attribute A that gives the maximum information gain.
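A minimal sketch of the gain computation; the attribute values and labels are a made-up toy dataset.

```python
import math
from collections import Counter

# Entropy of a label set, in bits.
def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# Gain(S, A) = Entropy(S) - sum over values v of (|Sv|/|S|) * Entropy(Sv).
def information_gain(attr_values, labels):
    n = len(labels)
    remainder = 0.0
    for v in set(attr_values):
        subset = [lab for a, lab in zip(attr_values, labels) if a == v]
        remainder += (len(subset) / n) * entropy(subset)
    return entropy(labels) - remainder

# An attribute that splits the labels perfectly removes all uncertainty.
attr = ["round", "round", "long", "long"]
labels = ["orange", "orange", "banana", "banana"]
print(information_gain(attr, labels))  # 1.0
```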
![Page 48: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/48.jpg)
Information Gain (cont)
48
[Figure: attribute A splits set S into subsets, one per value v1, v2, …, vk; tree construction repeats recursively on each subset, where Sv1 = {s ∈ S | value(A) = v1}.]

Information gain has the disadvantage that it prefers attributes with a large number of values, which split the data into small, pure subsets.
![Page 49: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/49.jpg)
Gain Ratio
49
Gain ratio is an alternative metric from Quinlan's 1986 paper, used in the popular (and free!) C4.5 package.

GainRatio(S, A) = Gain(S, A) / SplitInfo(S, A)

SplitInfo(S, A) = - Σ(i=1 to n) (|Si| / |S|) log2(|Si| / |S|)

where Si is the subset of S in which attribute A has its ith value.

SplitInfo measures the amount of information provided by an attribute that is not specific to the category.
![Page 50: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/50.jpg)
Information Content
50
Note:
A related method of decision tree construction usinga measure called Information Content is given in thetext, with full numeric example of its use.
![Page 51: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/51.jpg)
Artificial Neural Nets
51
Artificial Neural Nets (ANNs) are networks ofartificial neuron nodes, each of which computesa simple function.
An ANN has an input layer, an output layer, and“hidden” layers of nodes.
[Figure: a feed-forward network diagram, with input nodes on the left, hidden-layer nodes in the middle, and output nodes on the right.]
![Page 52: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/52.jpg)
Node Functions
52
[Figure: neuron i receives inputs x1, x2, …, xj, …, xn through weights w(1, i), …, w(j, i), … and produces one output.]

output = g( Σj xj · w(j, i) )

Function g is commonly a step function, sign function, or sigmoid function (see text).
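The node function above in a few lines; the weights and inputs are made-up illustration values.

```python
import math

# One artificial neuron: output = g(sum_j x[j] * w[j]).
def neuron(x, w, g):
    return g(sum(xj * wj for xj, wj in zip(x, w)))

# Two common choices for g.
sigmoid = lambda a: 1.0 / (1.0 + math.exp(-a))
step = lambda a: 1.0 if a >= 0 else 0.0

x = [1.0, 0.5]
w = [0.4, -0.8]
print(neuron(x, w, step))     # 1.0: the weighted sum 0.0 sits on the threshold
print(neuron(x, w, sigmoid))  # 0.5
```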
![Page 53: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/53.jpg)
53
![Page 54: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/54.jpg)
54
![Page 55: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/55.jpg)
Neural Net Learning
55
That’s beyond the scope of this text; onlysimple feed-forward learning is covered.
The most common method is called back propagation.
There are software packages available.
What do you use?
![Page 56: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/56.jpg)
How to Measure PERFORMANCE
Classification rate: count the number of correctly classified samples in wi, call it ci, and find the frequency:

classification rate = ci / N
56
![Page 57: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/57.jpg)
2-Class Problem

|                        | Estimated Class 1 (present)  | Estimated Class 2 (absent)      |
|------------------------|------------------------------|---------------------------------|
| True Class 1 (present) | True hit, correct detection  | False dismissal, false negative |
| True Class 2 (absent)  | False alarm, false positive  | True dismissal                  |
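The four outcomes of the 2-class problem determine the rates used below and on the ROC slide; the counts here are made-up illustration values.

```python
# Outcome counts for a hypothetical 2-class experiment.
tp, fn = 90, 10   # true class 1: hits and false dismissals
fp, tn = 5, 95    # true class 2: false alarms and true dismissals

detection_rate = tp / (tp + fn)        # correct detection (true positive) rate
false_alarm_rate = fp / (fp + tn)      # false positive rate
classification_rate = (tp + tn) / (tp + fn + fp + tn)

print(detection_rate, false_alarm_rate, classification_rate)  # 0.9 0.05 0.925
```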
57
![Page 58: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/58.jpg)
Receiver Operating Characteristic (ROC) Curve
Plots correct detection rate versus false alarm rate
Generally, false alarms go up with attempts to detect higher percentages of known objects
58
![Page 59: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/59.jpg)
Precision-Recall in CBIR or DR
59
![Page 60: What is Pattern Recognition Recognizing the fish! 1](https://reader030.vdocument.in/reader030/viewer/2022032710/56649d645503460f94a46fa4/html5/thumbnails/60.jpg)
Confusion matrix shows empirical performance
60