On the Learning Power of Evolution
Vitaly Feldman
Posted on 21-Dec-2015
Slide 1: On the Learning Power of Evolution
Vitaly Feldman
Slide 2: Fundamental Question
How can complex and adaptive mechanisms result from evolution?
Fundamental principle: random variation guided by natural selection [Darwin, Wallace 1859].
There is no quantitative theory.
From TCS: established notions of complexity and computational learning theory.
Slide 3: Model Outline
Complex behavior: a multi-argument function.
Components of the model: function representation, fitness estimation, random variation, natural selection, success criteria.
Slide 4: Representation
Domain of conditions X and a distribution D over X.
Representation class R of functions over X: the space of available behaviors; efficiently evaluatable.
Slide 5: Fitness
Optimal function f: X → {-1,1}.
Performance: correlation with f relative to D:
Perf_f(r,D) = E_D[f(x)·r(x)]
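In practice the correlation performance is estimated from samples drawn from D. A minimal sketch (the function and variable names are illustrative, not from the talk):

```python
import random

def perf(f, r, sample):
    """Empirical estimate of Perf_f(r, D) = E_D[f(x) * r(x)] on a sample from D."""
    return sum(f(x) * r(x) for x in sample) / len(sample)

# Example over the uniform distribution on {-1,1}^5: a hypothesis that always
# agrees with f has correlation 1, one that always disagrees has correlation -1.
random.seed(0)
sample = [tuple(random.choice((-1, 1)) for _ in range(5)) for _ in range(1000)]
f = lambda x: x[0]
print(perf(f, f, sample))                 # 1.0
print(perf(f, lambda x: -x[0], sample))   # -1.0
```
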
Slide 6: Random Variation
Mutation algorithm M: given r ∈ R, produces a random mutation of r; M is efficient.
Neigh_M(r) is the set of all possible outputs of M on r.
(Diagram: a hypothesis is fed to the mutation algorithm.)
Slide 7: Natural Selection
Step(R,M,r): if beneficial mutations are available, output one of them; otherwise output one of the neutral mutations.
Bene(r) = {r' ∈ Neigh_M(r) | Perf_f(r',D) > Perf_f(r,D) + t}
Neut(r) = {r' ∈ Neigh_M(r) | |Perf_f(r',D) - Perf_f(r,D)| ≤ t}
where t is the tolerance.
If Bene(r) ≠ ∅, a mutation is chosen from Bene(r) according to Pr_M; if Bene(r) = ∅, a mutation is chosen from Neut(r) according to Pr_M.
* Neigh_M(r) and Perf_f are estimated via poly-size sampling, and t is inverse-polynomial.
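The selection step can be sketched in a few lines. This is a simplified reading of the rule above: drawing candidate mutations by sampling stands in for the poly-size estimation of Neigh_M(r) and Pr_M, and all names are illustrative:

```python
import random

def step(r, mutate, perf, t, samples=100):
    """One selection step: sample candidate mutations of r, prefer a
    beneficial one (performance gain > t), otherwise fall back to a
    neutral one (|performance change| <= t)."""
    base = perf(r)
    candidates = [mutate(r) for _ in range(samples)]
    beneficial = [c for c in candidates if perf(c) > base + t]
    if beneficial:
        return random.choice(beneficial)
    neutral = [c for c in candidates if abs(perf(c) - base) <= t]
    return random.choice(neutral) if neutral else r  # keep r if nothing survives

# Toy check: representations are integers, performance rewards closeness to 0,
# mutations move one unit; a step from 5 picks the beneficial neighbour 4.
random.seed(1)
r_next = step(5, lambda v: v + random.choice((-1, 1)), lambda v: -abs(v), t=0.5)
print(r_next)   # 4
```
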
Slide 8: Evolvability
A class of functions C is evolvable over D if there exist an evolutionary algorithm (R,M) and a polynomial g(·,·) such that for every f ∈ C, r ∈ R, and ε > 0, the sequence r_0 = r, r_1, r_2, …, where r_{i+1} ← Step(R,M,r_i), with high probability satisfies Perf_f(r_{g(n,1/ε)}, D) ≥ 1 - ε.
C is evolvable (distribution-independently) if it is evolvable for all D by the same R and M.
C represents the complexity of structures that can evolve in a single phase of evolution driven by a single optimal function from C.
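The success criterion amounts to a simple driver loop: iterate the selection step and check whether performance reaches 1 - ε within polynomially many generations. A toy sketch (the names and the trivial one-dimensional instance are illustrative):

```python
def evolve(r0, step_fn, perf, eps, budget):
    """Run r_0, r_1, ... with r_{i+1} <- Step(r_i) and report whether
    Perf >= 1 - eps is reached within `budget` generations (standing in
    for the polynomial bound g(n, 1/eps))."""
    r = r0
    for _ in range(budget):
        if perf(r) >= 1 - eps:
            return r, True
        r = step_fn(r)
    return r, perf(r) >= 1 - eps

# Toy instance: representations are integers 0..10, performance is r/10, and
# each step applies a beneficial unit mutation.
final, ok = evolve(0, lambda r: min(10, r + 1), lambda r: r / 10, eps=0.05, budget=100)
print(final, ok)   # 10 True
```
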
Slide 9: Evolvability of Conjunctions
ANDs of Boolean variables and their negations over {-1,1}^n, e.g. x_3 ∧ ¬x_5 ∧ x_8.
Evolutionary algorithm: R is the class of all conjunctions; M adds or removes a variable or its negation.
This does not work in general, but it works for monotone conjunctions over the uniform distribution [L. Valiant 06].
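A small simulation illustrates the monotone case: hypotheses are sets of variables, the mutator adds or removes one variable, and selection is driven by the exact correlation performance under the uniform distribution. This is a toy rendering of the algorithm sketched above, not the paper's analysis; all names are illustrative:

```python
import itertools
import random

n = 6
target = {0, 1}   # target monotone conjunction: x_0 AND x_1

def conj(S, x):
    """h_S(x) = 1 if every variable in S is set to +1, else -1."""
    return 1 if all(x[i] == 1 for i in S) else -1

def perf(S):
    """Exact Perf under the uniform distribution on {-1,1}^n."""
    pts = list(itertools.product((-1, 1), repeat=n))
    return sum(conj(S, x) * conj(target, x) for x in pts) / len(pts)

def mutate(S):
    """M: add a missing variable or remove a present one, at random."""
    i = random.randrange(n)
    return S - {i} if i in S else S | {i}

def step(S, t=0.01, trials=50):
    """Selection step; sampling approximates Neigh_M and Pr_M."""
    base = perf(S)
    cands = [mutate(S) for _ in range(trials)]
    bene = [c for c in cands if perf(c) > base + t]
    if bene:
        return random.choice(bene)
    neut = [c for c in cands if abs(perf(c) - base) <= t]
    return random.choice(neut) if neut else S

random.seed(2)
S = set()
for _ in range(100):
    S = step(S)
print(sorted(S), perf(S))
```

Beneficial mutations strictly increase the correlation, so the process cannot cycle; for monotone targets it climbs to the target conjunction and then stays there, since every further mutation is deleterious.
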
Slide 10: What is Evolvable in This Model?
EV ⊆ PAC; in fact EV ⊆ SQ ⊊ PAC [L. Valiant 06].
Statistical Query learning [Kearns 93]: estimates of E_D[φ(x,f(x))] for an efficiently evaluatable query φ.
EV ⊆ CSQ [F 08]: learnability by correlational statistical queries; a CSQ is an estimate of E_D[φ(x)·f(x)].
CSQ ⊆ EV [F 08].
For a fixed D: CSQ = SQ [Bshouty, F 01].
Slide 11: Distribution-Independent Evolvability
Algorithms: singletons [F 09]. R is all conjunctions of a logarithmic number of functions from a set of pairwise independent functions; M chooses a random such conjunction.
Lower bounds [F 08]: C ∈ EV implies that each function in C is expressible as a "low"-weight integer threshold function over a poly-sized basis B; hence EV ⊊ SQ.
Linear threshold functions and decision lists are not evolvable (even weakly) [GHR 92, Sherstov 07, BVW 07].
Open: conjunctions? Low-weight integer linear thresholds?
Slide 12: Robustness of the Model
How is the set of evolvable function classes influenced by various aspects of the definition: the selection rule, the mutation algorithm, the fitness function, …?
The model is robust to a variety of modifications, and its power is essentially determined by the performance function [F 09].
Slide 13: Original Selection Rule (recap)
Step(R,M,r): if beneficial mutations are available, output one of them; otherwise output one of the neutral mutations.
Bene(r) = {r' ∈ Neigh_M(r) | Perf_f(r',D) > Perf_f(r,D) + t}
Neut(r) = {r' ∈ Neigh_M(r) | |Perf_f(r',D) - Perf_f(r,D)| ≤ t}
where t is the tolerance.
If Bene(r) ≠ ∅, a mutation is chosen from Bene(r) according to Pr_M; if Bene(r) = ∅, a mutation is chosen from Neut(r) according to Pr_M.
* Neigh_M(r) and Perf_f are estimated via poly-size sampling, and t is inverse-polynomial.
Slide 14: Other Selection Rules
Sufficient condition: the selection rule can be "smooth" and need not be fixed in time.
For all r_1, r_2 ∈ Neigh_M(r): if Perf_f(r_1,D) ≥ Perf_f(r_2,D) + t, then r_1 is "observably" favored over r_2.
Slide 15: Performance Function
For real-valued representations, other measures of performance can be used, e.g. the expected quadratic loss LQ-Perf: 1 - E_D[(f(x)-r(x))^2]/2.
Decision lists are evolvable with respect to the uniform distribution under LQ-Perf [Michael 07].
The resulting model is equivalent to learning from the corresponding type of statistical queries: CSQ if the loss function is linear, SQ otherwise.
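Note that for {-1,1}-valued hypotheses the two measures coincide: (f(x)-r(x))^2 = 2 - 2·f(x)·r(x) pointwise, so 1 - E[(f-r)^2]/2 = E[f·r]. A quick numerical check (function names are illustrative):

```python
import random

def corr_perf(f, r, sample):
    """Empirical correlation performance E[f(x) * r(x)]."""
    return sum(f(x) * r(x) for x in sample) / len(sample)

def lq_perf(f, r, sample):
    """Empirical quadratic-loss performance 1 - E[(f(x) - r(x))^2] / 2."""
    return 1 - sum((f(x) - r(x)) ** 2 for x in sample) / (2 * len(sample))

# For {-1,1}-valued f and r, (f - r)^2 = 2 - 2*f*r on every point, so the
# two measures agree on any sample.
random.seed(0)
sample = [tuple(random.choice((-1, 1)) for _ in range(4)) for _ in range(500)]
f = lambda x: x[0]
r = lambda x: x[0] * x[1]
print(abs(corr_perf(f, r, sample) - lq_perf(f, r, sample)) < 1e-12)   # True
```

LQ-Perf only becomes genuinely different, and strictly more powerful, once r is allowed to take real values in [-1,1].
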
Slide 16: What About New Algorithms?
Conjunctions are evolvable distribution-independently with LQ-Perf [F 09].
Mutation algorithm: add or subtract ε·x_i and project the result to [-1,1] (for X = {-1,1}^n).
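One plausible rendering of such a mutation: represent the hypothesis by a real coefficient vector w, perturb one coefficient by ±ε, and clip the hypothesis value to [-1,1]. The representation and clipping below are my illustrative reading of the slide, not the paper's exact construction:

```python
import random

EPS = 0.25   # mutation step size; the value is illustrative

def hypothesis(w, x):
    """Real-valued hypothesis: the linear form <w, x>, clipped to [-1, 1]."""
    v = sum(wi * xi for wi, xi in zip(w, x))
    return max(-1.0, min(1.0, v))

def mutate(w):
    """Add or subtract EPS in a single random coordinate (i.e. +/- EPS * x_i)."""
    w = list(w)
    i = random.randrange(len(w))
    w[i] += random.choice((-EPS, EPS))
    return w

def lq_perf(f, w, sample):
    """Empirical LQ-Perf: 1 - E[(f(x) - h_w(x))^2] / 2."""
    return 1 - sum((f(x) - hypothesis(w, x)) ** 2 for x in sample) / (2 * len(sample))

random.seed(0)
w = mutate([0.0, 0.0, 0.0])
print(sorted(abs(v) for v in w))   # [0.0, 0.0, 0.25]
```
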
Slide 17: Further Directions
Limits of distribution-independent evolvability.
"Natural" algorithms for "interesting" function classes and distributions.
Evolvability without performance decreases.
Applications; direct connections between CS and evolutionary biology.
Slide 18: References
L. Valiant. Evolvability. ECCC 2006; JACM 2009.
L. Michael. Evolvability via the Fourier Transform. 2007.
V. Feldman. Evolvability from Learning Algorithms. STOC 2008.
V. Feldman and L. Valiant. The Learning Power of Evolution (open problems). COLT 2008.
V. Feldman. Robustness of Evolvability. COLT 2009 (to appear).
V. Feldman. A Complete Characterization of SQ Learning with Applications to Evolvability. 2009 (to appear).