Learning Invariances and Hierarchies
Pierre Baldi, University of California, Irvine
Two Questions
1. “If we solve computer vision, we have pretty much solved AI.”
2. A-NNs (artificial neural networks) vs. B-NNs (biological neural networks) and Deep Learning.
If we solve computer vision…
• If we solve computer audition…
• If we solve computer olfaction…
• If we solve computer vision, how can we build computers that can prove Fermat’s last theorem?
Invariances
• Invariances in audition. We can recognize a tune invariantly with respect to: intensity, speed, tonality, harmonization, instrumentation, style, background.
• Invariances in olfaction. We can recognize an odor invariantly with respect to: concentrations, humidity, pressure, winds, mixtures, background.
Non-Invariances
• Invariances evolution did not care about (although we are still evolving!...)
– We cannot recognize faces upside down.
– We cannot recognize tunes played in reverse.
– We cannot recognize stereoisomers as such: enantiomers smell different.
A-NNs vs B-NNs
Origin of Invariances
• Weight sharing and translational invariance.
• Can we quantify approximate weight sharing?
• Can we use approximate weight sharing to improve performance?
• Some of the invariance comes from the architecture.
• Some may come from the learning rules.
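Weight sharing is the textbook source of translational invariance: because a convolutional layer applies the same weights at every position, shifting the input shifts the feature map rather than changing it. A minimal sketch (circular correlation so the equivariance is exact; all sizes here are arbitrary, not from the slides):

```python
import numpy as np

def circ_corr(x, w):
    """Circular 1-D correlation: the same weights w are applied at every
    position of x -- this reuse is exactly what 'weight sharing' means."""
    n = len(x)
    return np.array([sum(x[(i + k) % n] * w[k] for k in range(len(w)))
                     for i in range(n)])

rng = np.random.default_rng(0)
x = rng.normal(size=16)  # arbitrary input signal
w = rng.normal(size=3)   # one shared filter

# Shifting the input shifts the feature map by the same amount:
# translation equivariance, a direct consequence of weight sharing.
y = circ_corr(x, w)
y_shift = circ_corr(np.roll(x, 1), w)
assert np.allclose(np.roll(y, 1), y_shift)
```

Pooling the feature map (e.g. taking its maximum) then turns this equivariance into invariance of the final response.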
Learning Invariances
[Figure: Hebbian learning (energy E, Hebb rule) on a network with symmetric connections (w_ij = w_ji) defines an acyclic orientation O(H) of the hypercube H, whose vertices are states such as (1,1,1), (1,1,-1), (1,-1,1). The diagram commutes: an isometry of H induces an isometry of O(H), i.e. applying Hebb then I(O(H)) agrees with applying I(H) then Hebb.]
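The symmetry w_ij = w_ji is not an extra assumption once learning is Hebbian: the update Δw_ij ∝ x_i·x_j treats i and j identically, so starting from symmetric (e.g. zero) weights it can only produce symmetric connection matrices. A tiny check, with arbitrary ±1 patterns (sizes are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(5, 8))   # arbitrary +/-1 state patterns

# Hebb rule: Delta w_ij proportional to x_i * x_j, accumulated over the
# patterns; the outer product p p^T is symmetric by construction.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0.0)   # no self-connections

# Symmetric connections come for free: w_ij = w_ji.
assert np.array_equal(W, W.T)
```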
Deep Learning ≈ Deep Targets
Training set: (x_i, y_i) for i = 1, …, m
Deep Target Algorithms
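The figures for these slides are not reproduced in this export. The general deep-targets idea can still be sketched: replace end-to-end training with a loop that (1) chooses explicit targets for the hidden layer and (2) trains each layer as its own shallow supervised problem. Everything below (the sizes, the ridge-regression "shallow learner", the exhaustive search over hidden codes) is a hypothetical toy, not the algorithm from the slides:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
X = rng.normal(size=(8, 4))               # inputs x_i
y = (X.sum(axis=1) > 0).astype(float)     # labels y_i
H = 3                                     # hidden units

def shallow_fit(A, T, lam=1e-3):
    """Ridge least squares: one shallow supervised problem, A -> T."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ T)

W2 = rng.normal(scale=0.1, size=(H, 1))
codes = np.array(list(product([-1.0, 1.0], repeat=H)))  # all 2^H hidden codes

for _ in range(5):
    # 1) Deep targets: for each example, pick the hidden code whose output
    #    through the current top layer is closest to that example's label.
    outs = (codes @ W2)[:, 0]
    T = codes[[int(np.argmin((outs - t) ** 2)) for t in y]]
    # 2) Each layer now trains on its own shallow supervised problem.
    W1 = shallow_fit(X, T)                 # hidden layer -> hidden targets
    W2 = shallow_fit(X @ W1, y[:, None])   # output layer -> labels

pred = ((X @ W1 @ W2)[:, 0] > 0.5).astype(float)
```

The point of the decomposition is that each step is a shallow problem for which any supervised learner would do; the least-squares fit here is just the simplest stand-in.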
• In spite of the vanishing-gradient problem (and the Newton problem), nothing seems to beat backpropagation.
• Is backpropagation biologically plausible?
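The vanishing-gradient problem mentioned above is easy to reproduce: with sigmoid units, each layer multiplies the backpropagated signal by σ′(z) ≤ 1/4, so the gradient norm shrinks roughly geometrically with depth (layer sizes and weight scales below are arbitrary):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
depth, width = 20, 8
x = rng.normal(size=width)
grad = np.ones(width)            # signal arriving from the loss
norms = []
for _ in range(depth):
    W = rng.normal(scale=1.0 / np.sqrt(width), size=(width, width))
    z = W @ x
    x = sigmoid(z)
    s = sigmoid(z) * (1.0 - sigmoid(z))   # sigma'(z) <= 0.25 everywhere
    grad = W.T @ (s * grad)               # one backprop step through the layer
    norms.append(float(np.linalg.norm(grad)))

# After 20 sigmoid layers the gradient norm has all but collapsed.
assert norms[-1] < 1e-3 * norms[0]
```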
Mathematics of Dropout (Cheap Approximation to Training Full Ensemble)
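The "cheap approximation" in the title can be illustrated on a single sigmoid unit: averaging over all 2^n dropout subnetworks is well approximated by one forward pass with the inputs scaled by the retention probability p. The sizes and weight scale below are arbitrary; the enumeration is exact only because n is tiny:

```python
import numpy as np
from itertools import product

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n, p = 10, 0.5
w = 0.2 * rng.normal(size=n)   # small arbitrary weights
x = rng.normal(size=n)

# Exact ensemble: enumerate all 2^n dropout masks and average the unit's
# outputs, weighting each subnetwork by its Bernoulli(p) probability.
masks = np.array(list(product([0.0, 1.0], repeat=n)))
probs = p ** masks.sum(1) * (1 - p) ** (n - masks.sum(1))
ensemble = float(probs @ sigmoid((masks * x) @ w))

# Dropout's test-time shortcut: one pass with expected (p-scaled) inputs.
approx = float(sigmoid(p * (x @ w)))

assert abs(ensemble - approx) < 0.05
```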
Two Questions
1. “If we solve computer vision, we have pretty much solved AI.”
2. A-NNs (artificial neural networks) vs. B-NNs (biological neural networks) and Deep Learning.