GOOGLE BRAIN
Rawan Al-Omari and Zeina Al-Helwani
Damascus ITE, AI Dept. – 2012
ANN Presentation – Under the supervision of Dr. Maisa Abo AlKassem
Infant’s Vision
Google X Lab
Google Research Team
Stanford
Andrew Y. Ng and Jeff Dean
Machine Learning - Andrew Y. Ng
*Machine Learning: a branch of artificial intelligence research concerned with developing learning algorithms.
Open Questions!
Can we simulate these neurons?
If we think of our neural network as simulating a very small-scale “newborn brain” and show it YouTube videos for a week, what will it learn?
Google Brain
LIKE Human Brain!
Previous Work
Supervised Learning
• It uses Labeled Data!
Labeled Data
Learning Process
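The labeled-data idea above can be sketched as a tiny supervised learner. The 2-D feature vectors, the "cat"/"dog" labels, and the nearest-centroid rule are all illustrative assumptions for this sketch, not the project's actual classifier:

```python
import numpy as np

# Toy supervised learning from labeled data: a nearest-centroid
# classifier. Every training example comes with a label.
X_train = np.array([[0.9, 0.8], [1.0, 1.1], [5.0, 4.9], [5.2, 5.1]])
y_train = np.array(["cat", "cat", "dog", "dog"])

# Learning step: compute one centroid per label.
centroids = {label: X_train[y_train == label].mean(axis=0)
             for label in np.unique(y_train)}

def predict(x):
    """Assign the label of the nearest learned centroid."""
    return min(centroids, key=lambda lb: np.linalg.norm(x - centroids[lb]))

print(predict(np.array([1.0, 1.0])))   # a point near the "cat" cluster
```

The labels are what make this *supervised*: remove `y_train` and the centroids cannot be formed, which is exactly the gap unsupervised feature learning tries to fill.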
Labeled VS Unlabeled
Cat Cat
The Research
Why Unlabeled Data?!
• Cost
• Available Data
• Malicious Data
Why Unlabeled Data? - Malicious Data
Guerrilla Kitkat
Unsupervised Feature Learning
Self-Taught Learning
Dataset & Test Set
YouTube: 10,000,000 images
16,000 CPU cores
1 billion connections
ImageNet: 22,000 categories, 16,000,000 images
Training Duration
OVER THREE DAYS!
Image Features
Pixels → Edges → Face Parts (combinations of edges) → Face Detectors
High-level Features
Model
• Autoencoders
• Pooling
• Local Contrast Normalization
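As a rough illustration of the first model component, here is a minimal single-layer autoencoder in NumPy. The layer sizes, the sigmoid nonlinearity, and the random input patch are illustrative assumptions for this sketch, not the paper's actual configuration:

```python
import numpy as np

# Minimal sketch of an autoencoder: encode an input patch into a
# smaller feature vector, decode it back, and measure reconstruction
# error. Training would minimise this error over many patches.
rng = np.random.default_rng(0)

n_input, n_hidden = 64, 16                     # e.g. 8x8 patch -> 16 features
W1 = rng.normal(0, 0.1, (n_hidden, n_input))   # encoder weights
W2 = rng.normal(0, 0.1, (n_input, n_hidden))   # decoder weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def reconstruct(x):
    """Encode a patch into hidden features, then decode it back."""
    h = sigmoid(W1 @ x)          # hidden features in (0, 1)
    return W2 @ h, h

x = rng.normal(0, 1, n_input)    # stand-in for an image patch
x_hat, h = reconstruct(x)
loss = np.mean((x - x_hat) ** 2) # reconstruction error to minimise
```

Because the hidden layer is smaller than the input, the network is forced to learn compact features of the data rather than copy it, and no labels are needed.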
Local Contrast
BEFORE vs. AFTER
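The before/after effect shown here can be sketched as a simple local contrast normalization: subtract the local mean and divide by the local standard deviation in a small window. The window size and epsilon are illustrative choices, not the model's actual parameters:

```python
import numpy as np

def local_contrast_normalize(img, size=3, eps=1e-5):
    """Normalize each pixel by the mean and std of its local window."""
    h, w = img.shape
    r = size // 2
    out = np.zeros_like(img, dtype=float)
    padded = np.pad(img.astype(float), r, mode="reflect")
    for i in range(h):
        for j in range(w):
            window = padded[i:i + size, j:j + size]
            out[i, j] = (img[i, j] - window.mean()) / (window.std() + eps)
    return out

# A perfectly uniform image has no local contrast, so it maps to zeros.
flat = local_contrast_normalize(np.ones((5, 5)))
```

This makes the features robust to lighting changes: only differences relative to a pixel's neighbourhood survive.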
Architecture
Layer 1 (image size 200):
• first sub-layer
• second sub-layer
• third sub-layer
• fourth sub-layer
Layer 1 … Layer 9
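One of the sub-layers in each stage is a pooling step, which shrinks the feature map by summarising small neighbourhoods. As a sketch, here is 2×2 max pooling; max is shown as the simplest common variant, and the pooling function and window size are illustrative assumptions rather than the model's actual choices:

```python
import numpy as np

def max_pool(fmap, size=2):
    """Downsample a feature map by taking the max of each size x size block."""
    h, w = fmap.shape
    h2, w2 = h // size, w // size
    # Split into (h2, size, w2, size) blocks and take each block's max.
    return fmap[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

fmap = np.arange(16).reshape(4, 4)
pooled = max_pool(fmap)          # 4x4 feature map -> 2x2
```

Stacking filtering, pooling, and normalization sub-layers is what lets the upper layers respond to whole objects rather than individual pixels.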
Cats and Faces Detector
Model Parallelism
Asynchronous Parallel Model
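The asynchronous parallel idea, where many workers update shared model parameters without waiting for one another, can be sketched with threads. The toy quadratic loss, learning rate, shard contents, and step counts are all illustrative assumptions:

```python
import threading
import numpy as np

# Toy sketch of asynchronous parallel training: each worker repeatedly
# reads the shared parameters, computes a gradient on its own data
# shard, and pushes an update back with no barrier between workers.
params = np.zeros(4)        # shared model parameters
lock = threading.Lock()     # keeps each toy update atomic

def worker(shard, lr=0.1, steps=50):
    global params
    target = shard.mean(axis=0)
    for _ in range(steps):
        grad = 2 * (params - target)   # gradient of ||params - target||^2
        with lock:
            params -= lr * grad        # asynchronous push

rng = np.random.default_rng(0)
shards = [rng.normal(i + 1, 0.1, (20, 4)) for i in range(3)]
threads = [threading.Thread(target=worker, args=(s,)) for s in shards]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Updates from different workers interleave in whatever order the scheduler produces; tolerating that staleness is what makes the approach scale to thousands of cores.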
Large Scale
Largest network to date
Still far smaller than the human visual cortex (~10^6 times)
Experiments
Google Brain
• 74.8% cat
• 76.7% human body
Best linear filters
• 67.2% cat
• 68.1% human body
OpenCV
• 3% of 100,000 samples
ImageNet results:

Dataset version    2009 (~9M images, ~10K categories)    2011 (~14M images, ~22K categories)
State-of-the-art   16.7%                                  9.3%
Our method         19.2%                                 15.8%
Experiments - Stats
[Bar chart: detection accuracy (0–90%) of Random guess, Best linear filter, and Google Brain on Faces, Human bodies, and Cats]
Conclusion!
• Largest network to date!
• Leading to significant advances in areas such as:
– Machine Vision
– Speech Recognition
– Language Translation
• Google Brain is like the human brain... it may just be a matter of time!
Thank you!