BIL 722: Advanced Topics in Computer Vision
TensorFlow Tutorial
Mehmet Kerim Yücel
Brunel University London
What is TensorFlow?
- Open source software library for machine learning
- Google's second-generation API for ML, after DistBelief
- Source code released on 9 November 2015
- Used in various Google products
  - DeepDream, RankBrain, YouTube, Translate, etc.
- Developed by lead contributors of Caffe & Theano and the Google Brain team
29 February 2016
What is TensorFlow?
- Developed in C++ / Python
- Usage: C++ / Python
  - Python preferred
  - PEP-8 convention & Google Python Style Guide
- Mac OS X / Linux; no support for Windows
  - Docker install for Windows
  - A good lead: https://caffinc.github.io/2015/11/tensorflow-windows/
Where to begin?
- GitHub
https://github.com/tensorflow/tensorflow
- Website
https://www.tensorflow.org/
- Installation
https://www.tensorflow.org/versions/r0.7/get_started/os_setup.html#download-and-setup
- White Paper
http://download.tensorflow.org/paper/whitepaper2015.pdf
TensorFlow Input Formats
Protocol buffers
- Binary Format
- Text Format
An extensive guide on how to “read” data
- Feeding mechanism
- Reading from files
- Preloaded data
https://www.tensorflow.org/versions/r0.7/how_tos/reading_data/index.html#reading-data
TensorFlow Building Blocks
Two major constituents
- Computation Graph (Construction Phase)
- Graph Execution (Execution Phase)
TensorFlow Building Blocks
Computation Graph (Construction Phase)
- Graph nodes: operations (source ops & other ops)
- Tensors: representing data (variables, constants)
- Functions: conv2d, relu, max_pool, matmul, etc.
- Feeding mechanism
  - Use placeholders to declare tensors symbolically
  - Feed them during execution
TensorFlow Building Blocks
Graph Execution (Execution Phase)
- Sessions
- Launch constructed graph (default session, interactive session, etc...)
- Ability to choose devices for certain executions
- Subgraph Execution
- Session parallelization
- Multiple sessions per graph, multiple graphs per session (?)
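The construction/execution split described on the last two slides can be sketched in plain Python. This is a toy model of the idea, not TensorFlow's actual API: nodes are declared symbolically first, and values only flow when the graph is run with a feed dict. It also illustrates subgraph execution, since only the nodes needed for the requested fetch are evaluated.

```python
# Toy model of TensorFlow's two-phase design (NOT the real API):
# construction declares symbolic nodes; execution evaluates them on demand.

class Node:
    def __init__(self, op, inputs=(), name=None):
        self.op, self.inputs, self.name = op, inputs, name

def placeholder(name):
    return Node("placeholder", name=name)

def add(a, b):
    return Node("add", (a, b))

def mul(a, b):
    return Node("mul", (a, b))

def run(fetch, feed_dict):
    """Evaluate only the subgraph needed for `fetch` (subgraph execution)."""
    if fetch.op == "placeholder":
        return feed_dict[fetch]
    args = [run(n, feed_dict) for n in fetch.inputs]
    return args[0] + args[1] if fetch.op == "add" else args[0] * args[1]

# Construction phase: nothing is computed yet.
x = placeholder("x")
y = placeholder("y")
z = mul(add(x, y), y)                 # z = (x + y) * y

# Execution phase: feed the placeholders, fetch a node.
print(run(z, {x: 2, y: 3}))           # -> 15
print(run(add(x, y), {x: 2, y: 3}))   # only that subgraph runs -> 5
```

In real TensorFlow the same roles are played by `tf.placeholder`, ops such as `tf.add`/`tf.matmul`, and `session.run(fetches, feed_dict=...)`.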
TensorFlow – CNN
A simple example
- Slightly modified version of the official TensorFlow tutorial
  https://www.tensorflow.org/versions/r0.7/tutorials/mnist/pros/index.html
- 4-layer CNN with various components
  - Dropout, max pooling, ReLU, convolution
- Adam optimizer minimizing cross entropy
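To make the listed components concrete, here is a minimal NumPy sketch of three of them on a single-channel image: valid 2D convolution (cross-correlation), ReLU, and 2×2 max pooling. TensorFlow's `tf.nn` ops implement batched, multi-channel, GPU-backed versions of the same ideas; the shapes and kernel below are purely illustrative.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2D cross-correlation of a single-channel image with a kernel."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Elementwise max(x, 0)."""
    return np.maximum(x, 0)

def max_pool_2x2(x):
    """Non-overlapping 2x2 max pooling (odd trailing rows/cols are cropped)."""
    H, W = x.shape
    return x[:H - H % 2, :W - W % 2].reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

img = np.arange(16.0).reshape(4, 4)
feat = relu(conv2d(img, np.array([[-1.0, 1.0]])))  # shape (4, 3)
pooled = max_pool_2x2(feat)                        # shape (2, 1)
```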
TensorFlow – CNN (Construction)
- Placeholders for future use
- Data reshaping
- Convolution and max-pooling wrappers
- Variables will be initialized later on
TensorFlow – CNN (Construction)
1. Convolutional layer
2. Convolutional layer
3. Densely connected layer (with dropout and softmax)
4. Cross-entropy minimization via Adam optimizer
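The loss in step 4 can be written out by hand. Below is a NumPy sketch of softmax followed by cross-entropy, the quantity the Adam optimizer minimizes in the tutorial; the example logits and one-hot labels are made up for illustration.

```python
import numpy as np

def softmax(logits):
    # Subtract the row max before exponentiating, for numerical stability.
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, labels):
    """Mean negative log-likelihood; `labels` are one-hot rows."""
    return -np.mean(np.sum(labels * np.log(probs + 1e-12), axis=-1))

logits = np.array([[2.0, 1.0, 0.1]])   # one example, three classes
labels = np.array([[1.0, 0.0, 0.0]])   # true class is class 0
probs = softmax(logits)
loss = cross_entropy(probs, labels)    # small positive number; shrinks as
                                       # probability mass moves to class 0
```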
TensorFlow – CNN (Execution)
1. Session and variable initialization
2. Accuracy definition
3. Train the network
4. Test the network
TensorFlow - Visualization
TensorBoard – Visualization Tool
- Arguably the best monitoring/debugging tool among ML/DL frameworks
- Utilises summaries and name scopes
TensorFlow - Visualization
Modify the presented model
- Train 1000 times, saving values at every 100th iteration
- Name scopes define layers
- Scalar & histogram summaries keep track of values
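The logging schedule described above (record a summary at every 100th of 1000 iterations) is a simple pattern, sketched here in plain Python as a stand-in for the era's summary machinery (`tf.merge_all_summaries()` and `tf.train.SummaryWriter` in r0.7). The training step and variable names below are illustrative, not the tutorial's actual code.

```python
# Plain-Python stand-in for "log every 100th iteration"; in the real
# tutorial, merged summaries are fetched in session.run and written to
# disk by a SummaryWriter for TensorBoard to display.
logged_steps = []

def write_summary(step, value):
    logged_steps.append(step)   # a SummaryWriter would serialize this to disk

loss = 1.0
for step in range(1000):
    loss *= 0.999               # stand-in for one training step
    if step % 100 == 0:         # every 100th iteration
        write_summary(step, loss)

# logged_steps is now [0, 100, 200, ..., 900]
```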
TensorFlow - Visualization
One can change the layers as they wish. Here, we divide the network into layers for modular tracking of weights, biases, loss values and accuracy.
TensorFlow - Visualization
- Merge summaries
- Write the summary of the graph into the given directory
- At every 100th iteration, fetch all summary values & log them
- Launch TensorBoard
TensorFlow - Visualization
[Slides 17–20: TensorBoard screenshots]
TensorFlow – Pros and Cons
Cons:
- No 3D convolution (yet)
- Speed/memory problems
  - CNN benchmarks for popular ImageNet models:
    https://github.com/soumith/convnet-benchmarks
- Not many models around
  - Workaround: convert Caffe models to TensorFlow:
    https://github.com/ethereon/caffe-tensorflow
- Subgraph execution
TensorFlow – Pros and Cons
Pros:
- TensorBoard (debug + visualization)
- Maintained by (quite possibly) the best devs around
- Distributed TensorFlow on its way!
  https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/distributed_runtime/README.md
- A (fairly) good review of related libraries:
  https://github.com/zer0n/deepframeworks
  https://en.wikipedia.org/wiki/Comparison_of_deep_learning_software
TensorFlow – References
- The official documentation is the best source
- Presented code adapted from:
  https://github.com/tensorflow/tensorflow/blob/r0.7/tensorflow/examples/tutorials/mnist/mnist_with_summaries.py
  https://www.tensorflow.org/versions/r0.7/tutorials/mnist/pros/index.html
Q & A
Thank you for listening