Mammography with Inception
Stephen Morrell, Bob Kemp, Karl Kalleberg, Zbigniew Wojna

Posted 10-Jan-2017


Mammography with Inception

Stephen Morrell, Bob Kemp
Karl Kalleberg, Zbigniew Wojna

The Use Case

•  I believe AI will have a larger impact than oil.
•  Coming soon: (better) segmentation, transfer learning, smaller datasets, 3D/4D, unsupervised learning, etc.
•  Breast cancer is the second most common cancer in the West after skin cancer, and a leading cause of death along with cardiac events and strokes. All have wide diagnostic funnels, so they are suitable for automation.
•  Prevalence: 12% of women in the US will develop invasive breast cancer during their lifetime.
•  Low accuracy in mammograms

Technology is a gift of God. After the gift of life it is perhaps the greatest of God's gifts. It is the mother of civilizations, of arts and of sciences.

Freeman Dyson

Source: 2016 American Medical Association

Mammography Accuracy

Histology & micro-calcifications: √   Masses: X

Krupinski & Levenson, 2015

Pigeons Diagnosing Cancer

AlexNet

ImageNet

A. Karpathy on ImageNet: 5.1% (human top-5 error)

http://karpathy.github.io/2014/09/02/what-i-learned-from-competing-against-a-convnet-on-imagenet/
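The 5.1% figure is the top-5 error rate Karpathy estimated for a human annotator on ImageNet. As a sketch (function and variable names are ours, not from the talk), top-5 error can be computed like this:

```python
def top5_error(predictions, labels):
    """Fraction of examples whose true label is missing from the
    five highest-scoring predicted classes."""
    errors = 0
    for scores, label in zip(predictions, labels):
        # indices of the five largest scores, highest first
        top5 = sorted(range(len(scores)), key=lambda i: scores[i],
                      reverse=True)[:5]
        if label not in top5:
            errors += 1
    return errors / len(labels)

# toy example with six classes: the first label (3) is inside its top
# five; the second label (2) has the single lowest score, so it is an error
preds = [[0.10, 0.50, 0.05, 0.20, 0.05, 0.10],
         [0.30, 0.20, 0.01, 0.15, 0.14, 0.20]]
print(top5_error(preds, [3, 2]))  # 0.5
```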

Dream Challenge

•  Dataset 641k mammograms, 87k patients, panel data. Other datasets for pre-training

•  Infrastructure GPUs 2 x NVIDIA K80s Docker

•  Timing Competitive phase toMarch 2017 then community phase

•  Project risks 1.  Infrastructure & Network Development 2.  Do mammograms contain enough info per se. Will they be superceded 3.  Can we develop learning algorithms which classify accurately given limited

data. Dream has 0.5% = 3k labelled cancerous images.

•  Opportunity familiarity with ConvNets & benchmarking. Tech could be applied to other fields

•  $1,000,000 prizes
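With only ~0.5% of images labelled cancerous, a loss computed on raw counts is dominated by the negative class. One standard mitigation, shown here as our own illustration rather than the approach actually used in the challenge, is inverse-frequency class weighting:

```python
def class_weights(counts):
    """Inverse-frequency weights, scaled so that a perfectly balanced
    dataset would give every class a weight of 1."""
    total = sum(counts.values())
    n_classes = len(counts)
    return {c: total / (n_classes * k) for c, k in counts.items()}

# hypothetical split consistent with the slide's figures:
# ~3k cancerous images out of 641k mammograms
w = class_weights({'cancer': 3_000, 'benign': 638_000})
# each cancerous example is weighted ~106.8, each benign example ~0.5
```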

Dream Challenge - Infrastructure

TensorFlow & Inception

Powerful Learning Framework
1.  Gradient descent and automated differentiation
2.  Regularisation: weight decay, batch normalisation, dropout
3.  Analytics and visualisation with TensorBoard:
    a.  Histograms of losses, activations, gradients
    b.  Network visualisation

Fast Infrastructure
1.  Data preprocessing infrastructure
2.  Multi-GPUs and machines
3.  Multi-threading
4.  Hyperparameter tuning:
    a.  Layer & activation specification
    b.  Learning rate schedules
    c.  Batch normalisation
    d.  etc.
5.  etc.
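Of the regularisers listed, weight decay is the simplest to show concretely. A minimal NumPy sketch (ours, not the talk's code) of one SGD step with L2 weight decay:

```python
import numpy as np

def sgd_step_weight_decay(w, grad, lr=0.1, decay=0.01):
    """One SGD step with L2 weight decay:
    w <- w - lr * (grad + decay * w)."""
    return w - lr * (grad + decay * w)

w = np.array([1.0, -2.0])
# with a zero gradient, the decay term alone shrinks every weight
# by a factor of (1 - lr * decay) = 0.999 per step
print(sgd_step_weight_decay(w, np.zeros(2)))  # [ 0.999 -1.998]
```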

# 1 Define variables (one weight matrix and bias vector per layer)
weights = tf.Variable(
    tf.truncated_normal([IMAGE_PIXELS, hidden1_units],
                        stddev=1.0 / math.sqrt(float(IMAGE_PIXELS))),
    name='weights')
biases = tf.Variable(tf.zeros([hidden1_units]), name='biases')
weights2 = tf.Variable(
    tf.truncated_normal([hidden1_units, NUM_CLASSES],
                        stddev=1.0 / math.sqrt(float(hidden1_units))),
    name='weights2')
biases2 = tf.Variable(tf.zeros([NUM_CLASSES]), name='biases2')

# 2 Hidden layer definition
hidden1 = tf.nn.relu(tf.matmul(images, weights) + biases)
logits = tf.matmul(hidden1, weights2) + biases2

# 3 Final layer & loss
cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(
    logits, labels, name='xentropy')
loss = tf.reduce_mean(cross_entropy, name='xentropy_mean')

# 4 Add training op
train_op = optimizer.minimize(loss, global_step=global_step)

Graph Construction
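What the graph above computes can be checked against a NumPy re-implementation (a sketch with our own names, not part of the original slides): a ReLU hidden layer, linear logits, then mean sparse softmax cross-entropy.

```python
import numpy as np

def forward_loss(images, w1, b1, w2, b2, labels):
    """NumPy equivalent of the TensorFlow graph: relu(x @ w1 + b1),
    logits = hidden @ w2 + b2, mean sparse softmax cross-entropy."""
    hidden = np.maximum(0.0, images @ w1 + b1)
    logits = hidden @ w2 + b2
    # numerically stable log-softmax
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # negative log-probability of each example's true class
    nll = -log_probs[np.arange(len(labels)), labels]
    return nll.mean()

# sanity check: all-zero weights give uniform logits, so the loss
# over k classes is log(k)
loss = forward_loss(np.zeros((2, 3)), np.zeros((3, 4)), np.zeros(4),
                    np.zeros((4, 2)), np.zeros(2), np.array([0, 1]))
print(round(loss, 4))  # 0.6931
```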

# Initialise
with tf.Session() as sess:
    init = tf.initialize_all_variables()
    sess.run(init)

    # Training loop
    for step in xrange(FLAGS.max_steps):
        _, loss_value = sess.run([train_op, loss])

    # Evaluation
    true_count += sess.run(eval_correct, feed_dict=feed_dict)

Init, Train, Eval
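The session loop above reduces to: repeatedly run the update, watch the loss. A dependency-free sketch of the same loop on a toy objective (entirely illustrative, with our own names):

```python
def train(grad_fn, w, lr=0.1, max_steps=100):
    """Minimal stand-in for the session loop: apply the gradient
    update max_steps times and return the final parameter."""
    for step in range(max_steps):
        w = w - lr * grad_fn(w)
    return w

# minimise (w - 3)^2, whose gradient is 2 * (w - 3)
w_final = train(lambda w: 2.0 * (w - 3.0), w=0.0)
print(round(w_final, 6))  # 3.0
```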

GPU, Queue, Combine

•  Benefits of TFRecords
•  Starting QueueRunners
•  Memory ‘leaks’
•  Variable name clashes
•  Dummy & simple networks
•  Transfer learning
•  Memory use scaling

Infrastructure Lessons

Questions