CS230 Deep Learning
cs230.stanford.edu/projects_winter_2019/reports/15808060.pdf


… 100, dropout of 0.2 and number of epoch 50. We train the model with Adam optimizer of learning rate …
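
The excerpt pins down only a few hyperparameters: a value of 100 (read here as a hidden-layer width, which is an assumption), dropout of 0.2, 50 training epochs, and the Adam optimizer; the learning-rate value is cut off in the source. Below is a minimal Keras sketch of such a training setup. The architecture, input shape, loss, learning rate, and data are all placeholders, not details from the report.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical model: only dropout=0.2, epochs=50, and the Adam
# optimizer are stated in the excerpt; everything else is assumed.
model = keras.Sequential([
    layers.Input(shape=(32,)),              # input size: placeholder
    layers.Dense(100, activation="relu"),   # 100 units: assumed reading of "100"
    layers.Dropout(0.2),                    # dropout of 0.2 (from the excerpt)
    layers.Dense(1, activation="sigmoid"),  # output head: placeholder
])

model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=1e-3),  # 1e-3 is hypothetical;
                                                          # the excerpt truncates the value
    loss="binary_crossentropy",             # loss: placeholder
    metrics=["accuracy"],
)

# Dummy data so the sketch runs end to end.
x = np.random.rand(256, 32).astype("float32")
y = np.random.randint(0, 2, size=(256, 1)).astype("float32")

model.fit(x, y, epochs=50, batch_size=32)   # 50 epochs (from the excerpt)

The full report would substitute its own architecture and the learning-rate value that the excerpt cuts off.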