https://storage.googleapis.com/amy-jo/talks/tf-workshop.pdf
https://home.zhaw.ch/~dueo/bbs/files/TF_Introduction.pdf
Basic CNN Knowledge: https://courses.cognitiveclass.ai/courses/course-v1:BigDataUniversity+ML0115EN+2016/courseware/407a9f86565c44189740699636b4fb85/980c8dd6f14744a88c4731e54bad4b5b/
Best PPT to understand MNIST: https://scottylabs.org/crashcourse/assets/tensorflow/deep-learning-with-tensorflow.pdf
Google presentation of TensorFlow: https://www.matroid.com/scaledml/slides/jeff.pdf
http://web.donga.ac.kr/yjko/usefulthings/TensorFlow-Basic-Concept_Ko.pdf
Paper on Dropout in Deep learning: https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf
http://cs231n.github.io/convolutional-networks/
http://www2.ift.ulaval.ca/~chaib/IFT-4102-7025/public_html/Fichiers/Machine_Learning_in_Action.pdf
https://arxiv.org/pdf/1610.01178.pdf
TensorFlow white paper by Google: http://download.tensorflow.org/paper/whitepaper2015.pdf
We can save our model, i.e. the weights and biases from the final training step, with this API and load it later to predict on new data, which I found interesting.
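A minimal sketch of that save/restore pattern, assuming TensorFlow is installed; it uses `tf.compat.v1` and `tf.train.Saver` (the TF 1.x checkpoint API from the era of these resources), with a toy one-weight, one-bias "model" and the hypothetical checkpoint path `/tmp/model.ckpt` standing in for a real trained network:

```python
import tensorflow.compat.v1 as tf  # assumes TensorFlow is installed

tf.disable_eager_execution()  # Saver works with the graph/session API

# Toy "model": one weight vector and one bias, as in a trained layer.
W = tf.get_variable("W", initializer=tf.constant([1.0, 2.0]))
b = tf.get_variable("b", initializer=tf.constant(0.5))
saver = tf.train.Saver()  # by default saves all variables

# Training would happen here; afterwards we checkpoint the variables.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    ckpt_path = saver.save(sess, "/tmp/model.ckpt")

# Later (e.g. in a prediction script): restore the saved values
# into a fresh session instead of re-training.
with tf.Session() as sess:
    saver.restore(sess, ckpt_path)
    restored_W, restored_b = sess.run([W, b])
```

After `restore`, `restored_W` and `restored_b` hold the checkpointed values, so the same graph can be used directly for predicting new data.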