Tensorflow: second book
A while back, I posted about my interest in Tensorflow and planned a few brief follow-up impressions of some books I bought. This was soon followed by a first posting on "Tensorflow for Deep Learning", with a follow-up on a CNN for a digit recognition example from that book. After that, the distractions of life and work took over, but now I am ready to continue these postings. Since then, I have even bought a few new books!
I am really enthusiastic about the second book, "Learning TensorFlow" by Tom Hope, Yehezkel S. Resheff, and Itay Lieder. Unlike the first book, which was informative but a bit sparse on details, this book explains the details of constructing and running a TensorFlow computation graph really well. After the first three chapters, I was quite comfortable with the fundamental building blocks of TensorFlow and experimented with some different setups. Particularly helpful is that the book shows a toy graph first to explain the underlying principles, before moving on to more realistic and elaborate examples.
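The principle behind such a toy graph — define all nodes first, then evaluate the graph as a whole — can be mimicked in a few lines of plain Python. To be clear, this sketch is my own illustration of the idea, not TensorFlow code (TensorFlow 1.x expresses the same thing with `tf.constant`, operations, and a `tf.Session`):

```python
# A toy "computation graph" in plain Python, illustrating the principle:
# first build nodes that describe the computation, then run the graph.

class Node:
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs

    def run(self):
        # Recursively evaluate dependencies, then apply this node's op.
        return self.op(*(n.run() for n in self.inputs))

def const(v):
    return Node(lambda: v)

# Graph for e = (a * b) - (a + b), with a = 5 and b = 3.
a, b = const(5), const(3)
c = Node(lambda x, y: x * y, a, b)  # multiply node
d = Node(lambda x, y: x + y, a, b)  # add node
e = Node(lambda x, y: x - y, c, d)  # subtract node

print(e.run())  # 7
```

Nothing is computed until `e.run()` is called — exactly the deferred-execution style the book's early chapters walk through.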
All the examples in the book can be downloaded from the accompanying website, which makes running and modifying the examples while reading about them very easy. For example, here is the tensor output of a simple softmax regression for digit classification on the MNIST dataset for a single digit. The position of the maximum value corresponds to the recognized digit, 9 in this case.
[-2.2436407 -8.0659075 -1.4967757 -0.9444653 3.0563385
-0.29942074 -1.7668717 2.8794923 2.2567298 6.6245046 ] -> [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
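The mapping from this logit vector to the recognized digit can be checked with a small NumPy sketch (NumPy is my choice here for brevity; the book itself does this inside the TensorFlow graph): softmax turns the logits into probabilities, and the argmax gives the predicted class.

```python
import numpy as np

# Logit output of the softmax-regression example for one MNIST digit.
logits = np.array([-2.2436407, -8.0659075, -1.4967757, -0.9444653,
                   3.0563385, -0.29942074, -1.7668717, 2.8794923,
                   2.2567298, 6.6245046])

# Softmax: exponentiate and normalize into a probability distribution.
probs = np.exp(logits) / np.sum(np.exp(logits))

# The position of the largest value is the recognized digit.
print(np.argmax(probs))  # 9
```

Since softmax is monotonic, taking the argmax of the raw logits gives the same answer.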
Later chapters go into much more detail on CNNs (convolutional neural networks), enhancing the digit recognition and classifying the CIFAR10 dataset, and on RNNs (recurrent neural networks), as well as some important extensions and enhancements of TensorFlow. There is even a full chapter on deployment, or "inference", of a neural network after training, which is of course the ultimate objective of machine learning.
As the title already says, all in all I can warmly recommend this book for learning TensorFlow!
[Image: CIFAR10 samples]