Deep Learning with TensorFlow : Explore neural networks and build intelligent systems with Python, 2nd Edition.
Compatible with TensorFlow 1.7, this book introduces the core concepts of deep learning, with implementation and research details on cutting-edge architectures that you can apply to your own projects. Develop your knowledge of deep neural networks through hands-on model building and examples...
Online Access: | Full text (MCPHS users only) |
Format: | Electronic eBook |
Language: | English |
Published: | Birmingham : Packt Publishing, 2018 |
Edition: | 2nd ed. |
Local Note: | ProQuest Ebook Central |
Table of Contents:
- Cover; Copyright; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Getting Started with Deep Learning; A soft introduction to machine learning; Supervised learning; Unbalanced data; Unsupervised learning; Reinforcement learning; What is deep learning?; Artificial neural networks; The biological neurons; The artificial neuron; How does an ANN learn?; ANNs and the backpropagation algorithm; Weight optimization; Stochastic gradient descent; Neural network architectures; Deep Neural Networks (DNNs); Multilayer perceptron; Deep Belief Networks (DBNs).
- Convolutional Neural Networks (CNNs); AutoEncoders; Recurrent Neural Networks (RNNs); Emergent architectures; Deep learning frameworks; Summary; Chapter 2: A First Look at TensorFlow; A general overview of TensorFlow; What's new in TensorFlow v1.6?; Nvidia GPU support optimized; Introducing TensorFlow Lite; Eager execution; Optimized Accelerated Linear Algebra (XLA); Installing and configuring TensorFlow; TensorFlow computational graph; TensorFlow code structure; Eager execution with TensorFlow; Data model in TensorFlow; Tensor; Rank and shape; Data type; Variables; Fetches.
- Feeds and placeholders; Visualizing computations through TensorBoard; How does TensorBoard work?; Linear regression and beyond; Linear regression revisited for a real dataset; Summary; Chapter 3: Feed-Forward Neural Networks with TensorFlow; Feed-forward neural networks (FFNNs); Feed-forward and backpropagation; Weights and biases; Activation functions; Using sigmoid; Using tanh; Using ReLU; Using softmax; Implementing a feed-forward neural network; Exploring the MNIST dataset; Softmax classifier; Implementing a multilayer perceptron (MLP); Training an MLP; Using MLPs; Dataset description.
- Preprocessing; A TensorFlow implementation of MLP for client-subscription assessment; Deep Belief Networks (DBNs); Restricted Boltzmann Machines (RBMs); Construction of a simple DBN; Unsupervised pre-training; Supervised fine-tuning; Implementing a DBN with TensorFlow for client-subscription assessment; Tuning hyperparameters and advanced FFNNs; Tuning FFNN hyperparameters; Number of hidden layers; Number of neurons per hidden layer; Weight and biases initialization; Selecting the most suitable optimizer; GridSearch and randomized search for hyperparameters tuning; Regularization.
- Dropout optimization; Summary; Chapter 4: Convolutional Neural Networks; Main concepts of CNNs; CNNs in action; LeNet5; Implementing a LeNet-5 step by step; AlexNet; Transfer learning; Pretrained AlexNet; Dataset preparation; Fine-tuning implementation; VGG; Artistic style learning with VGG-19; Input images; Content extractor and loss; Style extractor and loss; Merger and total loss; Training; Inception-v3; Exploring Inception with TensorFlow; Emotion recognition with CNNs; Testing the model on your own image; Source code; Summary; Chapter 5: Optimizing TensorFlow Autoencoders.