1. INTRODUCTION TO DEEP LEARNING AND NEURAL NETWORKS
1.1 The field of Machine Learning and its impact on Artificial Intelligence
1.2 The benefits of Machine Learning compared with traditional methodologies
1.3 Introduction to Deep Learning and how it differs from other Machine Learning methods
1.4 Classification and regression in supervised learning
1.5 Clustering and association in unsupervised learning, and the algorithms commonly used in each
1.6 Introduction to AI and neural networks
1.7 Machine Learning concepts
1.8 Supervised learning with neural networks
1.9 Fundamentals of statistics, hypothesis testing, probability distributions, and hidden Markov models
2. MULTI-LAYERED NEURAL NETWORKS
2.1 Multi-layered networks introduction, regularization, and deep neural networks
2.2 Multi-layer perceptron
2.3 Overfitting and capacity
2.4 Neural network hyperparameters and logic gates
2.5 Activation functions used in neural networks, including ReLU, Softmax, Sigmoid, and tanh (hyperbolic tangent)
2.6 Backpropagation, forward propagation, convergence, hyperparameters, and overfitting
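The activation functions named in 2.5 can be sketched in plain NumPy. This is an illustrative example, not code from the course materials:

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x), applied element-wise
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid squashes inputs into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent squashes inputs into (-1, 1)
    return np.tanh(x)

def softmax(x):
    # Softmax turns a score vector into a probability distribution;
    # subtracting the max first improves numerical stability
    e = np.exp(x - np.max(x))
    return e / e.sum()

scores = np.array([-1.0, 0.0, 2.0])
print(relu(scores))           # [0. 0. 2.]
print(softmax(scores))        # three probabilities summing to 1
```

Softmax is typically reserved for the output layer of a classifier, while ReLU is the common default for hidden layers.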
3. ARTIFICIAL NEURAL NETWORKS AND VARIOUS METHODS
3.1 Various methods that are used to train artificial neural networks
3.2 Perceptron learning rule, gradient descent rule, tuning the learning rate, regularization techniques, and optimization techniques
3.3 Stochastic processes, vanishing gradients, transfer learning, and regression techniques
3.4 L1 (Lasso) and L2 (Ridge) regularization, unsupervised pre-training, and Xavier initialization
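The gradient descent rule, learning-rate tuning, and L2 (Ridge) regularization covered in this section can be combined in a small NumPy sketch, shown here on a toy linear regression rather than a full neural network:

```python
import numpy as np

# Toy data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 2.0 * X + 1.0 + rng.normal(0, 0.05, size=100)

w, b = 0.0, 0.0   # parameters to learn
lr = 0.1          # learning rate, a key hyperparameter to tune
lam = 1e-3        # L2 (Ridge) penalty strength

for _ in range(500):
    err = (w * X + b) - y
    # Gradients of mean squared error plus the L2 penalty lam * w**2
    grad_w = 2 * np.mean(err * X) + 2 * lam * w
    grad_b = 2 * np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0
```

An L1 (Lasso) penalty would instead add `lam * np.sign(w)` to the gradient, which drives small weights exactly to zero.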
4. DEEP LEARNING LIBRARIES
4.1 Understanding how Deep Learning works
4.2 Activation functions, illustrating the perceptron, and perceptron training
4.3 Multi-layer perceptron and the key parameters of a perceptron
4.4 Introduction to TensorFlow, the open-source software library used to design, build, and train Deep Learning models
4.5 Google’s Tensor Processing Unit (TPU), a programmable AI accelerator
4.6 Python libraries in TensorFlow, code basics, variables, constants, and placeholders
4.7 Graph visualization, use-case implementation, Keras, and more
5. KERAS API
5.1 Keras as a high-level neural network API running on top of TensorFlow
5.2 Defining complex multi-output models
5.3 Composing models using Keras
5.4 Sequential and functional composition and batch normalization
5.5 Deploying Keras with TensorBoard and customizing the neural network training process
6. TFLEARN API FOR TENSORFLOW
6.1 Using TFLearn API to implement neural networks
6.2 Defining and composing models and deploying TensorBoard
7. DNNS (DEEP NEURAL NETWORKS)
7.1 Mapping the human mind with deep neural networks (DNNs)
7.2 Several building blocks of artificial neural networks (ANNs)
7.3 The architecture of DNN and its building blocks
7.4 Reinforcement learning in DNN concepts, various parameters and layers, optimization algorithms in DNNs, and activation functions
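The DNN architecture and building blocks in 7.3 can be sketched as a stack of layers, each an affine transform followed by an activation function. This NumPy forward pass is an illustrative example only; the layer sizes are arbitrary:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def forward(x, layers):
    # Forward pass through a deep network: every hidden layer applies
    # an affine transform then ReLU; the final layer stays linear.
    *hidden, last = layers
    for W, b in hidden:
        x = relu(x @ W + b)
    W, b = last
    return x @ W + b

rng = np.random.default_rng(0)
sizes = [4, 8, 8, 3]  # input dim, two hidden layers, output dim
layers = [(rng.normal(0, 0.1, (m, n)), np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

out = forward(rng.normal(size=(5, 4)), layers)  # batch of 5 inputs
print(out.shape)  # (5, 3)
```

Training would add backpropagation of gradients through these same layers, typically driven by one of the optimization algorithms listed in 7.4.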
8. CNNS (CONVOLUTIONAL NEURAL NETWORKS)
8.1 What is a convolutional neural network?
8.2 Understanding the architecture and use cases of CNN
8.3 What is a pooling layer, and how can a CNN be visualized?
8.4 How to fine-tune a convolutional neural network?
8.5 What is transfer learning?
8.6 Understanding kernel filters, feature maps, and pooling, and deploying convolutional neural networks in TensorFlow
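The kernel filter, feature map, and pooling concepts in 8.6 can be shown with a minimal NumPy sketch (an illustrative example, not course code; the edge filter is a made-up kernel):

```python
import numpy as np

def conv2d(img, kernel):
    # "Valid" cross-correlation of a 2D image with a kernel (no padding);
    # the result is the feature map
    kh, kw = kernel.shape
    h = img.shape[0] - kh + 1
    w = img.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(fmap, size=2):
    # Non-overlapping max pooling: keep the largest value per window
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

img = np.arange(36, dtype=float).reshape(6, 6)
edge = np.array([[1.0, -1.0]])   # a simple horizontal edge filter
fmap = conv2d(img, edge)         # feature map, shape (6, 5)
pooled = max_pool(fmap)          # downsampled to shape (3, 2)
print(fmap.shape, pooled.shape)
```

In TensorFlow the same operations are provided as optimized layers, so loops like these are never written by hand in practice.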
9. RNNS (RECURRENT NEURAL NETWORKS)
9.1 Introduction to the RNN model
9.2 Use cases of RNN and modeling sequences
9.3 RNNs with backpropagation
9.4 Long short-term memory (LSTM)
9.5 Recursive neural tensor network theory, the basic RNN cell, unfolded RNN, and dynamic RNN
9.6 Time-series predictions
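The basic RNN cell and its unfolding over time (9.5) can be sketched in NumPy. This is an illustrative example with arbitrary dimensions, not code from the course:

```python
import numpy as np

def rnn_forward(xs, Wx, Wh, b):
    # Basic RNN cell unfolded over the sequence:
    # h_t = tanh(x_t @ Wx + h_{t-1} @ Wh + b)
    h = np.zeros(Wh.shape[0])
    states = []
    for x in xs:
        h = np.tanh(x @ Wx + h @ Wh + b)
        states.append(h)
    return np.array(states)

rng = np.random.default_rng(0)
input_dim, hidden_dim, steps = 3, 5, 7
Wx = rng.normal(0, 0.1, (input_dim, hidden_dim))
Wh = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
b = np.zeros(hidden_dim)

xs = rng.normal(size=(steps, input_dim))  # a length-7 input sequence
states = rnn_forward(xs, Wx, Wh, b)       # one hidden state per step
print(states.shape)  # (7, 5)
```

Because the same `Wh` is multiplied in at every step, gradients flowing back through long sequences can vanish or explode, which is what motivates the LSTM cell in 9.4.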
10. GPU IN DEEP LEARNING
10.1 Introduction to GPUs, how they differ from CPUs, and their significance
10.2 Deep Learning networks and forward pass and backward pass training techniques
10.3 GPU architecture: many simpler cores and concurrent hardware
11. AUTOENCODERS AND RESTRICTED BOLTZMANN MACHINE (RBM)
11.1 Introduction to RBM and autoencoders
11.2 Deploying RBM for deep neural networks and using RBM for collaborative filtering
11.3 Features and applications of autoencoders
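The encode/decode structure of an autoencoder (11.1) can be sketched with untrained random weights in NumPy; the dimensions here are arbitrary, and a real autoencoder would be trained to minimize reconstruction error:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_in, n_hidden = 8, 3  # the bottleneck is smaller than the input

W_enc = rng.normal(0, 0.1, (n_in, n_hidden))   # encoder weights
W_dec = rng.normal(0, 0.1, (n_hidden, n_in))   # decoder weights

def autoencode(x):
    # Encoder compresses x to a 3-dim code; decoder reconstructs
    # an 8-dim output from that code
    code = sigmoid(x @ W_enc)
    recon = sigmoid(code @ W_dec)
    return code, recon

x = rng.normal(size=n_in)
code, recon = autoencode(x)
print(code.shape, recon.shape)  # (3,) (8,)
```

The learned low-dimensional code is what makes autoencoders useful for compression and feature extraction; RBMs play a related role as pre-training blocks for deep networks (11.2).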