Coursera – Neural Networks and Deep Learning
Week 1
- ReLU function: Rectified Linear Unit
- Convolutional Neural Networks (CNNs) are mostly used in image/visual applications.
- Recurrent Neural Networks (RNNs) are mostly used for one-dimensional time-series or temporal sequence data.
m = size of the training set, i.e., the number of training examples.
Week 2
- When implementing a neural network, you usually want to process your entire training set without an explicit for loop over the m training examples (see the vectorization sketch after this list).
- During the computation of a neural network, you perform a forward pass (forward propagation step) followed by a backward pass (backward propagation step).
- In the forward pass, we compute the output of the neural network; in the backward pass, we compute gradients/derivatives.
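A minimal sketch of the vectorization idea in NumPy (the array shapes, sizes, and variable names here are my own illustrative assumptions, not from the lectures): a single matrix product replaces the explicit for loop over the m training examples.

```python
import numpy as np

# Illustrative sizes (assumed): n_x input features, m training examples.
n_x, m = 4, 1000
X = np.random.randn(n_x, m)   # training examples stacked as columns
w = np.random.randn(n_x, 1)   # weight column vector
b = 0.5                       # bias (scalar)

# Explicit for loop over all m training examples -- the thing to avoid.
z_loop = np.zeros((1, m))
for i in range(m):
    z_loop[0, i] = np.dot(w[:, 0], X[:, i]) + b

# Vectorized: one matrix product computes z for every example at once.
z_vec = np.dot(w.T, X) + b    # shape (1, m)

assert np.allclose(z_loop, z_vec)
```

Beyond shorter code, the vectorized version lets NumPy use optimized linear-algebra routines, which is typically much faster than the Python-level loop.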
Logistic Regression
Logistic Regression is an algorithm for binary classification.
[Refer to Week 2 lecture material: C1_W2.pdf]
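Tying the two Week 2 ideas together, here is a hedged sketch (my own minimal implementation, not the lecture's code) of one forward pass and one backward pass of logistic regression; the function name propagate and the shape conventions below are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def propagate(w, b, X, Y):
    """One forward + backward pass of logistic regression.
    Assumed shapes: w (n_x, 1), X (n_x, m), Y (1, m), labels in {0, 1}."""
    m = X.shape[1]

    # Forward pass: predictions A and the binary cross-entropy cost.
    A = sigmoid(np.dot(w.T, X) + b)                           # (1, m)
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))

    # Backward pass: gradients of the cost w.r.t. w and b.
    dZ = A - Y                                                # (1, m)
    dw = np.dot(X, dZ.T) / m                                  # (n_x, 1)
    db = np.sum(dZ) / m                                       # scalar
    return cost, dw, db

# Tiny usage example on random data (illustrative only).
rng = np.random.default_rng(0)
w, b = np.zeros((3, 1)), 0.0
X = rng.standard_normal((3, 5))
Y = (rng.random((1, 5)) > 0.5).astype(float)
cost, dw, db = propagate(w, b, X, Y)
print(cost, dw.ravel(), db)
```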
Week 4
Parameters vs Hyperparameters
- Parameters: weights (w) and biases (b)
- Hyperparameters (parameters that control w and b):
- learning rate (α)
- number of iterations (in the learning algorithm)
- number of hidden layers (in a neural network)
- number of hidden units (units at each layer)
- type of activation function (e.g., ReLU, sigmoid, tanh), especially at hidden layers
This list of hyperparameters is not exhaustive. You often need to try many possible settings for these hyperparameters while training a neural network; a minimal trial loop is sketched below.
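As a hedged illustration of that trial-and-error process (toy data and arbitrary candidate values, not course recommendations), this sketch trains the logistic regression model from the Week 2 notes with several learning rates and compares the final training cost:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Candidate learning rates to try (arbitrary illustrative values).
learning_rates = [0.5, 0.1, 0.01, 0.001]
num_iterations = 200  # another hyperparameter, fixed here

# Toy binary-classification data (assumed for this sketch).
rng = np.random.default_rng(1)
X = rng.standard_normal((3, 50))                      # 3 features, 50 examples
Y = (X.sum(axis=0, keepdims=True) > 0).astype(float)  # labels in {0, 1}
m = X.shape[1]

for lr in learning_rates:
    w, b = np.zeros((3, 1)), 0.0
    for _ in range(num_iterations):
        A = sigmoid(np.dot(w.T, X) + b)               # forward pass
        A = np.clip(A, 1e-12, 1 - 1e-12)              # guard against log(0)
        cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
        dZ = A - Y                                    # backward pass
        w -= lr * np.dot(X, dZ.T) / m                 # gradient descent updates
        b -= lr * np.sum(dZ) / m
    print(f"learning rate {lr}: final cost {cost:.4f}")
```

In practice you would compare such settings on held-out validation data rather than the training cost alone.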