Week 4 Quiz - Deep Neural Networks
1. We use the "cache" in our implementation of forward and backward propagation to pass useful values to the next layer in the forward propagation. True/False?
True
False
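The role of the cache is easiest to see in code. Below is a minimal NumPy sketch (not the course's official implementation; the layer sizes and function names are illustrative assumptions) in which the forward step stores the values that the backward step for the same layer will reuse:

```python
import numpy as np

def layer_forward(A_prev, W, b):
    """One forward step: returns the activation and a cache for backprop."""
    Z = W @ A_prev + b           # linear part
    A = np.maximum(0, Z)         # ReLU activation
    cache = (A_prev, W, Z)       # saved for the backward pass of this layer
    return A, cache

def layer_backward(dA, cache):
    """One backward step: reuses the cached forward values."""
    A_prev, W, Z = cache
    m = A_prev.shape[1]
    dZ = dA * (Z > 0)            # ReLU derivative
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ           # gradient passed back to the previous layer
    return dA_prev, dW, db

# Illustrative sizes: 3 inputs, 4 hidden units, 5 examples.
A0 = np.random.randn(3, 5)
W1, b1 = np.random.randn(4, 3) * 0.01, np.zeros((4, 1))
A1, cache1 = layer_forward(A0, W1, b1)
dA0, dW1, db1 = layer_backward(np.random.randn(4, 5), cache1)
```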
2. Which of the following are "parameters" of a neural network? (Check all that apply.)
The weight matrices
The bias vectors
The number of layers of the neural network
The activation functions
3. Which of the following statements is true?
The deeper layers of a neural network are typically computing more complex features of the input than the earlier layers.
The earlier layers of a neural network are typically computing more complex features of the input than the deeper layers.
4. We cannot use vectorization to calculate the derivatives in backpropagation; we must use a for-loop over all the examples. True/False?
True
False
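As a quick illustration (the dimensions here are illustrative assumptions, not from the quiz), the gradient $dW = \frac{1}{m}\, dZ\, A^{[l-1]T}$ computed with an explicit loop over examples matches a single vectorized matrix product:

```python
import numpy as np

m = 1000                               # number of examples (illustrative)
A_prev = np.random.randn(3, m)         # activations from the previous layer
dZ = np.random.randn(4, m)             # gradient of this layer's linear output

# Per-example loop: accumulate one outer product per training example.
dW_loop = np.zeros((4, 3))
for i in range(m):
    dW_loop += np.outer(dZ[:, i], A_prev[:, i])
dW_loop /= m

# Vectorized: one matrix product over all m examples at once.
dW_vec = (dZ @ A_prev.T) / m

assert np.allclose(dW_loop, dW_vec)
```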
5. Assume we store the values for $n^{[l]}$ in an array called layer_dims, as follows: layer_dims = [$n_x$, 4, 3, 2, 1]. So layer 1 has 4 hidden units, layer 2 has 3 hidden units, and so on. Which of the following for-loops will allow you to initialize the parameters for the model?

```python
for i in range(len(layer_dims) - 1):
    parameter['W' + str(i + 1)] = np.random.randn(layer_dims[i + 1], layer_dims[i]) * 0.01
    parameter['b' + str(i + 1)] = np.random.randn(layer_dims[i + 1], 1) * 0.01
```

...

Correct. This iterates over 0, 1, 2 and 3, assigning $W^{[i+1]}$ the shape (layer_dims[i+1], layer_dims[i]) $= (n^{[i+1]}, n^{[i]})$.
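As a sanity check of that loop (plugging in an assumed input size $n_x = 5$, which is not given in the quiz), printing the resulting shapes shows each $W^{[l]}$ ends up $(n^{[l]}, n^{[l-1]})$ and each $b^{[l]}$ ends up $(n^{[l]}, 1)$:

```python
import numpy as np

layer_dims = [5, 4, 3, 2, 1]   # 5 stands in for n_x; it is an assumption for illustration
parameter = {}
for i in range(len(layer_dims) - 1):
    parameter['W' + str(i + 1)] = np.random.randn(layer_dims[i + 1], layer_dims[i]) * 0.01
    parameter['b' + str(i + 1)] = np.random.randn(layer_dims[i + 1], 1) * 0.01

for l in range(1, len(layer_dims)):
    print('W' + str(l), parameter['W' + str(l)].shape, '| b' + str(l), parameter['b' + str(l)].shape)
# W1 (4, 5) | b1 (4, 1)
# W2 (3, 4) | b2 (3, 1)
# W3 (2, 3) | b3 (2, 1)
# W4 (1, 2) | b4 (1, 1)
```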
6. Consider the following neural network:
What are all the values of $n^{[0]}$, $n^{[1]}$, $n^{[2]}$, $n^{[3]}$ and $n^{[4]}$?
4, 4, 3, 2, 1
...
Correct. The $n^{[l]}$ are the number of units in each layer; notice that $n^{[0]} = n_x$.
7. If L is the number of layers of a neural network, then $dZ^{[L]} = A^{[L]} - Y$. True/False?
True. The gradient of the output layer depends on the difference between the value computed during forward propagation and the target values.
False. The gradient of the output layer depends on the difference between the value computed during forward propagation and the target values.
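For a sigmoid output unit trained with the cross-entropy loss (the setting used throughout this course), the output-layer gradient can be checked directly. A short derivation, writing $a = \sigma(z)$ and $\mathcal{L}(a, y) = -\big(y \log a + (1-y)\log(1-a)\big)$:

$$
\frac{\partial \mathcal{L}}{\partial a} = -\frac{y}{a} + \frac{1-y}{1-a},
\qquad
\frac{\partial a}{\partial z} = a(1-a),
\qquad
dz = \frac{\partial \mathcal{L}}{\partial a}\,\frac{\partial a}{\partial z}
   = -y(1-a) + (1-y)a = a - y.
$$

Stacking all $m$ examples column-wise gives the vectorized form $dZ^{[L]} = A^{[L]} - Y$: the output-layer gradient is exactly the difference between the forward-propagation output and the targets.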
8. There are certain functions with the following properties: (i) to compute the function using a shallow network circuit, you will need a large network (where we measure size by the number of logic gates in the network), but (ii) to compute it using a deep network circuit, you need only an exponentially smaller network. True/False?
True
False
9. Consider the following 2-hidden-layer neural network:
Which of the following statements are true? (Check all that apply.)
will have shape (4,3)
will have shape (3,1)
will have shape (4,3)
will have shape (3,4)
Correct. More generally, the shape of $W^{[l]}$ is $(n^{[l]}, n^{[l-1]})$.
will have shape (3,4)
will have shape (1,3)
will have shape (1,3)
will have shape (3,1)
Correct. More generally, the shape of $b^{[l]}$ is $(n^{[l]}, 1)$.
will have shape (4,1)
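A small sketch that propagates shapes through a 2-hidden-layer network; the layer sizes below ($n^{[0]}=3$, $n^{[1]}=4$, $n^{[2]}=3$, $n^{[3]}=1$) are assumptions for illustration, since the quiz figure is not reproduced here:

```python
import numpy as np

dims, m = [3, 4, 3, 1], 10          # assumed layer sizes and batch size
A = np.random.randn(dims[0], m)     # A0 has shape (n[0], m)
for l in range(1, len(dims)):
    W = np.random.randn(dims[l], dims[l - 1]) * 0.01   # W[l]: (n[l], n[l-1])
    b = np.zeros((dims[l], 1))                          # b[l]: (n[l], 1)
    A = np.tanh(W @ A + b)                              # A[l]: (n[l], m)
    print(f"W{l} {W.shape}  b{l} {b.shape}  A{l} {A.shape}")
# W1 (4, 3)  b1 (4, 1)  A1 (4, 10)
# W2 (3, 4)  b2 (3, 1)  A2 (3, 10)
# W3 (1, 3)  b3 (1, 1)  A3 (1, 10)
```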
10. Whereas the previous question used a specific network, in the general case what is the dimension of $W^{[l]}$, the weight matrix associated with layer $l$?
$W^{[l]}$ has shape $(n^{[l]}, n^{[l-1]})$
...
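The shape follows from dimension analysis of the forward step $Z^{[l]} = W^{[l]} A^{[l-1]} + b^{[l]}$:

$$
\underbrace{W^{[l]}}_{(n^{[l]},\; n^{[l-1]})}
\;
\underbrace{A^{[l-1]}}_{(n^{[l-1]},\; m)}
\;+\;
\underbrace{b^{[l]}}_{(n^{[l]},\; 1)}
\;\longrightarrow\;
\underbrace{Z^{[l]}}_{(n^{[l]},\; m)},
$$

so the only shape for $W^{[l]}$ that makes the matrix product work is $(n^{[l]}, n^{[l-1]})$, with $b^{[l]}$ broadcast across the $m$ columns.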