Week 2 Quiz - Neural Networks Basics
1. What does a neuron compute?
- A neuron computes a linear function followed by an activation function.
- A neuron computes the mean of all features before applying the output to an activation function.
- A neuron computes an activation function followed by a linear function z = Wx + b.
- A neuron computes a function g that scales the input x linearly (Wx + b).
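For intuition, here is a minimal sketch of the two-step computation (linear function, then activation); the weights W, bias b, input x, and the sigmoid activation are illustrative choices, not values from the quiz:

```python
import numpy as np

def sigmoid(z):
    # Element-wise logistic activation
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical weights, bias, and input (not from the quiz)
W = np.array([[0.2, -0.5, 1.0]])     # shape (1, 3)
x = np.array([[1.0], [2.0], [3.0]])  # shape (3, 1)
b = 0.1

z = np.dot(W, x) + b   # linear step: z = Wx + b
a = sigmoid(z)         # activation step
print(z.shape, a)      # (1, 1) and the neuron's output
```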
2. Which of these is the "Logistic Loss"?
- $L^{(i)}(\hat{y}^{(i)}, y^{(i)}) = -\big(y^{(i)}\log\hat{y}^{(i)} + (1 - y^{(i)})\log(1 - \hat{y}^{(i)})\big)$
...
✓
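For reference, the logistic loss above can be evaluated directly. The sketch below uses made-up prediction/label pairs to show that the loss is small when the prediction is close to the label and large otherwise:

```python
import numpy as np

def logistic_loss(y_hat, y):
    # L(y_hat, y) = -(y*log(y_hat) + (1-y)*log(1-y_hat))
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

# Illustrative values only
print(logistic_loss(0.9, 1))  # small loss: prediction close to the label
print(logistic_loss(0.1, 1))  # large loss: prediction far from the label
```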
3. Consider the Numpy array x:
What is the shape of x?
- (4, )
- (1, 2, 2)
- (2, 2, 1)
- (2, 2)
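The quiz's array x is not reproduced here; as an illustration of how .shape reflects nesting depth, consider these hypothetical arrays, which produce the shapes listed in the options:

```python
import numpy as np

print(np.array([1, 2, 3, 4]).shape)              # (4,)      - 1-D vector
print(np.array([[1, 2], [3, 4]]).shape)          # (2, 2)    - 2-D matrix
print(np.array([[[1], [2]], [[3], [4]]]).shape)  # (2, 2, 1) - 3-D array
```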
4. Consider the following random arrays a, b, and c:
What will be the shape of c?
- The computation cannot happen because it is not possible to broadcast more than one dimension.
- c.shape = (3,3)
- c.shape = (2,1)
- c.shape = (2,3,3)
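The quiz's arrays are not shown above. As a hedged illustration of the broadcasting rule (a dimension of size 1 is stretched to match the other operand), assume for example:

```python
import numpy as np

a = np.random.randn(3, 3)  # hypothetical shapes, not from the quiz
b = np.random.randn(3, 1)
c = a * b                  # b is broadcast across the columns of a
print(c.shape)             # (3, 3)
```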
5. Consider the following two random arrays a and b:
What will be the shape of c?
- c.shape = (3,3)
- The computation cannot happen because the sizes don't match.
- c.shape = (1,3)
- The computation cannot happen because it is not possible to broadcast more than one dimension.
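For the failing case, here is a sketch with assumed shapes (not the quiz's) in which two dimensions differ and neither is 1, so broadcasting is impossible:

```python
import numpy as np

a = np.random.randn(4, 3)  # hypothetical shapes, not from the quiz
b = np.random.randn(3, 2)
try:
    c = a * b              # element-wise product needs broadcast-compatible shapes
except ValueError as e:
    print("Broadcast error:", e)
```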
6. Suppose you have n_x input features per example. If we decide to use row vectors x_j for the features, and X is built by stacking those rows, what is the dimension of X?
- (1,n_x)
- (m,n_x)
- (n_x,n_x)
- (n_x,m) ✓ Each row x_j has dimension (1, m); X is built by stacking all the rows together into an (n_x, m) array.
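A minimal sketch of this (n_x, m) layout, with made-up values for n_x and m: each feature row has one entry per example, and vertically stacking the rows yields X.

```python
import numpy as np

n_x, m = 3, 5                                       # made-up sizes
rows = [np.random.randn(1, m) for _ in range(n_x)]  # each row x_j has shape (1, m)
X = np.vstack(rows)                                 # stack the feature rows vertically
print(X.shape)                                      # (3, 5), i.e. (n_x, m)
```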
7. Recall that np.dot(a,b) performs a matrix multiplication on a and b, whereas a * b performs an element-wise multiplication. Consider the following two random arrays a and b:
What is the shape of c?
- c.shape = (12288, 150)
- The computation cannot happen because the sizes don't match. It's going to be "Error"!
- c.shape = (150, 150)
- c.shape = (12288, 45)
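The quiz's arrays are not reproduced here; as an illustration of the matrix-multiplication shape rule, where a (p, q) matrix dotted with a (q, r) matrix yields a (p, r) matrix, the shapes below are assumed for the example:

```python
import numpy as np

a = np.random.randn(12288, 150)  # shapes assumed for illustration
b = np.random.randn(150, 45)
c = np.dot(a, b)                 # inner dimensions (150) must match
print(c.shape)                   # (12288, 45)
```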
8. Consider the following code snippet:
How do you vectorize this?
- c = a*b
- c = np.dot(a,b)
- c = a.T*b
- c = a*b.T
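The loop being vectorized is not shown above. The sketch below contrasts an explicit double loop with a one-line broadcasted product, using hypothetical shapes chosen only for illustration:

```python
import numpy as np

a = np.random.randn(3, 4)          # hypothetical shapes, not from the quiz
b = np.random.randn(4, 1)

# Explicit double loop: scale entry (i, j) of a by b[j]
c_loop = np.zeros((3, 4))
for i in range(3):
    for j in range(4):
        c_loop[i, j] = a[i, j] * b[j, 0]

# Vectorized equivalent: b.T has shape (1, 4) and broadcasts over the rows of a
c_vec = a * b.T
print(np.allclose(c_loop, c_vec))  # True
```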
9. Consider the following code:
What will be c? (If you're not sure, feel free to run this in python to find out).
- This will multiply a 3x3 matrix a with a 3x1 vector b, thus resulting in a 3x1 vector. That is, c.shape = (3,1).
- It will lead to an error since you cannot use "*" to operate on these two matrices. You need to instead use np.dot(a,b).
- This will invoke broadcasting, so b is copied three times to become (3,3), and * is an element-wise product, so c.shape will be (3,3).
- This will invoke broadcasting, so b is copied three times to become (3,3), and * invokes a matrix multiplication operation of two 3x3 matrices, so c.shape will be (3,3).
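With assumed shapes (not necessarily the quiz's), the difference between * (element-wise with broadcasting) and np.dot (matrix product) can be checked directly:

```python
import numpy as np

a = np.random.randn(3, 3)  # hypothetical shapes
b = np.random.randn(3, 1)

elementwise = a * b        # b broadcast to (3, 3), element-wise product
matmul = np.dot(a, b)      # true matrix-vector product
print(elementwise.shape)   # (3, 3)
print(matmul.shape)        # (3, 1)
```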
10. Consider the following computational graph.
What is the output of J?
- a^2 - b^2
...
✓
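The computational graph itself is not reproduced above. As a generic sketch of how such a graph is evaluated forward, node by node, the tiny example below uses an arbitrary graph (J = u + v with u = a*b and v = a*c) and made-up leaf values; it is not the graph from the quiz:

```python
# Arbitrary example graph, not the quiz's: J = u + v, with u = a*b and v = a*c
a, b, c = 3.0, 1.0, 2.0  # made-up leaf values
u = a * b                # first intermediate node
v = a * c                # second intermediate node
J = u + v                # output node
print(J)                 # 9.0, i.e. a*(b+c)
```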