1. First write out the Jacobians as matrices with lots of (unknown) partial derivatives.
2. Work out what each of those partial derivatives is. (The partial derivatives are just standard Calc 1 derivatives.)

1 Multi-layer Perceptrons

Next week you will create, train, and test a fully connected neural network (a.k.a. MLP). Your MLP will have 64 neurons in its hidden layer and 10 neurons in its output layer. Each neuron should have a bias variable. The hidden layer will use a sigmoid activation function. The output layer will apply a soft-max function across the 10 neurons. See Figure 1.

Figure 1: Network Architecture

1.1 Derive the Jacobians

The Jacobian of a multivariate, vector-valued function $f : \mathbb{R}^n \to \mathbb{R}^m$ is the matrix

$$J = \begin{bmatrix} \frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \frac{\partial f_m}{\partial x_1} & \cdots & \frac{\partial f_m}{\partial x_n} \end{bmatrix}.$$

With some slight abuse of notation, let's call this Jacobian $\nabla_x f$.

Derive the following Jacobian matrices and indicate the dimensions of each. Hint: vectorize the matrices $W_1$ and $W_2$.

1. $\nabla_{b_2} a_2$
2. $\nabla_{W_2} a_2$
3. $\nabla_{f_1} a_2$
4. $\nabla_{a_1} f_1$
5. $\nabla_{b_1} a_1$
6. $\nabla_{W_1} a_1$

As an example, the Jacobian $\nabla_{a_2} L$ is a $1 \times 10$ matrix with elements

$$\nabla_{a_2} L = \left[\, \hat{y}_1 - y_1,\ \hat{y}_2 - y_2,\ \ldots,\ \hat{y}_{10} - y_{10} \,\right].$$
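Once a Jacobian has been derived by hand, it can be checked numerically with finite differences. Below is a minimal NumPy sketch (not part of the assignment's required code) that checks the worked example $\nabla_{a_2} L = [\hat{y}_1 - y_1, \ldots, \hat{y}_{10} - y_{10}]$; it assumes, as that formula implies, that $L$ is the cross-entropy loss applied after the soft-max, and all variable names here are illustrative rather than prescribed.

```python
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())        # shift by the max for numerical stability
    return e / e.sum()

def loss(a2, y):
    """Cross-entropy loss of the soft-max output against a one-hot target y."""
    y_hat = softmax(a2)
    return -np.sum(y * np.log(y_hat))

rng = np.random.default_rng(0)
a2 = rng.normal(size=10)           # pre-soft-max activations of the 10 output neurons
y = np.zeros(10)
y[3] = 1.0                         # one-hot target

# Hand-derived Jacobian from the example: y_hat - y
analytic = softmax(a2) - y

# Central finite differences, one output activation at a time
eps = 1e-6
numeric = np.array([
    (loss(a2 + eps * np.eye(10)[i], y) - loss(a2 - eps * np.eye(10)[i], y)) / (2 * eps)
    for i in range(10)
])

# The two should agree up to finite-difference error
print(np.max(np.abs(analytic - numeric)))
```

The same pattern (perturb one input coordinate, difference the outputs) applies to each of the six Jacobians above, perturbing entries of $b_1$, $b_2$, or the vectorized $W_1$, $W_2$ instead of $a_2$.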