This question concerns a simple two-layer neural network with two input neurons, a hidden layer of three neurons, and an output layer of one neuron. The sigmoid function is used as the activation function in both the hidden layer and the output layer. The network receives the input vector $X = [x_1, x_2]$. A parameter vector $W^{(1)} = [w_1, w_2, w_3, w_4, w_5, w_6]$ connects the input layer to the hidden layer, and a parameter vector $W^{(2)} = [w_7, w_8, w_9]$ connects the hidden layer to the output layer (as shown in the figure below).

(a) Write the mathematical expression for the output of the hidden layer, including the bias term. (10\%)

(b) Write the mathematical expression for the output of the output layer, including the bias term. (10\%)

(c) The true label for this input is $y = 0$. The cost function used in this network is the mean squared error (MSE). Write the mathematical expression for the cost for this input. (10\%)

(d) Use backpropagation to calculate the gradient of the cost with respect to the weights connecting the input layer to the hidden layer, $[w_1, w_2, w_3, w_4, w_5, w_6]$, and the weights connecting the hidden layer to the output layer, $[w_7, w_8, w_9]$. (10\%)
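As a way to check a hand derivation of parts (a) through (d), here is a minimal NumPy sketch of the forward pass and backpropagation for this 2-3-1 network. The weight layout ($w_1, w_2$ feeding hidden neuron 1, and so on), the concrete numeric values, the bias terms $b^{(1)}$ and $b^{(2)}$, and the single-sample MSE convention $C = (\hat{y} - y)^2$ are all illustrative assumptions, not fixed by the problem statement (some texts include a factor of $\tfrac{1}{2}$ in the cost).

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation, used in both layers per the problem statement."""
    return 1.0 / (1.0 + np.exp(-z))

# --- Assumed shapes and weight layout (not fixed by the problem text) ---
# W1 reshapes [w1..w6] so that row j holds the two weights into hidden neuron j;
# b1 (3,) and b2 (scalar) are the bias terms the question asks to include.
x = np.array([0.5, -1.0])                       # example input X = [x1, x2]
W1 = np.array([[0.1, 0.2],                      # w1, w2 -> hidden neuron 1
               [0.3, 0.4],                      # w3, w4 -> hidden neuron 2
               [0.5, 0.6]])                     # w5, w6 -> hidden neuron 3
b1 = np.array([0.01, 0.02, 0.03])
W2 = np.array([0.7, 0.8, 0.9])                  # w7, w8, w9
b2 = 0.05
y = 0.0                                         # true label from part (c)

# --- Forward pass: parts (a) and (b) ---
z1 = W1 @ x + b1                                # hidden pre-activation
h = sigmoid(z1)                                 # hidden output: sigmoid(W1 x + b1)
z2 = W2 @ h + b2                                # output pre-activation
y_hat = sigmoid(z2)                             # network output

# --- Cost: part (c), single-sample MSE (here C = (y_hat - y)^2) ---
C = (y_hat - y) ** 2

# --- Backpropagation: part (d) ---
# dC/dy_hat = 2 (y_hat - y); sigmoid'(z2) = y_hat (1 - y_hat)
delta2 = 2 * (y_hat - y) * y_hat * (1 - y_hat)  # error at the output neuron
grad_W2 = delta2 * h                            # dC/d[w7, w8, w9]
grad_b2 = delta2

delta1 = delta2 * W2 * h * (1 - h)              # error at each hidden neuron
grad_W1 = np.outer(delta1, x)                   # dC/d[w1..w6], same layout as W1
grad_b1 = delta1

print("cost:", C)
print("grad W2:", grad_W2, " grad b2:", grad_b2)
print("grad W1:\n", grad_W1, "\ngrad b1:", grad_b1)
```

The chain-rule structure mirrors the derivation the question asks for: the output error `delta2` is reused for every weight in $W^{(2)}$, and each hidden error in `delta1` scales that same term by $w_{7..9}$ and the local sigmoid derivative $h_j(1 - h_j)$ before multiplying by the input.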