Let Y1, Y2, ..., Yn be a random sample from a Gamma distribution with probability density function

f(y) = (1/beta^2) y e^(-y/beta), y > 0, where beta > 0.

(a) What is the relevant likelihood? (Be sure you get this correct, since your answers to the other parts of the problem depend on this likelihood.)
(b) Show that sum(y_i) is sufficient for beta.
(c) Find the maximum likelihood estimator for beta.
(d) Find the minimum variance unbiased estimator of beta.

Solution

(a) The likelihood is the product over i = 1, ..., n:

L(beta) = prod_{i=1}^n (1/beta^2) y_i e^(-y_i/beta) = beta^(-2n) (prod y_i) e^(-sum(y_i)/beta).

(b) The likelihood factors as L(beta) = g(sum(y_i); beta) h(y), where g(t; beta) = beta^(-2n) e^(-t/beta) depends on the data only through sum(y_i) and h(y) = prod y_i does not involve beta. Hence, by the factorization theorem, sum(y_i) is a sufficient statistic for beta.

(c) The log-likelihood is

log L(beta) = -2n log(beta) + sum(log(y_i)) - sum(y_i)/beta.

Differentiating with respect to beta gives -2n/beta + sum(y_i)/beta^2, and setting this equal to zero yields 2n beta = sum(y_i), so the maximum likelihood estimator is beta_hat = sum(y_i)/(2n).

(d) Since each Y_i has a Gamma distribution with shape 2 and scale beta, E(Y_i) = 2 beta, so E(sum(Y_i)) = 2n beta and therefore E(sum(Y_i)/(2n)) = beta, i.e., the MLE from part (c) is unbiased. Because the Gamma family is a full-rank exponential family, sum(Y_i) is complete as well as sufficient, so by the Lehmann-Scheffe theorem any unbiased estimator that is a function of sum(Y_i) is the unique minimum variance unbiased estimator. Hence the UMVUE of beta is sum(Y_i)/(2n).
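As a quick numerical sanity check of parts (c) and (d) (an illustration, not part of the original problem), we can simulate a large sample from a Gamma(shape = 2, scale = beta) distribution and verify that sum(y_i)/(2n) lands close to the true beta; the names and the choice beta = 3 below are arbitrary.

```python
import numpy as np

# Simulate Y_i ~ Gamma(shape=2, scale=beta) and check that the
# estimator beta_hat = sum(y_i) / (2n) recovers the true beta.
rng = np.random.default_rng(0)  # seed fixed only for reproducibility
beta_true = 3.0                 # arbitrary choice of the true parameter
n = 100_000

y = rng.gamma(shape=2.0, scale=beta_true, size=n)
beta_hat = y.sum() / (2 * n)    # the MLE / UMVUE derived above

print(beta_hat)  # should be close to beta_true = 3.0
```

The standard error of this estimator is beta/sqrt(2n) (since Var(Y_i) = 2 beta^2), which is about 0.007 here, so the printed estimate should agree with 3.0 to roughly two decimal places.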