5. Suppose X1, ..., Xn are i.i.d. samples from a Bernoulli(p) distribution, i.e., Pr(Xi = 1) = p = 1 − Pr(Xi = 0).
A. Find the maximum likelihood estimate (mle) of p.
B. What is the mle of θ = p(1 − p)?
C. Calculate the bias of the mle of θ.
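A quick sanity check for part C: the MLE of θ = p(1 − p) is θ̂ = X̄(1 − X̄), and for small n its expectation can be computed exactly by enumerating all 2^n samples. The sketch below (hypothetical values n = 4, p = 0.3) confirms that E[θ̂] = (1 − 1/n)θ, so the bias is −θ/n.

```python
from itertools import product

# Hypothetical small example: n = 4 samples, true p = 0.3.
n, p = 4, 0.3
theta = p * (1 - p)

# Exact expectation of the MLE theta_hat = xbar * (1 - xbar),
# summing over all 2^n possible samples weighted by their probability.
e_theta_hat = 0.0
for xs in product([0, 1], repeat=n):
    prob = 1.0
    for x in xs:
        prob *= p if x == 1 else (1 - p)
    xbar = sum(xs) / n
    e_theta_hat += prob * xbar * (1 - xbar)

bias = e_theta_hat - theta
print(bias)  # matches -theta/n = -0.0525
```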
Let X1, ..., Xn be i.i.d. random variables with the following Riemann density (not shown), with unknown parameter θ ∈ Θ = (0, ∞). (a) Calculate the distribution function F_θ of Xi. (b) Let x1, ..., xn be a realization of X1, ..., Xn. What is the log-likelihood function for the parameter θ? (c) Calculate the maximum-likelihood estimator θ̂(x1, ..., xn) for the unknown parameter θ.
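The density itself is missing from this excerpt, so as an illustration of the three steps, here is the same recipe worked for a hypothetical density f_θ(x) = θx^(θ−1) on (0, 1), θ > 0:

```latex
% Hypothetical density: f_\theta(x) = \theta x^{\theta-1} on (0,1)
F_\theta(x) = \int_0^x \theta t^{\theta-1}\,dt = x^\theta, \qquad 0 < x < 1.
\ell(\theta) = \sum_{i=1}^n \log\!\bigl(\theta x_i^{\theta-1}\bigr)
             = n\log\theta + (\theta-1)\sum_{i=1}^n \log x_i .
\ell'(\theta) = \frac{n}{\theta} + \sum_{i=1}^n \log x_i = 0
\;\Longrightarrow\;
\hat\theta(x_1,\dots,x_n) = -\,\frac{n}{\sum_{i=1}^n \log x_i}.
```

The same pattern (integrate for F_θ, take logs, set the score to zero) applies to whatever density the original problem specified.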
Exercise 6. Let (X1, ..., Xn) be an i.i.d. sample from the Bernoulli distribution with parameter θ. 1. What is the maximum likelihood estimate θ̂ of θ? 2. Show that the maximum likelihood estimator of θ is unbiased. 3. We wish to estimate the variance θ(1 − θ) of Xi. With X̄ denoting the empirical average, let T = X̄(1 − X̄). Check that T is not unbiased, and propose an unbiased estimator of θ(1 − θ).
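A sketch of part 3, using E[X̄] = θ and Var(X̄) = θ(1 − θ)/n:

```latex
\mathbb{E}\bigl[\bar X(1-\bar X)\bigr]
  = \mathbb{E}[\bar X] - \mathbb{E}[\bar X^2]
  = \theta - \Bigl(\frac{\theta(1-\theta)}{n} + \theta^2\Bigr)
  = \Bigl(1 - \frac{1}{n}\Bigr)\theta(1-\theta),
```

so T = X̄(1 − X̄) is biased downward, while (n/(n − 1)) X̄(1 − X̄) is unbiased for θ(1 − θ).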
Advanced Statistics, I need help with (c) and (d). 2. Let X1, X2, ..., Xn be a random sample from a Bernoulli(θ) distribution with probability function p(x; θ) = θ^x (1 − θ)^(1−x), x ∈ {0, 1}. Note that, for a random variable X with a Bernoulli(θ) distribution, E[X] = θ and var[X] = θ(1 − θ). (a) Obtain the log-likelihood function ℓ(θ), and hence show that the maximum likelihood estimator of θ is θ̂ = (1/n) Σ_{i=1}^n Xi. (b) Show that dℓ(θ)/dθ … (c) Calculate the expected information I(θ). (d) Show...
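Since the problem statement is truncated, here is a sketch of the standard Bernoulli computations behind (a)–(c):

```latex
\ell(\theta) = \Bigl(\sum_{i=1}^n x_i\Bigr)\log\theta
             + \Bigl(n - \sum_{i=1}^n x_i\Bigr)\log(1-\theta),
\qquad
\frac{d\ell}{d\theta} = \frac{\sum_i x_i}{\theta} - \frac{n - \sum_i x_i}{1-\theta},
\qquad
-\frac{d^2\ell}{d\theta^2} = \frac{\sum_i x_i}{\theta^2} + \frac{n - \sum_i x_i}{(1-\theta)^2}.
```

Setting the score to zero gives θ̂ = X̄, and taking expectations (E[Σ Xi] = nθ) gives the expected information I(θ) = n/θ + n/(1 − θ) = n/(θ(1 − θ)).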
7. Let X1, ..., Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with a Bernoulli distribution has probability mass function (pmf) p(x) = p^x (1 − p)^(1−x), x ∈ {0, 1}, with E(X) = p and Var(X) = p(1 − p). (a) Find the method of moments (MOM) estimator of p. (b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term; that is, for the second term you will have (1...
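The hint is about collecting powers in the joint pmf; a sketch:

```latex
p(x_1,\dots,x_n;\,p) = \prod_{i=1}^n p^{x_i}(1-p)^{1-x_i}
                     = p^{\sum_i x_i}\,(1-p)^{\,n-\sum_i x_i},
```

so by the factorization theorem T = Σ Xi is sufficient for p. For (a), equating the first moment E(X) = p to the sample mean gives the MOM estimator p̂ = X̄.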
3. (10 points) Suppose that X1, ..., Xn are i.i.d. from Bernoulli(p). Show that the product X1X2X3X4 is an unbiased estimator of p^4, and find the UMVUE of p^4.
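One standard route (assuming the Rao-Blackwell/Lehmann-Scheffé approach) conditions X1X2X3X4 on the complete sufficient statistic T = Σ Xi, giving δ(T) = T(T − 1)(T − 2)(T − 3) / (n(n − 1)(n − 2)(n − 3)). The sketch below checks unbiasedness exactly via the Binomial pmf, with hypothetical values n = 6, p = 0.4.

```python
from math import comb

# Hypothetical values for an exact check.
n, p = 6, 0.4

def umvue(t):
    # Candidate UMVUE: E[X1*X2*X3*X4 | T = t] with T = sum of the Xi
    return t * (t - 1) * (t - 2) * (t - 3) / (n * (n - 1) * (n - 2) * (n - 3))

# Exact expectation of umvue(T) with T ~ Bin(n, p).
e = sum(comb(n, t) * p**t * (1 - p) ** (n - t) * umvue(t) for t in range(n + 1))
print(e)  # equals p**4 up to rounding
```

The check works because the fourth factorial moment of a Binomial is E[T(T − 1)(T − 2)(T − 3)] = n(n − 1)(n − 2)(n − 3)p^4.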
Suppose X1, X2, ..., Xn is an i.i.d. sample from a uniform distribution over (θ, θ + 1). (a) Find the method of moments estimator of θ. (b) Find the maximum likelihood estimator (MLE) of θ. (c) Is the MLE of θ a consistent estimator of θ? Explain.
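Reading the garbled interval as (θ, θ + 1), a sketch of (a) and (b):

```latex
\mathbb{E}[X_i] = \theta + \tfrac12
\;\Longrightarrow\;
\hat\theta_{\mathrm{MOM}} = \bar X - \tfrac12 .
L(\theta) = \prod_{i=1}^n \mathbf{1}\{\theta < x_i < \theta + 1\}
          = \mathbf{1}\{\,x_{(n)} - 1 \le \theta \le x_{(1)}\,\},
```

so any θ̂ in [X_(n) − 1, X_(1)] maximizes the likelihood (the MLE is not unique). For (c), X_(1) → θ and X_(n) → θ + 1 in probability, so any such choice is consistent.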
Let X1, X2, ..., Xn be an i.i.d. sample from Bernoulli(p) and let Yn = ... (definition not shown). Show that Yn converges to a degenerate distribution at 0 as n → ∞.
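The definition of Yn is missing from this excerpt. As a hypothetical illustration of the kind of argument usually expected: if, say, Yn = X̄n − p, then Chebyshev's inequality gives

```latex
P\bigl(|\bar X_n - p| \ge \varepsilon\bigr)
\le \frac{\operatorname{Var}(\bar X_n)}{\varepsilon^2}
= \frac{p(1-p)}{n\varepsilon^2} \xrightarrow[\,n\to\infty\,]{} 0
\quad \text{for every } \varepsilon > 0,
```

i.e. convergence in probability to 0, which is exactly convergence to the degenerate distribution at 0.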
Let X1, ..., Xn be a random sample (i.i.d.) from a normal distribution with parameters µ and σ². (a) Find the maximum likelihood estimates of µ and σ². (b) Compare your MLEs of µ and σ² with the sample mean and sample variance. Are they the same?
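For part (b): µ̂ coincides with the sample mean, while σ̂² = (1/n) Σ(xi − x̄)² differs from the sample variance s² = (1/(n − 1)) Σ(xi − x̄)² by the factor (n − 1)/n. A sketch with made-up data:

```python
# Hypothetical data purely for illustration.
data = [1.0, 2.0, 4.0, 7.0]
n = len(data)

mu_hat = sum(data) / n                  # MLE of mu = sample mean
ss = sum((x - mu_hat) ** 2 for x in data)
sigma2_mle = ss / n                     # MLE of sigma^2 (divide by n)
s2 = ss / (n - 1)                       # sample variance (divide by n - 1)

print(sigma2_mle, s2)  # MLE is (n-1)/n times the sample variance
```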
Question 3: Bernoulli distribution (23/100 points). Consider a random sample X1, ..., Xn from a Bernoulli distribution with unknown parameter p that describes the probability that Xi is equal to 1. That is, Xi ~ Bernoulli(p), i = 1, ..., n. (10) The maximum likelihood (ML) estimator for p is given by p̂_ML = (1/n) Σ_{i=1}^n Xi. (11) It holds that n·p̂_ML ~ Bin(n, p). (12) 3.a) (1 point) Give the conservative 100(1 − α)% two-sided equal-tailed confidence interval for p based on p̂_ML for a given...
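For 3.a): "conservative" here usually means bounding the unknown variance p(1 − p) by its maximum 1/4, giving p̂_ML ± z_{1−α/2}·√(1/(4n)). A sketch with hypothetical values of n, p̂, and α:

```python
from statistics import NormalDist

# Hypothetical values for illustration.
n, p_hat, alpha = 100, 0.6, 0.05

z = NormalDist().inv_cdf(1 - alpha / 2)   # standard normal quantile, ~1.96
half_width = z * (0.25 / n) ** 0.5        # conservative bound: p(1-p) <= 1/4
ci = (p_hat - half_width, p_hat + half_width)
print(ci)
```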
Q2. Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ Xi is an unbiased estimator for p. 1. Find E(p̂²) and show that p̂² is a biased estimator for p². (Hint: make use of the distribution of X̄ and the fact that Var(Y) = E(Y²) − E(Y)².) 2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.) ...
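One natural answer to part 2 (a sketch, not necessarily the intended route): since E[p̂(1 − p̂)] = (1 − 1/n)p(1 − p), the estimator p̂² − p̂(1 − p̂)/(n − 1) is unbiased for p². This can be checked exactly by enumeration for hypothetical values n = 5, p = 0.3:

```python
from itertools import product

# Hypothetical values for an exact check.
n, p = 5, 0.3

# Exact expectation of the candidate unbiased estimator of p^2,
# summing over all 2^n possible samples weighted by probability.
e = 0.0
for xs in product([0, 1], repeat=n):
    prob = 1.0
    for x in xs:
        prob *= p if x == 1 else (1 - p)
    p_hat = sum(xs) / n
    e += prob * (p_hat**2 - p_hat * (1 - p_hat) / (n - 1))

print(e)  # equals p**2 = 0.09
```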