One common approach to parameter estimation is maximum likelihood estimation. The MLE of a distribution's parameter given some data has an intuitive definition: it is the parameter value (or values) of the model under which the observed data would be most likely.

What is the MLE for the success probability of a geometric distribution, given that an observed \(n=4\) trials were needed to achieve the first success?
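As a sketch of how this could be checked numerically: for a geometric distribution with success probability \(p\), the probability of the first success landing on trial \(n=4\) is \(L(p) = (1-p)^3 p\), and we can locate the maximizing \(p\) on a fine grid.

```python
import numpy as np

# Likelihood of the first success occurring on trial n = 4:
# L(p) = (1 - p)^(n - 1) * p, evaluated on a fine grid of p values.
n = 4
p = np.linspace(0.001, 0.999, 99_901)
likelihood = (1 - p) ** (n - 1) * p
p_mle = p[np.argmax(likelihood)]
print(round(p_mle, 3))  # close to 1/n = 0.25
```

This agrees with the calculus answer: setting \(\frac{d}{dp}\left[(1-p)^3 p\right] = (1-p)^2(1-4p) = 0\) gives \(p = \tfrac{1}{4}\), i.e. the MLE is \(1/n\).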


You have a coin which you know lands on heads with probability \(P\). You believe that \(P\) is normally distributed with mean 0.5 and variance 0.01. If the coin is flipped 10 times and there are 6 heads, what is the best estimate for \(P\)?

Technically speaking, we're looking for a "maximum a posteriori probability" (MAP) estimate. It's like an MLE, except that we perform a Bayesian update from a non-uniform prior distribution on the parameter.
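A minimal numerical sketch of the MAP computation: the posterior is proportional to the binomial likelihood times the normal prior density, so we can maximize the log-posterior on a grid (the grid-search approach here is an illustration, not the only way to solve it).

```python
import numpy as np

# Posterior ∝ likelihood × prior.  Work in log space:
# log-likelihood of 6 heads in 10 flips, plus the N(0.5, 0.01) log-prior
# (up to an additive constant, which does not affect the argmax).
p = np.linspace(0.001, 0.999, 99_901)
log_likelihood = 6 * np.log(p) + 4 * np.log(1 - p)
log_prior = -(p - 0.5) ** 2 / (2 * 0.01)
p_map = p[np.argmax(log_likelihood + log_prior)]
print(round(p_map, 3))  # between the prior mean 0.5 and the MLE 0.6
```

The prior pulls the estimate below the raw MLE of \(6/10 = 0.6\), toward the prior mean of 0.5.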


An airline has numbered their planes \(1,2,\ldots,N,\) and you observe the following 3 planes, which are randomly sampled from the \(N\) planes:

What is the maximum likelihood estimate for \(N?\) In other words, what value of \(N\) would, according to conditional probability, make your observation most likely?
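The specific plane numbers are not reproduced in the text, so here is a sketch with a hypothetical sample. For any \(N \ge \max(\text{sample})\), the probability of drawing this exact set of 3 planes is \(1/\binom{N}{3}\), which strictly decreases as \(N\) grows, so the likelihood is maximized at the smallest feasible \(N\): the largest number observed.

```python
from math import comb

observed = [15, 27, 42]  # hypothetical observation; the source omits the actual numbers

def likelihood(N):
    # Probability of this exact unordered sample of 3 planes from 1..N;
    # zero if N is smaller than the largest observed tail number.
    return 1 / comb(N, 3) if N >= max(observed) else 0.0

n_mle = max(range(1, 200), key=likelihood)
print(n_mle)  # equals max(observed) = 42
```

Note that the MLE here is simply the maximum observed value, which motivates the next definition: this estimator systematically underestimates \(N\).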


An **unbiased** estimator is one whose expected value equals the true value of the parameter.
