# Parameter Estimation

Quantitative finance often boils down to parameter estimation: given a model that you believe to be true, which parameter values for that model best fit the observed data?

One type of parameter estimation is maximum likelihood estimation. The MLE of a distribution given some data has an intuitive definition: it is the parameter value(s) for the model under which the observed data would be most likely.

What is the MLE for the success probability of a geometric distribution, given that an observed $$n=4$$ trials were needed to achieve the first success?
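The likelihood of first succeeding on trial 4 is $$L(p) = (1-p)^3 p,$$ and setting $$L'(p) = (1-p)^2(1-4p) = 0$$ gives $$\hat{p} = 1/4.$$ A quick grid search confirms this (a NumPy sketch; the grid resolution is arbitrary):

```python
import numpy as np

# Likelihood of the first success arriving on trial n = 4: L(p) = (1-p)^3 * p
p = np.linspace(0.001, 0.999, 9999)
likelihood = (1 - p) ** 3 * p

# The MLE is the p that maximizes the likelihood
p_hat = p[np.argmax(likelihood)]  # close to 0.25
```

More generally, the geometric MLE is $$1/n$$ for a first success on trial $$n.$$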

You have a coin which you know lands on heads with probability $$P.$$ If it is flipped 10 times and there are 6 heads, what is the MLE for $$P?$$
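Here the binomial log-likelihood is $$6\log p + 4\log(1-p)$$ (up to a constant), which is maximized at $$\hat{P} = 6/10 = 0.6.$$ A numerical check (a sketch, with an arbitrary grid):

```python
import numpy as np

# Binomial log-likelihood for 6 heads in 10 flips; the binomial
# coefficient is constant in p, so it can be dropped
p = np.linspace(0.001, 0.999, 9999)
log_lik = 6 * np.log(p) + 4 * np.log(1 - p)

p_hat = p[np.argmax(log_lik)]  # close to 0.6, i.e. heads / flips
```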

In the last question, you might think “that seems like a bad estimate, since most coins are essentially fair”. This is where having a prior distribution of beliefs (“a priori”) can be extremely helpful.

You have a coin which you know lands on heads with probability $$P.$$ You believe that $$P$$ is normally distributed with mean 0.5 and variance 0.01. If it is flipped 10 times and there are 6 heads, what is the best estimate for $$P?$$

Technically speaking, we're looking for a "maximum a posteriori probability" (MAP). It's like an MLE, except that we Bayesian update from a non-uniform prior distribution for the parameter.

Also, $$P$$ is a probability, which means that it must be bounded between 0 and 1. Assuming a normal distribution for $$P$$ assigns a nonzero probability of finding a value for $$P$$ outside of the interval $$[0,1].$$ This assumption is only an approximation, which is very good for appropriate choices of mean and variance.
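As a sanity check, one can find the MAP by maximizing the log-posterior (log-likelihood plus log-prior) on a grid. This is a sketch; the grid resolution is arbitrary, and additive constants are dropped since they do not affect the argmax:

```python
import numpy as np

p = np.linspace(0.001, 0.999, 99999)

# Binomial log-likelihood for 6 heads in 10 flips (up to a constant)
log_lik = 6 * np.log(p) + 4 * np.log(1 - p)

# Normal(mean=0.5, variance=0.01) log-prior (up to a constant)
log_prior = -((p - 0.5) ** 2) / (2 * 0.01)

# The MAP maximizes the posterior, i.e. likelihood * prior
p_map = p[np.argmax(log_lik + log_prior)]
```

The prior pulls the estimate from the MLE of 0.6 back toward 0.5, landing in between.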

An airline has numbered their planes $$1,2,\ldots,N,$$ and you observe the following 3 planes, which are randomly sampled from the $$N$$ planes:

What is the maximum likelihood estimate for $$N?$$ In other words, what value of $$N$$ would make your observation most likely?
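Since every sample of 3 distinct planes from $$N$$ has probability $$1/\binom{N}{3}$$ when $$N \geq \max(\text{observations})$$ and probability 0 otherwise, the likelihood is maximized by the smallest admissible $$N$$: the largest number observed. A sketch (the three plane numbers below are hypothetical stand-ins for the observations in the problem):

```python
from math import comb

# Hypothetical observed plane numbers (the problem lists three specific planes)
observed = [27, 61, 44]

# Likelihood of this sample is 1 / C(N, 3) for N >= max(observed), else 0,
# so it is maximized at the smallest N consistent with the data
n_mle = max(observed)  # 61
```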

In the previous problem, we saw an example where our estimator seemed poor, and we would have preferred an unbiased estimator: one whose expected value equals the true value of the parameter.

When calculating a “sample variance,” you divide by a constant other than $$n$$ in order to get an unbiased estimator. What is that constant?
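The answer is $$n-1$$ (Bessel's correction). A simulation makes the bias visible: averaging each estimator over many samples, dividing by $$n$$ systematically undershoots the true variance by a factor of $$(n-1)/n,$$ while dividing by $$n-1$$ hits it. This is a sketch; the sample size, true variance, and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
true_var = 4.0
n = 5

# Draw many independent samples of size n from Normal(0, true_var)
samples = rng.normal(0.0, np.sqrt(true_var), size=(100_000, n))

# Compare the two estimators, averaged over all samples
divide_by_n = np.var(samples, axis=1, ddof=0)          # mean ~ (n-1)/n * 4.0 = 3.2
divide_by_n_minus_1 = np.var(samples, axis=1, ddof=1)  # mean ~ 4.0 (unbiased)
```

The intuition: deviations are measured from the sample mean rather than the true mean, which makes the raw average of squared deviations too small; dividing by $$n-1$$ exactly compensates.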
