# Accuracy vs. Precision

Many people use the terms "**accuracy**" and "**precision**" interchangeably. However, the two terms have different definitions, so they should not be conflated.

**Accuracy:** How close are the observed results to the expected results?

**Precision:** How close is each result to the next result?

While it is possible for a statement to be both accurate and precise, a statement is often one but not the other.

If I were given the sum $2+2$ and provided the answer "between $0$ and $10$," my answer would be **accurate**, as it is certainly correct. However, the statement is **not precise**, as my answer has a range of $10$.

If instead I were to provide the answer "exactly $7$," my answer would be **precise**, as it has an uncertainty of $0$. However, the answer is **not accurate**, as $2 + 2 = 4$.
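The distinction above can be sketched numerically: for repeated answers, the distance of their mean from the true value measures accuracy, while their spread measures precision. This is a minimal sketch using hypothetical answer sets (the lists below are invented for illustration, not taken from the article):

```python
import statistics

TRUE_VALUE = 4  # the expected result of 2 + 2

# Hypothetical repeated "answers":
accurate_imprecise = [1, 7, 3, 6, 2, 5]    # centered on 4, widely spread
precise_inaccurate = [7.0, 7.1, 6.9, 7.0]  # tightly clustered, but far from 4

def accuracy_error(values, true_value):
    """Distance of the mean from the true value (smaller = more accurate)."""
    return abs(statistics.mean(values) - true_value)

def spread(values):
    """Sample standard deviation (smaller = more precise)."""
    return statistics.stdev(values)

print(accuracy_error(accurate_imprecise, TRUE_VALUE))  # 0.0 (mean is exactly 4)
print(spread(accurate_imprecise))                      # large spread
print(accuracy_error(precise_inaccurate, TRUE_VALUE))  # 3.0 (mean is 7)
print(spread(precise_inaccurate))                      # small spread
```

The first set is accurate but not precise; the second is precise but not accurate, mirroring the two answers discussed above.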

A $\chi^{2}$ test is performed for dice rolling. The results are shown below:

- Side 1: 380 times
- Side 2: 400 times
- Side 3: 410 times
- Side 4: 390 times
- Side 5: 395 times
- Side 6: 405 times

The expected results would be that side 1 : side 2 : side 3 : side 4 : side 5 : side 6 will be in the ratio $1:1:1:1:1:1$.

The $\chi^{2}$ test shows that there is no significant difference between the expected and observed results. Does this statistic indicate good accuracy, precision, both, or neither?

Since the observed results are close to the expected results, this is an indication of good accuracy. More statistical analysis would be needed to determine whether the precision is also good, though the observed counts do lie quite close to one another. $_\square$
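The $\chi^{2}$ statistic for the dice data can be computed directly as $\chi^2 = \sum_i \frac{(O_i - E_i)^2}{E_i}$, where each expected count is one sixth of the total rolls. A minimal sketch using only the standard library:

```python
# Chi-squared goodness-of-fit for the dice-roll counts above.
observed = [380, 400, 410, 390, 395, 405]

# A fair die predicts equal counts: 2380 / 6 ~= 396.67 per side.
expected = sum(observed) / len(observed)

# chi^2 = sum of (observed - expected)^2 / expected over all six sides.
chi2 = sum((o - expected) ** 2 / expected for o in observed)
print(round(chi2, 3))  # about 1.471
```

With $6 - 1 = 5$ degrees of freedom, the $5\%$ critical value is about $11.07$, so a statistic this small is consistent with the stated conclusion that there is no significant difference between the observed and expected results.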

**Cite as:** Accuracy vs. Precision.

*Brilliant.org*. Retrieved from https://brilliant.org/wiki/accuracy-vs-precision/