# Precision Versus Accuracy

After staring at the above image[^1] for way too long in a futile attempt to derive a common characteristic between any two of its examples, here are the two terms' actual definitions.
I find it easier to look at them in terms of measurements.
* **Precision** measures "how deterministic", or conversely "how random", a measurement is:
  * A _precise_ measurement method will always return _the same value_.
  * An _imprecise_ measurement method's return will _fluctuate_.
  * _Precision_, however, does not say how _close_ the returned value is to the actual value.
* **Accuracy** measures how _close a measurement is to the actual value_:
  * An _accurate_ measurement is more or less the actual value.
  * An _inaccurate_ measurement is far away from the actual value.

If we look at a set of multiple measurements, we could say (the sketch after this list makes it concrete):
* **Precision** is the _variance_ of the measurements.
* **Accuracy** is the _error of the mean_ of the measurements.
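
Here is a minimal Python sketch with made-up numbers (both measurement series are purely illustrative) that computes both quantities for two hypothetical measurement methods:

```python
import numpy as np

# The "true" value both (hypothetical) measurement methods try to hit.
true_value = 10.0

# method_a fluctuates a lot, but its readings centre on the true value;
# method_b barely fluctuates, but is systematically off.
method_a = np.array([9.2, 10.9, 9.5, 10.6, 9.8, 10.3, 9.7])
method_b = np.array([11.4, 11.5, 11.5, 11.6, 11.4, 11.5, 11.6])

for name, samples in [("method_a", method_a), ("method_b", method_b)]:
    variance = samples.var()                           # -> precision
    error_of_mean = abs(samples.mean() - true_value)   # -> accuracy
    print(f"{name}: variance = {variance:.3f}, error of mean = {error_of_mean:.3f}")

# method_a: imprecise (variance ~0.33) but accurate (error of mean ~0.0)
# method_b: precise (variance ~0.006) but inaccurate (error of mean 1.5)
```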
The image above is misleading: it suggests that accuracy and precision are "somewhat" unrelated, but it is not quite clear how, or to what extent. I think even its author is not clear on that:
* If you define "accuracy" to be the _error of the mean_ (as I'd suggest), then the top-left image is also relatively accurate - in particular, way more accurate than the top-right image. The mean of its 7 points lies roughly at the bull's eye.
* On the other hand, if you define accuracy to be the _mean error_, then high accuracy implies high precision (if every measurement is close to the actual value, the measurements are also close to each other), and the bottom-left image is misleading: it has relatively high precision, but apparently not high enough to warrant the title "high precision". The sketch after this list shows how the two definitions come apart.
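
The following sketch (in Python, with made-up numbers) illustrates the difference: a scattered sample whose mean happens to sit on the true value - think of the top-left panel - is perfectly accurate under the "error of the mean" definition, while under the "mean error" definition its spread, i.e. its imprecision, drags its accuracy down.

```python
import numpy as np

true_value = 10.0

# Assumed data: a scattered sample whose mean sits on the true value,
# and a tight cluster that is systematically offset from it.
scattered = np.array([8.5, 11.5, 9.0, 11.0, 9.5, 10.5, 10.0])
offset_cluster = np.array([11.4, 11.5, 11.5, 11.6, 11.4, 11.5, 11.6])

for name, samples in [("scattered", scattered), ("offset_cluster", offset_cluster)]:
    error_of_mean = abs(samples.mean() - true_value)     # definition 1
    mean_error = np.mean(np.abs(samples - true_value))   # definition 2
    print(f"{name}: error of mean = {error_of_mean:.2f}, mean error = {mean_error:.2f}")

# scattered:       error of mean = 0.00, mean error = 0.86
# offset_cluster:  error of mean = 1.50, mean error = 1.50
```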
I prefer the former definition, since it makes the two terms' meanings orthogonal. With this definition, the corrected image looks like this:

[^1]: Source: https://www.researchgate.net/figure/Precision-versus-accuracy-The-bullseye-represents-the-true-value-eg-the-true_fig6_304674901