A thread to explore the deep mysteries of probability and statistics.

We all instinctively use the arithmetic mean as an estimator of a true value. In the first Physics 101 lab we measured a metal block with a micrometer. We had to take multiple measurements, then compute the mean and standard deviation.
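A minimal sketch of that lab computation in plain Python. The readings are made up for illustration; any set of repeated measurements works the same way.

```python
# Hypothetical repeated micrometer readings of the block, in mm.
readings = [12.31, 12.29, 12.33, 12.30, 12.32, 12.28]

n = len(readings)
mean = sum(readings) / n

# Sample standard deviation (n - 1 in the denominator, the usual
# choice when estimating spread from measurement data).
var = sum((x - mean) ** 2 for x in readings) / (n - 1)
std = var ** 0.5

print(f"mean = {mean:.3f} mm, std = {std:.3f} mm")
```

In practice you'd reach for `statistics.mean`/`statistics.stdev` or NumPy, but the hand-rolled version shows exactly what the lab asked for.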

Such processes tend to be normally distributed because the samples, in this case measurements, are independent and random. One measurement does not affect the next.

Intuitively, you can look at the normal distribution, or bell curve. The mean is the midpoint, the point of highest probability density, and the best estimator of the true value.

In mechanics the first moment of mass of an object gives the center of gravity.

In statistics the first moment is the mean, analogous to the COG in mechanics.

The theoretical basis is maximum likelihood estimation. Scroll down in the link: the best estimator of the mean for a normal PDF is the arithmetic mean, and the MLE for the variance is the mean squared deviation from the mean.

https://www.statlect.com/fundamental...mum-likelihood

An alternative is the method of moments.

https://en.wikipedia.org/wiki/Method...s_(statistics)
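A sketch of the method of moments for the normal case, using the same hypothetical data: equate the first two sample moments to the theoretical moments E[X] = mu and E[X^2] = mu^2 + sigma^2, then solve for the parameters.

```python
data = [9.8, 10.1, 10.0, 9.9, 10.2]  # hypothetical measurements
n = len(data)

# First and second raw sample moments.
m1 = sum(data) / n
m2 = sum(x ** 2 for x in data) / n

# For a normal distribution E[X] = mu and E[X^2] = mu^2 + sigma^2,
# so the moment equations solve to:
mu_hat = m1
sigma2_hat = m2 - m1 ** 2

print(mu_hat, sigma2_hat)
```

For the normal distribution the method-of-moments estimates coincide with the MLEs: the arithmetic mean and the variance with n in the denominator.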