Notice posted on EdPoo's page:
I noticed that you edited out my (correct) version of average and replaced it with the previous (incorrect) one, labelling my version as obscure. This confirms for me what I have been thinking for some time: the entire CP project is doomed unless you and other Sysops get a bit more open-minded, read the entries that others have written, and are prepared to learn rather than be dogmatic. I defined Average exactly as I would define it to my students. The point is that, in common parlance, people use "average" to mean the middle or most likely of a set of data, without actually understanding that the idea is problematic. As it turns out, the most sensible 'middle' is actually the median, and the most common value is the mode; however, neither of these is what people normally refer to as "the average". Normally they calculate it using the arithmetic mean, which is neither the most common value nor the middle. Can you provide an explanation of what the arithmetic mean ACTUALLY is, that is, what it is attempting to measure in everyday language? It's extremely difficult to explain.
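A small sketch (my own illustration, not from the original entry) of why the distinction matters: on a skewed data set, the arithmetic mean, the median and the mode can all disagree.

```python
# A hypothetical skewed data set: one large value drags the mean upward.
data = [1, 2, 2, 2, 3, 4, 20]

mean = sum(data) / len(data)             # arithmetic mean: 34 / 7, roughly 4.86
median = sorted(data)[len(data) // 2]    # middle value of the sorted data: 2
mode = max(set(data), key=data.count)    # most frequent value: 2

print(mean, median, mode)
```

Here the median and mode agree that the data is "located" around 2, while the arithmetic mean is pulled to nearly 5 by the single outlier, which is exactly the confusion described above.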
In fact, what the arithmetic mean is goes something like this: "If all the data values were the same, but you still had the same total as before, then the value each one would take is called the average." The arithmetic mean of 10, 20, 30 is 20.
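That "same total" reading can be checked directly in a couple of lines (my own sketch, using the 10, 20, 30 example from above):

```python
data = [10, 20, 30]
total = sum(data)             # 60
mean = total / len(data)      # 20.0

# The defining property: replacing every value with the mean
# leaves the total unchanged.
levelled = [mean] * len(data)
assert sum(levelled) == total
print(mean)
```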
You get the geometric mean when you ask the same question about rates of interest (10%, 20% and 30% have a geometric mean of 18.17%), and the harmonic mean when you ask the same question about speeds (10 km/h, 20 km/h and 30 km/h have a harmonic mean of 16.36 km/h).
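Both figures can be reproduced with the standard formulas (again my own sketch, applying the geometric and harmonic means to the raw figures 10, 20, 30 as in the text):

```python
import math

# Geometric mean: n-th root of the product of the values.
rates = [10.0, 20.0, 30.0]
geometric = math.prod(rates) ** (1 / len(rates))      # cube root of 6000

# Harmonic mean: n divided by the sum of reciprocals.
speeds = [10.0, 20.0, 30.0]
harmonic = len(speeds) / sum(1 / s for s in speeds)

print(round(geometric, 2), round(harmonic, 2))
```

The geometric mean comes out at about 18.17 and the harmonic mean at about 16.36, matching the figures quoted above.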
If anyone editing the average pages knew anything about descriptive statistics, this page would say something quite different.
You certainly should not be reverting such pages, because you clearly don't have sufficient mathematical understanding to appreciate the nuances.
SeanTheSheep 03:05, 14 May 2007 (EDT)
The point about all of this is that calculations made on data sets are statistical estimates of something or other; if they are not, then there is no point to them. In this particular case, the average is an estimate of the central location, or central tendency, of the data, i.e. the answer to the question "whereabouts is the data located?"
A definition of average that tells someone how to calculate one of these measures, without explaining what we are trying to do or discussing the problems in trying to capture such a thing, is wrong.
The main reason that we almost uniformly use the arithmetic mean as the average is quite difficult to grasp. There is an important result in "Expectation Algebra" which says that if you estimate the mean of a probability distribution using the arithmetic mean of a data set drawn from that distribution, then the expected value of the distribution of those arithmetic means is the mean you started with; i.e. the arithmetic mean is an unbiased estimator of the distribution mean (it is also a consistent estimator and, as it turns out, the most efficient one). Those are the reasons it is used, and that result underpins what is probably the most important theorem in the whole of statistics: the Central Limit Theorem.
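The unbiasedness claim is easy to demonstrate by simulation (my own sketch, with an arbitrarily chosen uniform distribution whose true mean is 5):

```python
import random

random.seed(0)

# Draw many small samples from a distribution with a known mean,
# take the arithmetic mean of each sample, then average those means.
# If the estimator is unbiased, the grand average should sit close
# to the true mean of 5.0.
sample_means = []
for _ in range(20000):
    sample = [random.uniform(0, 10) for _ in range(5)]  # mean of U(0, 10) is 5
    sample_means.append(sum(sample) / len(sample))

grand_mean = sum(sample_means) / len(sample_means)
print(grand_mean)  # close to 5.0
```

This is only the unbiasedness part, of course; consistency and efficiency would need separate demonstrations.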
However, I am not suggesting that we clutter up the definition with all of that. What I am saying is that there are lots of different ways of trying to find the middle of a set of data, and that saying the average is "add them up and divide by how many there are" is inappropriate on three counts:
- firstly, it ignores what the idea of an average is trying to do;
- secondly, it does not discuss whether "average" defined in this manner actually captures the required notion;
- thirdly, it does not discuss in what situations an arithmetic average is, or is not, appropriate.
--SeanTheSheep 04:31, 14 May 2007 (EDT)