short takeaway: it’s really not all it’s cracked up to be
1 — IQ, or Intelligence Quotient, has no useful definition
The usual definition, the one used by textbooks, is that IQ is whatever the IQ test measures. Not the greatest definition, but I have never seen a better one.
Here, for example, is a 2020 definition from Wikipedia:
There are many kinds of intelligence and many kinds of cognitive tests. Each test measures only some of these intelligences, and only at one given point in time. People grow, people develop, people change… and so do test scores.
Yes, I know scores on some tests correlate over some periods of time, but I also know there are many other factors to weigh.
2 — When reporting IQ scores, the standard deviation is universally ignored
Standard deviation measures how spread out values are around the mean, or average; technically, it is the square root of the variance. Every measurement has a standard deviation, not just IQ measurements.
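To put numbers on that, here is a tiny, made-up illustration in Python; the five scores are invented for the example, not taken from any real test:

```python
# Mean, variance, and standard deviation on a handful of made-up scores
# (hypothetical numbers, used only to illustrate the definitions).
from statistics import mean, pvariance, pstdev

scores = [88, 95, 100, 107, 110]

print(mean(scores))       # 100      -> the average
print(pvariance(scores))  # 63.6     -> average squared distance from the mean
print(pstdev(scores))     # about 8  -> standard deviation, the square root of the variance
```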
Most human measurements follow what is called a bell curve, or normal curve, like the one in the image above.
In that curve, 100 is always the average IQ score. Each black mark on the line is one standard deviation. I added the scores that would go with each standard deviation.
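For the curious, here is a minimal sketch of what that curve implies, using only Python's standard library and the usual assumption that IQ scores follow a normal curve with a mean of 100 and a standard deviation of 15:

```python
# A rough sketch of the bell curve described above, assuming IQ scores
# are normally distributed with mean 100 and standard deviation 15.
from math import erf, sqrt

MEAN, SD = 100, 15

def share_below(score):
    """Fraction of people expected to score below `score` on that curve."""
    z = (score - MEAN) / SD
    return 0.5 * (1 + erf(z / sqrt(2)))

for k in (1, 2, 3):
    low, high = MEAN - k * SD, MEAN + k * SD
    share = share_below(high) - share_below(low)
    print(f"{low} to {high} (within {k} SD): about {share:.1%} of people")
# Prints roughly 68%, 95%, and 99.7%, the classic normal-curve proportions.
```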
For IQ scores, the standard deviation is set at 15 points, so your IQ is really your test score +/- 15, but we almost never read it that way.
For example, if your score on an IQ test came out to be 120, your actual IQ could range anywhere from 105 to 135.
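The same arithmetic as a short sketch; note that it follows this article's reading of the 15-point standard deviation as the uncertainty on a single reported score, which is an assumption of the example, not an official scoring rule:

```python
# The article's point in code: read any single score as score +/- 15.
# Treating the full 15-point standard deviation as the uncertainty on one
# score is this article's framing, adopted here as an assumption.
SD = 15

def score_band(observed_score, sds=1):
    """Range implied by reading a score as observed +/- `sds` standard deviations."""
    return observed_score - sds * SD, observed_score + sds * SD

low, high = score_band(120)
print(f"A reported 120 could really mean anywhere from {low} to {high}.")
# -> anywhere from 105 to 135, a 30-point spread
```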
Yes, it could be anywhere within that 30-point range! With the labels we use for the test numbers, you could…