DrWatson · 70-79, M
Except, the normal distribution (which is the distribution of a continuous random variable) is being used only to approximate the distribution of IQ scores, which is a discrete random variable. (Its possible values are positive integers.)
So, when doing a textbook probability problem, the probability of selecting someone with an IQ of exactly 100 comes out as zero. But in reality, the probability is positive, and is equal to the ratio:
(number of people with scores of 100) / (total number of IQ scores in the world)
A minuscule number, of course, but not zero, since the denominator is finite.
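As a toy illustration (both counts below are made up, purely hypothetical), the real-world probability is just a count divided by a finite total:

```python
# Toy version of the ratio above. Both numbers are hypothetical;
# the point is only that the denominator is finite, so the quotient
# is minuscule but strictly positive.
people_with_score_100 = 210_000_000   # hypothetical count of exact-100 scores
total_iq_scores = 8_000_000_000       # hypothetical total of recorded scores

print(people_with_score_100 / total_iq_scores)   # 0.02625 -- small, not zero
```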
Northwest · M
@Luke73 Not yes and no. Dr. Watson is correct.
What you've described is a theoretical mathematical property that doesn't invalidate the usefulness of specific IQ scores.
In practice, IQ scores are not infinitely precise measurements - they're typically rounded to whole numbers. When we say someone has an "IQ of 100," we're really talking about a small range around 100 (perhaps 99.5 to 100.49).
The probability of falling within a range in a normal distribution is not zero - it's the integral of the probability density function over that range, which gives us a positive probability.
This is similar to how we measure height - while the probability of someone being exactly 6'2.000000..." tall is technically zero, we can meaningfully talk about someone being 6'2" tall by considering the practical range of measurement.
While the theoretical probability of an exact point value in a continuous distribution is zero, IQ scores in practice represent small ranges, which do have non-zero probabilities of occurring. This is why we can meaningfully say someone has an IQ of 100.
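For concreteness, here's a minimal sketch of that range calculation, assuming the usual IQ scaling (mean 100, standard deviation 15):

```python
# Point probability vs. bucket probability under a normal model,
# assuming mean 100 and standard deviation 15 (the common IQ scaling).
from scipy.stats import norm

dist = norm(loc=100, scale=15)

# P(X = 100) for a continuous variable: the integral from 100 to 100.
p_point = dist.cdf(100) - dist.cdf(100)        # exactly 0.0

# P(99.5 <= X < 100.5): the density integrated over the rounding bucket.
p_bucket = dist.cdf(100.5) - dist.cdf(99.5)    # ~0.0266

print(p_point, p_bucket)
```

Roughly 2.7% of the population lands in the whole-number bucket labelled 100, even though the exact point carries zero probability.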
Luke73 · 22-25, M
@Northwest By putting them in buckets, it's not a normal distribution anymore, which contradicts the assumption - that's a different model then. Which model is more practical is debatable, of course.
The integral of the density function from 100 to 100 is 0, and thus the probability is 0. You only get a positive result if you integrate over a larger interval.
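To make the "that's another model" point concrete, here's a sketch (same assumed mean 100, SD 15) that buckets the normal at half-integer cut points, yielding a genuinely discrete distribution over whole-number scores:

```python
# Bucketing the normal at half-integers gives a *discrete* model:
# a probability mass function over whole-number scores.
from scipy.stats import norm

dist = norm(loc=100, scale=15)   # assumed IQ scaling

# P(score = k) = P(k - 0.5 <= X < k + 0.5) for each integer k.
pmf = {k: dist.cdf(k + 0.5) - dist.cdf(k - 0.5) for k in range(0, 201)}

print(pmf[100])            # ~0.0266: positive, unlike the point mass
print(sum(pmf.values()))   # ~1.0 (mass outside [-0.5, 200.5] is negligible)
```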
What? 😳🤔😂
Er... no. What you've said doesn't even make sense.
One hundred was selected to be the average value for IQ tests in much the same way that zero was chosen as the freezing point of water and 100 as its boiling point (I am, of course, talking about the Celsius temperature scale, and not that weird Fahrenheit one).
In other words, it's completely arbitrary, and since scores cluster around the average (which, for a roughly normal distribution, is also the most common value), it's the score you're most likely to encounter.
Shybutwilling2bfriends · 61-69
Ok if you say so
Luke73 · 22-25, M
@Shybutwilling2bfriends It's not just because I say so; it's an objectively true statement.
Great mathematical argument, but it doesn't really matter
nedkelly · 61-69, M
Especially white women in the USA
4meAndyou · F
Math deficient, here. 😁😁😁