
Whoops, poor wording on my part.

Imagine flipping a perfectly fair coin 100 times. You'd expect to see 50 heads, but you don't always -- 50 is just an average. Suppose you see 75 heads. What is the chance that you'd see 75 or more heads with a fair coin? Very, very small. The chance that randomness could produce such a result is tiny.
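You can check how small with the exact binomial tail; a quick sketch using only Python's standard library:

```python
# Exact tail probability: P(at least 75 heads in 100 flips of a fair coin).
# math.comb gives exact binomial coefficients, so there is no rounding
# until the final division.
from math import comb

def tail_prob(n: int, k: int) -> float:
    """P(X >= k) for X ~ Binomial(n, 0.5)."""
    return sum(comb(n, j) for j in range(k, n + 1)) / 2 ** n

p = tail_prob(100, 75)
print(f"P(>=75 heads) = {p:.2e}")  # roughly 3e-7, i.e. about 3 in 10 million
```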

Now, imagine you test ten million perfectly fair coins. A few of them give 75 or more heads, just by luck. You conclude those coins are unfair, since such a result is unlikely otherwise. But the chance that randomness produced the effects you saw is actually 100%, because all the coins are fair.
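The screening effect is easy to quantify: each fair coin clears the 75-heads bar with probability about 3×10⁻⁷, so in a large enough batch a few will, purely by luck. A sketch (the ten-million batch size is my illustrative choice, not a fixed number):

```python
# Expected number of fair coins, out of a large batch, that show >= 75 heads
# in 100 flips purely by chance. The batch size is illustrative.
from math import comb

p_tail = sum(comb(100, j) for j in range(75, 101)) / 2 ** 100  # ~2.9e-7 per coin
n_coins = 10_000_000
expected_lucky = n_coins * p_tail
print(f"expected fair coins with >=75 heads: {expected_lucky:.1f}")  # about 3
```

Each of those "lucky" coins would look wildly significant in isolation, yet every one of them is fair.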

There's a difference between the question "How likely is this outcome to happen if the coin is fair?" and "Given that this outcome happened, how likely is it that the coin is fair?" Statistical significance addresses the first question, not the second.
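The gap between the two questions can be made concrete with Bayes' rule. The biased-coin alternative below (heads 75% of the time) and the priors are illustrative assumptions, not anything the coin example specifies:

```python
# P(data | fair) is the significance-test quantity; P(fair | data) also
# needs a prior and an alternative hypothesis. Here the alternative is an
# assumed biased coin that lands heads with probability 0.75.
from math import comb

def binom_pmf(n: int, k: int, p: float) -> float:
    """P(exactly k heads in n flips) for heads-probability p."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def posterior_fair(prior_fair: float, heads: int = 75, flips: int = 100) -> float:
    """P(coin is fair | observed heads), via Bayes' rule."""
    like_fair = binom_pmf(flips, heads, 0.5)     # ~1.9e-7
    like_biased = binom_pmf(flips, heads, 0.75)  # ~9.2e-2
    num = prior_fair * like_fair
    return num / (num + (1 - prior_fair) * like_biased)

print(posterior_fair(1.0))    # 1.0 -- if every coin is fair, 75 heads proves nothing
print(posterior_fair(0.999))  # ~0.002 -- a rare-bias prior flips the conclusion
```

Same data, same tiny p-value, opposite conclusions -- because the second question depends on the prior, which significance testing never touches.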



Thanks; I feel like I learned something. Your infinite patience was appreciated.



