Likelihood Of Success (LOS) in the real world

Discussion of chess software programming and technical issues.

Moderators: hgm, Rebel, chrisw

AlvaroBegue
Posts: 931
Joined: Tue Mar 09, 2010 3:46 pm
Location: New York
Full name: Álvaro Begué (RuyDos)

Re: Likelihood Of Success (LOS) in the real world

Post by AlvaroBegue »

Laskos wrote:Even so, as a non-mathematician, I find information ("probabilities") inferred from Bayesian reasoning more intuitive for making either quantitative or qualitative statements about probabilistic events than the degree of confidence in rejecting a null hypothesis. Besides that, rejecting a null hypothesis should be done with care, with the experiment pre-set to certain constraints.

Suppose a man came to you with a coin, and said "whenever heads come up I win a dollar, whenever tails come up you win a dollar". You believe the coin is fair, and start the game. Your prior for the coin is the following:

[Image: plot of the prior for the coin]

Based on that, you estimate the LOS of the coin at 50.0%.
After 5 tosses, the result comes out unfavorably: 5 heads, 0 tails.
Based on that, you estimate the LOS of the coin at 55.3%.

So you begin to suspect that the coin is unfair, although your suspicion is mild (55.3%). You take another, mild prior for the coin, one slightly favoring heads (since the man proposed the game):

[Image: plot of the second prior, slightly favoring heads]

Based on the new prior, you re-interpret the LOS of the coin after the same 5-0 result (no more tosses) as 98.9%. So you come to the easily interpretable conclusion that the man is cheating you, based on a mild suspicion and a mild prior.

That, for me, is better than playing with the more rigorous but vague null hypothesis (which, in this particular case, wouldn't tell you anything "rigorously and scientifically").
I just want to make a note on naming conventions. The first graph is mislabelled. Instead of "likelihood" it should have said something like "prior probability density", except it's not normalized to have integral 1.

Likelihood is the probability of observing the data as a function of the parameter, which is precisely what the second plot shows.

Bayes's formula tells you how to combine prior probability and likelihood (multiply and rescale so the integral is 1) to obtain the posterior probability.
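That multiply-and-rescale step can be sketched numerically on a grid. The prior below is a hypothetical stand-in (a Gaussian bump at p = 0.5, standing in for "I believe the coin is fair"), not the actual prior from the plots, so the number it prints will not match the 55.3% in the post:

```python
import numpy as np

# Grid over the coin's heads-probability p (endpoints excluded).
p = np.linspace(0.0, 1.0, 2001)[1:-1]

# Hypothetical prior: sharply peaked at p = 0.5 (a coin believed to be fair).
prior = np.exp(-0.5 * ((p - 0.5) / 0.05) ** 2)

# Likelihood of observing 5 heads, 0 tails, as a function of p.
likelihood = p ** 5 * (1 - p) ** 0

# Bayes: posterior is proportional to prior * likelihood; the normalization
# constant cancels in the probability ratio below.
posterior = prior * likelihood

# "LOS of the coin": posterior probability that the coin favors heads (p > 0.5).
los = posterior[p > 0.5].sum() / posterior.sum()
print(f"P(p > 0.5 | 5 heads, 0 tails) = {los:.3f}")
```

With this particular prior the 5-0 result only nudges the posterior mildly above 50%, illustrating why a tight "the coin is fair" prior produces a mild LOS even after a lopsided sample.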
Laskos
Posts: 10948
Joined: Wed Jul 26, 2006 10:21 pm
Full name: Kai Laskos

Re: Likelihood Of Success (LOS) in the real world

Post by Laskos »

AlvaroBegue wrote:
Laskos wrote:Even so, as a non-mathematician, I find information ("probabilities") inferred from Bayesian reasoning more intuitive for making either quantitative or qualitative statements about probabilistic events than the degree of confidence in rejecting a null hypothesis. Besides that, rejecting a null hypothesis should be done with care, with the experiment pre-set to certain constraints.

Suppose a man came to you with a coin, and said "whenever heads come up I win a dollar, whenever tails come up you win a dollar". You believe the coin is fair, and start the game. Your prior for the coin is the following:

[Image: plot of the prior for the coin]

Based on that, you estimate the LOS of the coin at 50.0%.
After 5 tosses, the result comes out unfavorably: 5 heads, 0 tails.
Based on that, you estimate the LOS of the coin at 55.3%.

So you begin to suspect that the coin is unfair, although your suspicion is mild (55.3%). You take another, mild prior for the coin, one slightly favoring heads (since the man proposed the game):

[Image: plot of the second prior, slightly favoring heads]

Based on the new prior, you re-interpret the LOS of the coin after the same 5-0 result (no more tosses) as 98.9%. So you come to the easily interpretable conclusion that the man is cheating you, based on a mild suspicion and a mild prior.

That, for me, is better than playing with the more rigorous but vague null hypothesis (which, in this particular case, wouldn't tell you anything "rigorously and scientifically").
I just want to make a note on naming conventions. The first graph is mislabelled. Instead of "likelihood" it should have said something like "prior probability density", except it's not normalized to have integral 1.

Likelihood is the probability of observing the data as a function of the parameter, which is precisely what the second plot shows.

Bayes's formula tells you how to combine prior probability and likelihood (multiply and rescale so the integral is 1) to obtain the posterior probability.
Both plots are priors (or "prior probability densities"); I mislabeled them. The 55.3% and 98.9% figures are posterior probabilities obtained using these two different priors.
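The prior-sensitivity point can be sketched the same way: push the same 5-0 data through two different priors and compare the posterior probabilities that the coin favors heads. The two Beta-shaped priors below are my own hypothetical stand-ins for the ones in the plots, so the exact numbers will differ from 55.3% and 98.9%, but they show the same qualitative jump:

```python
import numpy as np

p = np.linspace(0.0, 1.0, 2001)[1:-1]  # grid over heads-probability p

def prob_coin_favors_heads(prior, heads=5, tails=0):
    """Posterior P(p > 0.5 | data), with posterior proportional to prior * likelihood."""
    posterior = prior * p ** heads * (1 - p) ** tails
    return posterior[p > 0.5].sum() / posterior.sum()

# Stand-in prior 1: Beta(50, 50)-shaped, a strong belief the coin is fair.
fair_prior = p ** 49 * (1 - p) ** 49

# Stand-in prior 2: Beta(6, 4)-shaped, mildly favoring heads.
suspicious_prior = p ** 5 * (1 - p) ** 3

print(f"fair prior:       {prob_coin_favors_heads(fair_prior):.3f}")
print(f"suspicious prior: {prob_coin_favors_heads(suspicious_prior):.3f}")
```

The same 5-0 evidence leaves the "fair" prior only mildly suspicious but pushes the mildly heads-favoring prior close to certainty, which is exactly the re-interpretation described above.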