2. Bayes Inference for Discrete Random Variables

Bayesian Universe and Table

Let's say we would like to infer the number of red balls, N, given the number observed in a sample, y.

Bayesian Universe: 2-dimensional (N,y)

Vertical dimension: possible values for parameter (unobservable)
Horizontal dimension: sample values (observable)

|          | prior  | y_1        | ⋯ | y_j        | ⋯ | y_J        |
|----------|--------|------------|---|------------|---|------------|
| x_1      | g(x_1) | f(x_1,y_1) | ⋯ | f(x_1,y_j) | ⋯ | f(x_1,y_J) |
| ⋮        | ⋮      | ⋮          |   | ⋮          |   | ⋮          |
| x_i      | g(x_i) | f(x_i,y_1) | ⋯ | f(x_i,y_j) | ⋯ | f(x_i,y_J) |
| ⋮        | ⋮      | ⋮          |   | ⋮          |   | ⋮          |
| x_I      | g(x_I) | f(x_I,y_1) | ⋯ | f(x_I,y_j) | ⋯ | f(x_I,y_J) |
| marginal |        | f(y_1)     | ⋯ | f(y_j)     | ⋯ | f(y_J)     |

Reduced universe after observing Y = y_j: only that column remains

(x_1, y_j), …, (x_i, y_j), …, (x_I, y_j)
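The joint table and the reduced universe can be sketched in code. The urn numbers below (4 balls, a uniform prior on the red count N, a single draw) are hypothetical choices for illustration:

```python
# Sketch of the Bayesian universe as a joint table (hypothetical urn:
# 4 balls total, N of them red with N in {0,...,4}; draw one ball,
# y = 1 if it is red). g is a uniform prior over N.

N_values = [0, 1, 2, 3, 4]
g = {N: 1 / 5 for N in N_values}                  # prior g(x_i)

def likelihood(y, N, total=4):
    """f(y | N): chance of drawing a red ball (y = 1) from the urn."""
    p_red = N / total
    return p_red if y == 1 else 1 - p_red

# Joint table: f(x_i, y_j) = g(x_i) * f(y_j | x_i)
joint = {(N, y): g[N] * likelihood(y, N) for N in N_values for y in (0, 1)}

# Column marginals: f(y_j) = sum over the rows
f_y = {y: sum(joint[(N, y)] for N in N_values) for y in (0, 1)}

# Reduced universe after observing Y = 1: keep only that column, renormalize
posterior = {N: joint[(N, 1)] / f_y[1] for N in N_values}
```

With a uniform prior, the posterior on N is simply proportional to the joint entries in the observed column, which is exactly what the reduced universe expresses.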

D is the observed data
A_1, …, A_n are events that partition the universe (mutually exclusive and exhaustive)

| Scenario | Prior P(A_i) | Likelihood P(D\|A_i) | Joint P(D ∩ A_i)                | Posterior P(A_i\|D)        |
|----------|--------------|----------------------|---------------------------------|----------------------------|
| A_1      | P(A_1)       | P(D\|A_1)            | P(D ∩ A_1) = P(D\|A_1)P(A_1)    | P(D\|A_1)P(A_1) / P(D)     |
| A_2      | P(A_2)       | P(D\|A_2)            | P(D ∩ A_2) = P(D\|A_2)P(A_2)    | P(D\|A_2)P(A_2) / P(D)     |
| ⋮        | ⋮            | ⋮                    | ⋮                               | ⋮                          |
| A_n      | P(A_n)       | P(D\|A_n)            | P(D ∩ A_n) = P(D\|A_n)P(A_n)    | P(D\|A_n)P(A_n) / P(D)     |
| Total    | 1            |                      | Σ_{i=1}^{n} P(D ∩ A_i) = P(D)   | 1                          |

When new observations arrive, our posteriors become the priors for the next update
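This sequential use of Bayes' theorem can be sketched as a small update function; the two scenarios and their likelihood values below are made-up numbers:

```python
# Minimal sketch of sequential updating: after each observation the
# posterior over the scenarios A_i serves as the prior for the next one.

def update(prior, likelihood):
    """One pass over the Bayes table: joint, marginal, posterior."""
    joint = [p * l for p, l in zip(prior, likelihood)]   # P(D ∩ A_i)
    p_d = sum(joint)                                     # P(D)
    return [j / p_d for j in joint]                      # P(A_i | D)

prior = [0.5, 0.5]                   # hypothetical two-scenario prior
post1 = update(prior, [0.8, 0.3])    # after the first observation
post2 = update(post1, [0.8, 0.3])    # yesterday's posterior is today's prior
```

Each call normalizes by P(D), so the output is always a proper probability distribution over the scenarios.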

For Multiple Trials Which We Can Classify as Success or Failure:
We use the binomial distribution

$$f(x) = \binom{n}{x} p^x (1-p)^{n-x}, \qquad \binom{n}{x} = \frac{n!}{x!\,(n-x)!}$$

to calculate our likelihood.
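A minimal sketch of this binomial likelihood using Python's standard library (`math.comb`); the example numbers are arbitrary:

```python
from math import comb

def binom_pmf(x, n, p):
    """Binomial likelihood: P(X = x) for x successes in n trials."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# e.g. probability of 3 successes in 5 trials with p = 0.5
prob = binom_pmf(3, 5, 0.5)   # 10 * 0.125 * 0.25 = 0.3125
```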

Parameter of interest: Proportion (π)
Prior Distribution: Discrete probability distributions
Likelihood: Binomial
Posterior Distribution: Discrete probability distribution using Bayes' theorem
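Putting these pieces together, here is a sketch of the discrete-prior update for a proportion; the grid of π values and the data (7 successes in 10 trials) are hypothetical:

```python
from math import comb

# Sketch: discrete prior over the proportion π, binomial likelihood,
# posterior via Bayes' theorem. Grid and data are made-up examples.

pis = [0.2, 0.4, 0.6, 0.8]
prior = [0.25, 0.25, 0.25, 0.25]           # discrete uniform prior

n, y = 10, 7                               # 7 successes in 10 trials
lik = [comb(n, y) * p**y * (1 - p)**(n - y) for p in pis]

joint = [g * l for g, l in zip(prior, lik)]
p_data = sum(joint)                        # P(D), the normalizing constant
posterior = [j / p_data for j in joint]
```

With 7 successes in 10 trials, the posterior concentrates on the grid value closest to the sample proportion, π = 0.6.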

Expectation

Ai are the outcomes that we are modelling

$$E[\text{Model} \mid D] = \sum_{i=1}^{n} A_i \, P(A_i \mid D)$$
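A quick sketch of this posterior expectation; the outcome values and posterior probabilities below are hypothetical:

```python
# Posterior expectation of the modelled outcome:
#   E[Model | D] = sum_i A_i * P(A_i | D)

outcomes = [0, 1, 2, 3]                    # the outcomes A_i being modelled
posterior = [0.1, 0.2, 0.3, 0.4]           # P(A_i | D), made-up values

expectation = sum(a * p for a, p in zip(outcomes, posterior))
# 0*0.1 + 1*0.2 + 2*0.3 + 3*0.4 = 2.0
```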