11. Posterior Using Gibbs' Sampler

Approx. Using Gibbs' Sampler

Example

Parameters: $\sigma^2$ and $\mu$
Likelihood: $p(y_i \mid \mu, \sigma^2)$ is $N(\mu, \sigma^2)$
Prior: $p(\mu, \sigma^2) \overset{\text{ind.}}{=} p(\mu)\,p(\sigma^2)$

$p(\mu) = 1$ and $p(\sigma^2) = \frac{1}{\sigma^2} = (\sigma^2)^{-1}$ (Jeffreys' prior)

$$p(\{y\} \mid \mu, \sigma^2) = \left[\frac{1}{2\pi}\right]^{n/2} (\sigma^2)^{-n/2} \exp\left\{-\left[\frac{\sum_i (y_i - \mu)^2}{2}\right]\left[\frac{1}{\sigma^2}\right]\right\}$$

$$p(\mu, \sigma^2 \mid \{y\}) \propto (\sigma^2)^{-n/2 - 1} \exp\left\{-\left[\frac{\sum_i (y_i - \mu)^2}{2}\right]\left[\frac{1}{\sigma^2}\right]\right\}$$

Key idea: we have a bivariate posterior distribution, but we can sample it one coordinate at a time through its univariate full conditionals.

Want to find the "full" conditional for $\sigma^2$:

$$p(\sigma^2 \mid \{y\}, \mu) \ \text{is inv. gamma}\!\left(r = \frac{n}{2},\ v = \frac{\sum_i (y_i - \mu)^2}{2}\right)$$
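As a minimal sketch (my illustration, not part of the notes), this draw can be implemented in NumPy using the fact that if $X \sim \text{gamma}(\text{shape}=r,\ \text{rate}=v)$ then $1/X \sim \text{inv. gamma}(r, v)$; the function name `sample_sigma2` and the `rng` argument are my own choices:

```python
import numpy as np

def sample_sigma2(y, mu, rng):
    """Draw sigma^2 from its inv. gamma(r = n/2, v = sum((y_i - mu)^2)/2)
    full conditional, via 1/X with X ~ gamma(shape=r, rate=v)."""
    n = len(y)
    r = n / 2.0
    v = np.sum((y - mu) ** 2) / 2.0
    # NumPy parameterizes gamma by scale, so rate v becomes scale 1/v.
    return 1.0 / rng.gamma(shape=r, scale=1.0 / v)

# usage: rng = np.random.default_rng(); sigma2 = sample_sigma2(y, mu, rng)
```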

"Full" conditional for μ:

p(μ|{y},σ2)exp{[(yμ)22][1σ2]}exp{((yy¯)+(y¯μ))22σ}exp{(yy¯)2+2(yy¯)(y¯μ)+(y¯μ)22σ}exp{(yy¯)2+n(y¯μ)22σ}exp{12(σ2n)[μy¯]2}

N(mean=y¯,variance=σ2n)
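A matching sketch for the $\mu$ draw, under the same assumptions as above (NumPy already imported, hypothetical name `sample_mu`):

```python
def sample_mu(y, sigma2, rng):
    """Draw mu from its N(ybar, sigma^2/n) full conditional."""
    n = len(y)
    return rng.normal(loc=np.mean(y), scale=np.sqrt(sigma2 / n))
```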

General Idea:
At each step of the algorithm, sample each parameter in turn from its univariate full conditional, plugging in the most recently sampled values of the other parameters, as in the sweep below.

$$
\begin{aligned}
1.\ &\text{sample } \phi_1^{(s)} \sim p\big(\phi_1 \mid \phi_2^{(s-1)}, \phi_3^{(s-1)}, \ldots, \phi_p^{(s-1)}\big) \\
2.\ &\text{sample } \phi_2^{(s)} \sim p\big(\phi_2 \mid \phi_1^{(s)}, \phi_3^{(s-1)}, \ldots, \phi_p^{(s-1)}\big) \\
&\ \vdots \\
p.\ &\text{sample } \phi_p^{(s)} \sim p\big(\phi_p \mid \phi_1^{(s)}, \phi_2^{(s)}, \ldots, \phi_{p-1}^{(s)}\big)
\end{aligned}
$$

This is a type of Markov chain Monte Carlo (MCMC): the draws $\phi^{(s)}$ form a Markov chain whose distribution converges to the target,

$$\Pr\big(\phi^{(s)} \in A\big) \to \int_A p(\phi)\, d\phi \quad \text{as } s \to \infty.$$
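Putting the two full conditionals together gives the Gibbs sampler for this example. The self-contained sketch below is an illustration under assumed defaults (starting value, number of sweeps, burn-in, and the simulated data are all arbitrary choices, not from the notes):

```python
import numpy as np

def gibbs(y, n_sweeps=5000, sigma2_init=1.0, seed=0):
    """Gibbs sampler for (mu, sigma^2) under p(mu) = 1, p(sigma^2) = 1/sigma^2."""
    rng = np.random.default_rng(seed)
    n, ybar = len(y), np.mean(y)
    draws = np.empty((n_sweeps, 2))              # columns: mu, sigma^2
    sigma2 = sigma2_init
    for s in range(n_sweeps):
        # 1. sample mu | y, sigma^2  ~  N(ybar, sigma^2/n)
        mu = rng.normal(ybar, np.sqrt(sigma2 / n))
        # 2. sample sigma^2 | y, mu  ~  inv. gamma(n/2, sum((y_i - mu)^2)/2)
        v = np.sum((y - mu) ** 2) / 2.0
        sigma2 = 1.0 / rng.gamma(n / 2.0, 1.0 / v)
        draws[s] = mu, sigma2
    return draws

# Illustration: simulated data with true mu = 3, sigma^2 = 4.
y = np.random.default_rng(1).normal(3.0, 2.0, size=100)
print(gibbs(y)[1000:].mean(axis=0))              # posterior means after burn-in
```

Note that step 2 of each sweep uses the freshly drawn $\mu$, matching the general algorithm above.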