# Notes for Bayesian Statistics, Fall 2023

## Discrete prior

## Continuous prior via conjugacy

### Binomial-Beta model

### Poisson-Gamma model

### Normal-Normal model (known variance)

Likelihood:
$$Y_i \sim \mathcal{N}(\mu, \sigma^2)$$

Prior:
$$\mu \sim \mathcal{N}(m, s^2)$$

Posterior:
$$
\begin{align}
p(\mu \mid y_1, \ldots, y_n) & \propto p(y_1, \ldots, y_n \mid \mu) \, p(\mu) \\
&= \left( \prod_{i=1}^n p(y_i \mid \mu)\right) p(\mu) \\
&= \left( \prod_{i=1}^n \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{1}{2\sigma^2} (y_i - \mu)^2} \right)\frac{1}{\sqrt{2\pi s^2}} e^{-\frac{1}{2 s^2} (\mu - m)^2} \\
&\propto \left( \prod_{i=1}^n e^{-\frac{1}{2\sigma^2} (y_i - \mu)^2} \right) e^{-\frac{1}{2 s^2} (\mu - m)^2} \\
&= e^{-\frac{1}{2\sigma^2} \sum_i (y_i - \mu)^2 -\frac{1}{2 s^2} (\mu - m)^2} \\
&\propto e^{-\frac{1}{2}\left[ \left( \frac{n}{\sigma^2} + \frac{1}{s^2} \right) \mu^2 - 2\mu \left( \frac{n \bar{y}}{\sigma^2} + \frac{m}{s^2} \right) \right]} \\
&\propto e^{-\frac{1}{2 s_n^2} (\mu - m_n)^2}
\end{align}
$$

Here $\bar{y} = \frac{1}{n}\sum_{i=1}^n y_i$; the second-to-last step expands the squares and drops factors that do not involve $\mu$, and the last step completes the square in $\mu$ with
$$
s_n^2 = \left( \frac{n}{\sigma^2} + \frac{1}{s^2} \right)^{-1}, \qquad
m_n = s_n^2 \left( \frac{n \bar{y}}{\sigma^2} + \frac{m}{s^2} \right).
$$

Hence the posterior is again normal,
$$\mu \mid y_1, \ldots, y_n \sim \mathcal{N}(m_n, s_n^2),$$
so the normal prior is conjugate for the normal likelihood when $\sigma^2$ is known. (A numerical sketch of this update appears at the end of these notes.)

### Normal-Normal model (unknown variance)

## Posterior predictive

Simulating from the posterior predictive (see the code sketch at the end of these notes)

Model checking

## Tuning priors
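
---

A minimal NumPy sketch of the Normal-Normal conjugate update with known variance derived above. The function name `normal_normal_posterior` and the toy data are illustrative choices, not part of the course material.

```python
import numpy as np

def normal_normal_posterior(y, sigma, m, s):
    """Conjugate update for a normal mean with known sigma.

    Prior: mu ~ N(m, s^2).  Likelihood: y_i ~ N(mu, sigma^2), iid.
    Returns the posterior mean m_n and posterior standard deviation s_n.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    s_n2 = 1.0 / (n / sigma**2 + 1.0 / s**2)           # posterior variance
    m_n = s_n2 * (n * y.mean() / sigma**2 + m / s**2)  # posterior mean
    return m_n, np.sqrt(s_n2)

# Toy data: 20 draws with true mean 2 and known sigma = 1 (illustrative only).
rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.0, size=20)
m_n, s_n = normal_normal_posterior(y, sigma=1.0, m=0.0, s=10.0)
print(f"mu | y ~ N({m_n:.3f}, {s_n:.3f}^2)")
```

With a vague prior (large $s$), $m_n$ is close to the sample mean and $s_n \approx \sigma / \sqrt{n}$, matching the closed form above.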
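A sketch of simulating from the posterior predictive and running a simple model check for the same Normal-Normal model, referring to the "Posterior predictive" section above. The test statistic (sample maximum) and all numbers are illustrative assumptions, not from the notes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data and model settings (normal likelihood, known sigma).
sigma = 1.0                      # known likelihood standard deviation
m, s = 0.0, 10.0                 # prior mean and sd for mu
y = rng.normal(loc=2.0, scale=sigma, size=20)

# Conjugate posterior for mu (same closed form as in the notes).
n = y.size
s_n2 = 1.0 / (n / sigma**2 + 1.0 / s**2)
m_n = s_n2 * (n * y.mean() / sigma**2 + m / s**2)

# Posterior predictive simulation: draw mu, then a replicated data set.
n_sims = 4000
mu_draws = rng.normal(m_n, np.sqrt(s_n2), size=n_sims)
y_rep = rng.normal(loc=mu_draws[:, None], scale=sigma, size=(n_sims, n))

# Model checking: compare an observed test statistic (here the sample max)
# with its distribution under the replicated data sets.
t_obs = y.max()
t_rep = y_rep.max(axis=1)
ppp = np.mean(t_rep >= t_obs)    # posterior predictive p-value
print(f"T(y) = {t_obs:.2f}, Pr(T(y_rep) >= T(y)) = {ppp:.2f}")
```

For a reasonable model the predictive p-value should not be extreme (near 0 or 1). For a single new observation, the predictive distribution is also available in closed form here, $\tilde{y} \mid y_1, \ldots, y_n \sim \mathcal{N}(m_n, s_n^2 + \sigma^2)$, so simulation mainly pays off for checks on whole replicated data sets.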