Sampling & Estimation

Point Estimates

Use Maximum Likelihood Estimation (MLE) to estimate model parameters from sampled data.
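As a rough illustration (not part of the formal development), the following Python sketch computes the closed-form MLEs for a Normal sample and checks one of them against a numerical maximisation of the log-likelihood. The data, seed, and use of NumPy/SciPy are assumptions made purely for illustration.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

# Hypothetical sample; in practice this would be observed data.
rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=200)

# Closed-form MLEs for a Normal model:
#   mu_hat     = sample mean
#   sigma2_hat = mean squared deviation (divides by n, not n - 1)
mu_hat = data.mean()
sigma2_hat = np.mean((data - mu_hat) ** 2)

# Sanity check: numerically maximising the log-likelihood over mu
# (sigma fixed at its MLE) should recover essentially the same mu_hat.
def neg_log_lik(mu):
    return -np.sum(norm.logpdf(data, loc=mu, scale=np.sqrt(sigma2_hat)))

result = minimize_scalar(neg_log_lik, bounds=(0.0, 10.0), method="bounded")
print(mu_hat, sigma2_hat, result.x)   # result.x should be close to mu_hat
```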

Confidence Intervals

Relationship between confidence intervals (CI) and sampling distributions:

Formulae for Confidence Intervals (CI):

  1. CI for $\mu$: Normal RVs, Known Variance ($\sigma^2$)

$$P\left[-z_{\alpha/2} \leq \frac{\bar{X}-\mu}{\sigma/\sqrt{n}} \leq z_{\alpha/2}\right] = 1-\alpha$$

Sampling Distribution = Normal
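A minimal numerical sketch of this interval, assuming NumPy/SciPy are available and using made-up data with a known $\sigma$:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical measurements assumed Normal with KNOWN sigma.
sigma = 2.0
x = np.array([4.1, 5.3, 6.0, 4.8, 5.5, 5.1, 4.9, 5.7])
n, xbar = len(x), x.mean()

alpha = 0.05
z = norm.ppf(1 - alpha / 2)           # z_{alpha/2}
half_width = z * sigma / np.sqrt(n)
print(f"95% CI for mu: ({xbar - half_width:.3f}, {xbar + half_width:.3f})")
```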

  2. CI for $\mu$: Any RV, Unknown Variance ($\sigma^2$), $n \rightarrow \infty$

$$P\left[-z_{\alpha/2} \leq \frac{\bar{X}-\mu}{S/\sqrt{n}} \leq z_{\alpha/2}\right] \approx 1-\alpha$$

Sampling Distribution = Normal
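The same computation with the sample standard deviation $S$ in place of $\sigma$, sketched on a large simulated non-Normal sample (the data and seed are illustrative assumptions):

```python
import numpy as np
from scipy.stats import norm

# Large-n case: the data need not be Normal and sigma is unknown, so the
# sample standard deviation S replaces sigma (interval is approximate).
rng = np.random.default_rng(1)
x = rng.exponential(scale=3.0, size=500)   # illustrative non-Normal sample
n, xbar = len(x), x.mean()
S = x.std(ddof=1)                          # sample standard deviation

alpha = 0.05
z = norm.ppf(1 - alpha / 2)
half_width = z * S / np.sqrt(n)
print(f"approximate 95% CI for mu: ({xbar - half_width:.3f}, {xbar + half_width:.3f})")
```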

  3. CI for $\mu$: Normal RVs, Unknown Variance ($\sigma^2$)

$$P\left[-t_{\alpha/2,\,n-1} \leq \frac{\bar{X}-\mu}{S/\sqrt{n}} \leq t_{\alpha/2,\,n-1}\right] = 1-\alpha$$

Sampling Distribution = Student $t$
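A sketch of the corresponding $t$-interval for a small Normal sample (hypothetical data, SciPy assumed):

```python
import numpy as np
from scipy.stats import t

# Small Normal sample, unknown variance: Student t quantile with n-1 d.o.f.
x = np.array([4.1, 5.3, 6.0, 4.8, 5.5, 5.1, 4.9, 5.7])
n, xbar, S = len(x), x.mean(), x.std(ddof=1)

alpha = 0.05
t_crit = t.ppf(1 - alpha / 2, df=n - 1)    # t_{alpha/2, n-1}
half_width = t_crit * S / np.sqrt(n)
print(f"95% CI for mu: ({xbar - half_width:.3f}, {xbar + half_width:.3f})")
```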

  4. CI for $\sigma^2$: Normal RVs

$$P\left[\chi^2_{1-\alpha/2,\,n-1} \leq \frac{(n-1)S^2}{\sigma^2} \leq \chi^2_{\alpha/2,\,n-1}\right] = 1-\alpha$$

Sampling Distribution = Chi-squared
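A sketch of the variance interval, where $\chi^2_{\alpha/2,\,n-1}$ denotes the upper-tail critical value (hypothetical data, SciPy assumed):

```python
import numpy as np
from scipy.stats import chi2

# CI for sigma^2 from Normal data using the pivot (n-1) S^2 / sigma^2.
# chi^2_{alpha/2, n-1} is the UPPER-tail critical value, i.e. ppf(1 - alpha/2).
x = np.array([4.1, 5.3, 6.0, 4.8, 5.5, 5.1, 4.9, 5.7])
n = len(x)
S2 = x.var(ddof=1)                              # sample variance S^2

alpha = 0.05
lower = (n - 1) * S2 / chi2.ppf(1 - alpha / 2, df=n - 1)
upper = (n - 1) * S2 / chi2.ppf(alpha / 2, df=n - 1)
print(f"95% CI for sigma^2: ({lower:.3f}, {upper:.3f})")
```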

This chapter marks the beginning of the statistics portion of the course. Thus far we have dealt entirely with probability theory and the application of common distributions to model experimental data. In all the examples considered, we were always given the model parameters, which are summarised in Table 1 for the distributions considered thus far.

This chapter is concerned with estimating values for the model parameters from data that we can sample, and with quantifying the uncertainty of those estimates using methods from statistics. For many of the probability models discussed in the course, there is a direct correspondence between the expectation $E[X]$ and the model parameter(s); we shall formally derive this relationship in this chapter.

| Model | PMF/PDF $f_X(x \mid \text{parameters})$ | Range | $E[X]$ | $\operatorname{Var}[X]$ |
| --- | --- | --- | --- | --- |
| Uniform | $f_X(x \mid n) = \frac{1}{n}$ | $x = 1, \ldots, n$ | $\frac{n+1}{2}$ | $\frac{n^2-1}{12}$ |
| Bernoulli | $f_X(x \mid p) = p^x (1-p)^{1-x}$ | $x = 0, 1$ | $p$ | $p(1-p)$ |
| Binomial | $f_X(x \mid n, p) = \binom{n}{x} p^x (1-p)^{n-x}$ | $x = 0, 1, \ldots, n$ | $np$ | $np(1-p)$ |
| Poisson (Binomial approximation) | $f_X(x \mid n, p) = e^{-np} (np)^x / x!$ | $x = 0, 1, \ldots, n$ | $np$ | $np$ |
| Poisson (Average occurrences) | $f_X(x \mid \lambda) = e^{-\lambda} \lambda^x / x!$ | $x = 0, 1, 2, \ldots$ | $\lambda$ | $\lambda$ |
| Poisson (Average rate) | $f_X(x \mid \hat{\lambda}, T) = \frac{e^{-\hat{\lambda} T} (\hat{\lambda} T)^x}{x!}$ | $x = 0, 1, 2, \ldots$ | $\hat{\lambda} T$ | $\hat{\lambda} T$ |
| Exponential | $f_W(w \mid \hat{\lambda}) = \hat{\lambda} e^{-\hat{\lambda} w}$ | $w > 0$ | $\frac{1}{\hat{\lambda}}$ | $\frac{1}{\hat{\lambda}^2}$ |
| Normal | $f_X(x \mid \mu, \sigma) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-(x-\mu)^2 / 2\sigma^2}$ | $-\infty < x < \infty$ | $\mu$ | $\sigma^2$ |
| Standard Normal (No parameters) | $f_Z(z) = \frac{1}{\sqrt{2\pi}} e^{-z^2/2}$ | $-\infty < z < \infty$ | $0$ | $1$ |

Table 1: Summary of probability distributions and their properties.
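As a quick, informal check of the $E[X]$ column in Table 1, a short simulation (NumPy assumed; parameter values chosen arbitrarily) compares large-sample means against the corresponding parameters:

```python
import numpy as np

# Informal check of the E[X] column of Table 1: for large samples the sample
# mean should sit close to the expectation implied by the parameters.
rng = np.random.default_rng(42)

lam = 4.0
poisson_sample = rng.poisson(lam=lam, size=100_000)
print(poisson_sample.mean(), "vs E[X] = lambda =", lam)

rate = 0.5
exponential_sample = rng.exponential(scale=1 / rate, size=100_000)
print(exponential_sample.mean(), "vs E[W] = 1/lambda =", 1 / rate)
```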