# Dragon Notes


# Randomness & Probability: Caveats

[Fair but dependent]

: Two fair coin tosses are not necessarily independent
Consider a fair toss experiment with a penny and a nickel, with tails mapped to 0 and heads to 1. Assume the joint PMF below:

$$\begin{array}{c|cc|c} p_{X,Y}[i,j] & j=0 & j=1 & p_X[i]\\ \hline i=0 & \frac{3}{8} & \frac{1}{8} & \frac{1}{2}\\ i=1 & \frac{1}{8} & \frac{3}{8} & \frac{1}{2}\\ \hline p_Y[j] & \frac{1}{2} & \frac{1}{2} & \end{array}$$

Examining the PMF confirms the coins are fair, since $$p_X[i]=p_Y[j]=1/2$$ for all $$i,j$$; however, since $$p_{X,Y}[0,0]=3/8\neq (1/2)(1/2)=p_X[0]p_Y[0]$$, $$X$$ and $$Y$$ are dependent.
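
The two facts can be checked directly from the table; a minimal sketch, with the joint PMF entered verbatim:

```python
# Verify the penny-nickel joint PMF: both marginals are fair, yet X and Y
# are dependent because the joint PMF is not the product of the marginals.

joint = {  # p_{X,Y}[i, j], taken from the table
    (0, 0): 3/8, (0, 1): 1/8,
    (1, 0): 1/8, (1, 1): 3/8,
}

# Marginals: p_X[i] = sum over j of p_{X,Y}[i, j], and symmetrically for p_Y[j]
p_X = {i: sum(p for (a, b), p in joint.items() if a == i) for i in (0, 1)}
p_Y = {j: sum(p for (a, b), p in joint.items() if b == j) for j in (0, 1)}

print(p_X, p_Y)                        # both {0: 0.5, 1: 0.5} -> fair coins
print(joint[(0, 0)], p_X[0] * p_Y[0])  # 0.375 vs 0.25 -> dependent
```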

[Zero covar $$\neq$$ indep]

: Independence implies zero covariance but zero covariance does not imply independence
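
A standard counterexample (an illustration, not from the notes): take $$X$$ uniform on $$\{-1,0,1\}$$ and $$Y=X^2$$; then $$\t{cov}(X,Y)=E[XY]-E[X]E[Y]=E[X^3]=0$$, yet $$Y$$ is completely determined by $$X$$.

```python
# X uniform on {-1, 0, 1}, Y = X^2: zero covariance but fully dependent.
pmf = {-1: 1/3, 0: 1/3, 1: 1/3}              # p_X[x]
E = lambda f: sum(f(x) * p for x, p in pmf.items())

# Since Y = X^2, we have XY = X^3, so all expectations reduce to moments of X
EX, EY, EXY = E(lambda x: x), E(lambda x: x**2), E(lambda x: x**3)
cov = EXY - EX * EY
print(cov)                                    # 0.0 -> uncorrelated

# Dependence: p_{X,Y}(1,1) = P(X=1) = 1/3, but p_X(1) p_Y(1) = (1/3)(2/3) = 2/9
```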

[$$E[g(X)]\neq g(E[X])$$]

For $$X$$ with $$E[X]=2$$ and $$E[X^2]=6$$, if $$g(X)=X^2$$, then $$E[g(X)]=E[X^2]=6$$ but $$g(E[X])=(E[X])^2=4\neq E[g(X)]$$.
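
A concrete rv matching these numbers is $$X\sim\t{Poisson}(2)$$ (an assumption for illustration: $$E[X]=\lambda=2$$ and $$E[X^2]=\lambda+\lambda^2=6$$); a quick numeric check:

```python
from math import exp, factorial

# X ~ Poisson(2), assumed for illustration: E[X] = 2 and E[X^2] = 6.
lam = 2.0
pmf = lambda k: exp(-lam) * lam**k / factorial(k)

# Truncated sums; the Poisson tail beyond k = 100 is negligible here
EX  = sum(k    * pmf(k) for k in range(100))   # E[X]   -> 2
EX2 = sum(k**2 * pmf(k) for k in range(100))   # E[X^2] -> 6

print(EX2, EX**2)   # 6 vs 4: E[g(X)] != g(E[X]) for g(x) = x^2
```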

[Not all PMFs have expected values]

Consider the PMF $$\ds p_X[k]=\frac{6/\pi^2}{k^2},\ \ k=1,2,...$$ (the constant is $$6/\pi^2$$ since $$\sum_{k=1}^{\infty}1/k^2=\pi^2/6$$, making the PMF sum to 1). Attempting to find the expected value produces the divergent harmonic series:
$$\ds E[X]=\sum_{k=1}^{\infty}k\,p_X[k]=\frac{6}{\pi^2}\sum_{k=1}^{\infty}\frac{1}{k}\rightarrow \infty$$
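
A quick numeric sketch of the divergence (using the normalizing constant $$6/\pi^2$$, which follows from $$\sum 1/k^2=\pi^2/6$$): the partial sums of $$E[X]$$ grow like $$\ln N$$ and never settle.

```python
from math import pi

# Partial sums of E[X] for p_X[k] = (6/pi^2)/k^2: they grow ~ (6/pi^2) ln(N),
# so the expected value does not exist.
c = 6 / pi**2
partial = lambda N: c * sum(1 / k for k in range(1, N + 1))

for N in (10**2, 10**4, 10**6):
    print(N, partial(N))   # keeps growing without bound
```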

[Assessing independence - mind PDF domain]

The joint PDF given by

$$\ds p_{X,Y}(x,y)=\begin{cases} 2\t{exp}[-(x+y)], & x\geq 0,\ y\geq 0,\ \t{and } y < x \\ 0, & \t{otherwise} \end{cases}$$

is not factorable: the region of the $$xy$$-plane where $$p_{X,Y}(x,y)\neq 0$$ is not a Cartesian product of an $$x$$-interval and a $$y$$-interval, so the PDF cannot be written as $$g(x)h(y)$$.
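
A numeric spot check (a sketch, with the marginals derived by direct integration: $$p_X(x)=\int_0^x 2e^{-(x+y)}dy=2e^{-x}(1-e^{-x})$$ and $$p_Y(y)=\int_y^\infty 2e^{-(x+y)}dx=2e^{-2y}$$):

```python
from math import exp

# Joint PDF with triangular support y < x, and its marginals (derived above)
joint = lambda x, y: 2 * exp(-(x + y)) if (x >= 0 and y >= 0 and y < x) else 0.0
p_X   = lambda x: 2 * exp(-x) * (1 - exp(-x))   # valid for x >= 0
p_Y   = lambda y: 2 * exp(-2 * y)               # valid for y >= 0

x, y = 1.0, 0.5
print(joint(x, y), p_X(x) * p_Y(y))   # ~0.446 vs ~0.342 -> joint != product
print(joint(0.5, 1.0))                # 0.0 inside the product of the supports
```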

[Covariance vs. correlation]

Covariance $$=$$ the expected product of deviations of two rv's from their individual expected values
 - Measures how much variables change together
   - Has dimensions; $$[\t{cov}] = [\t{var}_1][\t{var}_2]$$
   - Positive covariance $$=$$ $$\uparrow (\t{var}_1)\ \rightarrow\ \uparrow (\t{var}_2)$$
   - Negative covariance $$=$$ $$\uparrow (\t{var}_1)\ \rightarrow\ \downarrow (\t{var}_2)$$
 - Independent rv's $$\rightarrow \t{cov}=0$$

Correlation $$=$$ the strength of linear association between two rv's
 - Is dimensionless
 - $$\t{corr}=0\ \not\rightarrow$$ independent rv's
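
The dimensions point can be made concrete (with assumed illustrative data, not from the notes): rescaling a variable's units rescales the covariance but leaves the correlation coefficient unchanged.

```python
# Covariance carries units; the correlation coefficient does not.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]     # e.g. lengths in meters (assumed data)
ys = [2.1, 3.9, 6.2, 8.1, 9.8]     # roughly linear in xs

def mean(v): return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

def corr(a, b):
    return cov(a, b) / (cov(a, a) ** 0.5 * cov(b, b) ** 0.5)

km = [x / 1000 for x in xs]         # same data, now in kilometers
print(cov(xs, ys), cov(km, ys))     # covariance shrinks by a factor of 1000
print(corr(xs, ys), corr(km, ys))   # correlation is unchanged
```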

[Uncorr & indep]

Uncorrelated implies independent only when the *joint* PDF is multivariate Gaussian; Gaussian marginal PDF's alone do not suffice

[Stationarity & realizations]

It is impossible to determine if a random process is stationary from a single realization
We cannot determine whether a coin is fair by observing that a single toss came up heads; multiple realizations of the coin-tossing experiment are required. The same holds for random processes.
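
A minimal simulation of the coin analogy (assuming a Bernoulli process with $$p=0.5$$, chosen for illustration): a single sample at a fixed time tells us essentially nothing about $$p$$, while averaging across many realizations recovers it.

```python
import random

# Bernoulli(p) "coin process": estimate the mean at one fixed time instant.
random.seed(0)                      # fixed seed for reproducibility
p, n_realizations, t = 0.5, 10_000, 3

# One realization: a single 0/1 observation at time t -- p is unrecoverable
one_realization = [random.random() < p for _ in range(10)]
print(one_realization[t])

# Many realizations: the ensemble average at time t estimates p
ensemble = [random.random() < p for _ in range(n_realizations)]
print(sum(ensemble) / n_realizations)   # close to 0.5
```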