
Dragon Notes


\( \newcommand{bvec}[1]{\overrightarrow{\boldsymbol{#1}}} \newcommand{bnvec}[1]{\overrightarrow{\boldsymbol{\mathrm{#1}}}} \newcommand{uvec}[1]{\widehat{\boldsymbol{#1}}} \newcommand{vec}[1]{\overrightarrow{#1}} \newcommand{\parallelsum}{\mathbin{\|}} \) \( \newcommand{s}[1]{\small{#1}} \newcommand{t}[1]{\text{#1}} \newcommand{tb}[1]{\textbf{#1}} \newcommand{ns}[1]{\normalsize{#1}} \newcommand{ss}[1]{\scriptsize{#1}} \newcommand{vpl}[]{\vphantom{\large{\int^{\int}}}} \newcommand{vplup}[]{\vphantom{A^{A^{A^A}}}} \newcommand{vplLup}[]{\vphantom{A^{A^{A^{A{^A{^A}}}}}}} \newcommand{vpLup}[]{\vphantom{A^{A^{A^{A^{A^{A^{A^A}}}}}}}} \newcommand{up}[]{\vplup} \newcommand{Up}[]{\vplLup} \newcommand{Uup}[]{\vpLup} \newcommand{vpL}[]{\vphantom{\Large{\int^{\int}}}} \newcommand{lrg}[1]{\class{lrg}{#1}} \newcommand{sml}[1]{\class{sml}{#1}} \newcommand{qq}[2]{{#1}_{\t{#2}}} \newcommand{ts}[2]{\t{#1}_{\t{#2}}} \) \( \newcommand{ds}[]{\displaystyle} \newcommand{dsup}[]{\displaystyle\vplup} \newcommand{u}[1]{\underline{#1}} \newcommand{tu}[1]{\underline{\text{#1}}} \newcommand{tbu}[1]{\underline{\bf{\text{#1}}}} \newcommand{bxred}[1]{\class{bxred}{#1}} \newcommand{Bxred}[1]{\class{bxred2}{#1}} \newcommand{lrpar}[1]{\left({#1}\right)} \newcommand{lrbra}[1]{\left[{#1}\right]} \newcommand{lrabs}[1]{\left|{#1}\right|} \newcommand{bnlr}[2]{\bn{#1}\left(\bn{#2}\right)} \newcommand{nblr}[2]{\bn{#1}(\bn{#2})} \newcommand{real}[1]{\Ree\{{#1}\}} \newcommand{Real}[1]{\Ree\left\{{#1}\right\}} \newcommand{abss}[1]{\|{#1}\|} \newcommand{umin}[1]{\underset{{#1}}{\t{min}}} \newcommand{umax}[1]{\underset{{#1}}{\t{max}}} \newcommand{und}[2]{\underset{{#1}}{{#2}}} \) \( \newcommand{bn}[1]{\boldsymbol{\mathrm{#1}}} \newcommand{bns}[2]{\bn{#1}_{\t{#2}}} \newcommand{b}[1]{\boldsymbol{#1}} \newcommand{bb}[1]{[\bn{#1}]} \) \( \newcommand{abs}[1]{\left|{#1}\right|} \newcommand{ra}[]{\rightarrow} \newcommand{Ra}[]{\Rightarrow} \newcommand{Lra}[]{\Leftrightarrow} \newcommand{rai}[]{\rightarrow\infty} 
\newcommand{ub}[2]{\underbrace{{#1}}_{#2}} \newcommand{ob}[2]{\overbrace{{#1}}^{#2}} \newcommand{lfrac}[2]{\large{\frac{#1}{#2}}\normalsize{}} \newcommand{sfrac}[2]{\small{\frac{#1}{#2}}\normalsize{}} \newcommand{Cos}[1]{\cos{\left({#1}\right)}} \newcommand{Sin}[1]{\sin{\left({#1}\right)}} \newcommand{Frac}[2]{\left({\frac{#1}{#2}}\right)} \newcommand{LFrac}[2]{\large{{\left({\frac{#1}{#2}}\right)}}\normalsize{}} \newcommand{Sinf}[2]{\sin{\left(\frac{#1}{#2}\right)}} \newcommand{Cosf}[2]{\cos{\left(\frac{#1}{#2}\right)}} \newcommand{atan}[1]{\tan^{-1}({#1})} \newcommand{Atan}[1]{\tan^{-1}\left({#1}\right)} \newcommand{intlim}[2]{\int\limits_{#1}^{#2}} \newcommand{lmt}[2]{\lim_{{#1}\rightarrow{#2}}} \newcommand{ilim}[1]{\lim_{{#1}\rightarrow\infty}} \newcommand{zlim}[1]{\lim_{{#1}\rightarrow 0}} \newcommand{Pr}[]{\t{Pr}} \newcommand{prop}[]{\propto} \newcommand{ln}[1]{\t{ln}({#1})} \newcommand{Ln}[1]{\t{ln}\left({#1}\right)} \newcommand{min}[2]{\t{min}({#1},{#2})} \newcommand{Min}[2]{\t{min}\left({#1},{#2}\right)} \newcommand{max}[2]{\t{max}({#1},{#2})} \newcommand{Max}[2]{\t{max}\left({#1},{#2}\right)} \newcommand{pfrac}[2]{\frac{\partial{#1}}{\partial{#2}}} \newcommand{pd}[]{\partial} \newcommand{zisum}[1]{\sum_{{#1}=0}^{\infty}} \newcommand{iisum}[1]{\sum_{{#1}=-\infty}^{\infty}} \newcommand{var}[1]{\t{var}({#1})} \newcommand{exp}[1]{\t{exp}\left({#1}\right)} \newcommand{mtx}[2]{\left[\begin{matrix}{#1}\\{#2}\end{matrix}\right]} \newcommand{nmtx}[2]{\begin{matrix}{#1}\\{#2}\end{matrix}} \newcommand{nmttx}[3]{\begin{matrix}\begin{align} {#1}& \\ {#2}& \\ {#3}& \\ \end{align}\end{matrix}} \newcommand{amttx}[3]{\begin{matrix} {#1} \\ {#2} \\ {#3} \\ \end{matrix}} \newcommand{nmtttx}[4]{\begin{matrix}{#1}\\{#2}\\{#3}\\{#4}\end{matrix}} \newcommand{mtxx}[4]{\left[\begin{matrix}\begin{align}&{#1}&\hspace{-20px}{#2}\\&{#3}&\hspace{-20px}{#4}\end{align}\end{matrix}\right]} \newcommand{mtxxx}[9]{\begin{matrix}\begin{align} &{#1}&\hspace{-20px}{#2}&&\hspace{-20px}{#3}\\ 
&{#4}&\hspace{-20px}{#5}&&\hspace{-20px}{#6}\\ &{#7}&\hspace{-20px}{#8}&&\hspace{-20px}{#9} \end{align}\end{matrix}} \newcommand{amtxxx}[9]{ \amttx{#1}{#4}{#7}\hspace{10px} \amttx{#2}{#5}{#8}\hspace{10px} \amttx{#3}{#6}{#9}} \) \( \newcommand{ph}[1]{\phantom{#1}} \newcommand{vph}[1]{\vphantom{#1}} \newcommand{mtxxxx}[8]{\begin{matrix}\begin{align} & {#1}&\hspace{-17px}{#2} &&\hspace{-20px}{#3} &&\hspace{-20px}{#4} \\ & {#5}&\hspace{-17px}{#6} &&\hspace{-20px}{#7} &&\hspace{-20px}{#8} \\ \mtxxxxCont} \newcommand{\mtxxxxCont}[8]{ & {#1}&\hspace{-17px}{#2} &&\hspace{-20px}{#3} &&\hspace{-20px}{#4}\\ & {#5}&\hspace{-17px}{#6} &&\hspace{-20px}{#7} &&\hspace{-20px}{#8} \end{align}\end{matrix}} \newcommand{mtXxxx}[4]{\begin{matrix}{#1}\\{#2}\\{#3}\\{#4}\end{matrix}} \newcommand{cov}[1]{\t{cov}({#1})} \newcommand{Cov}[1]{\t{cov}\left({#1}\right)} \newcommand{var}[1]{\t{var}({#1})} \newcommand{Var}[1]{\t{var}\left({#1}\right)} \newcommand{pnint}[]{\int_{-\infty}^{\infty}} \newcommand{floor}[1]{\left\lfloor {#1} \right\rfloor} \) \( \newcommand{adeg}[1]{\angle{({#1}^{\t{o}})}} \newcommand{Ree}[]{\mathcal{Re}} \newcommand{Im}[]{\mathcal{Im}} \newcommand{deg}[1]{{#1}^{\t{o}}} \newcommand{adegg}[1]{\angle{{#1}^{\t{o}}}} \newcommand{ang}[1]{\angle{\left({#1}\right)}} \newcommand{bkt}[1]{\langle{#1}\rangle} \) \( \newcommand{\hs}[1]{\hspace{#1}} \)

  UNDER CONSTRUCTION

Randomness & Probability:
Key Relations




\[\ds \bb{1}\bb{Laplacian}\ \boxed{p_X(x)=\frac{1}{\sqrt{2\sigma^2}}\exp{-\sqrt{\frac{2}{\sigma^2}}|x|},\ -\infty < x < \infty}\]

\[\ds \bb{2}\bb{Cauchy}\ \boxed{p_X(x)=\frac{1}{\pi(1+x^2)},\ \ -\infty < x < \infty}\]
Arises as the PDF of the ratio of two independent \(N(0,1)\) rv's.
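This can be checked numerically (a sketch; the seed, sample size, and test points are arbitrary choices): the ratio of two independent \(N(0,1)\) samples should follow the standard Cauchy CDF \(F(x)=1/2+\tan^{-1}(x)/\pi\).

```python
import numpy as np

# Empirical check: ratio of two independent N(0,1) rv's is standard Cauchy.
rng = np.random.default_rng(0)
n = 200_000
ratio = rng.standard_normal(n) / rng.standard_normal(n)

# Compare the empirical CDF at a few points against the closed form.
for x in (-1.0, 0.0, 1.0):
    empirical = np.mean(ratio <= x)
    theoretical = 0.5 + np.arctan(x) / np.pi
    print(f"F({x:+.0f}): empirical={empirical:.3f}, theoretical={theoretical:.3f}")
```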

\[\ds \bb{3}\bb{Gamma}\ \boxed{{ p_X(x)=\left\{ \begin{array}{c} \begin{align} &\frac{\lambda^{\alpha}}{\Gamma (\alpha)}x^{\alpha -1}\exp{-\lambda x},\ x\geq 0 \\ &0,\hspace{164px} x<0 \end{align} \end{array} \right.} \Large{|}\ \ns{}\lambda > 0,\ \alpha > 0}\]
\(\ds \bb{P0}\ \Gamma (z+1)=z\Gamma (z) \\ \bb{P1}\ \Gamma (N) = (N-1)! \\ \bb{P2}\ \Gamma (1/2) = \sqrt{\pi} \\ \bb{P3}\ \t{Collapses to exponential for } \alpha=1 \t{:} \\ p_X(x)=\left\{ \begin{array}{c} \begin{align} &\frac{\lambda}{\Gamma (1)}e^{-\lambda x},\ x\geq 0 \\ &0,\hspace{75px} x<0 \end{align} \end{array} \right. \\ \bb{P4}\ \t{Collapses to chi-squared with }N\t{ degrees of freedom for }\alpha = N/2\ \t{and }\lambda = 1/2\t{:}\) \( p_X(x)=\left\{ \begin{array}{c} \begin{align} &\frac{1}{2^{N/2}\Gamma (N/2)}x^{N/2-1}e^{-x/2},\ x\geq 0 \\ &0,\hspace{195px} x<0 \end{align} \end{array} \right. \Large{|}\ns{}\t{ Arises as the PDF of the sum of the squares of }N\t{ independent }N(0,1)\t{ rv's}\\ \) \(\bb{P5} \t{Collapses to Erlang for }\alpha = N \\ p_X(x)=\left\{ \begin{array}{c} \begin{align} &\frac{\lambda^N}{(N-1)!}x^{N-1}e^{-\lambda x},\ x\geq 0 \\ &0,\hspace{148px} x<0 \end{align} \end{array} \right. \Large{|}\ns{}\t{ Arises as the PDF of a sum of }N\t{ independent exponential rv's all with the same }\lambda \)
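The Erlang and chi-squared special cases above can be verified empirically (a sketch; \(\lambda\), \(N\), the seed, and the sample size are arbitrary choices) by comparing sample moments against the Gamma moments \(E[X]=\alpha/\lambda\) and \(\t{var}(X)=\alpha/\lambda^2\).

```python
import numpy as np

rng = np.random.default_rng(1)
lam, N, M = 2.0, 5, 200_000

# Erlang(N, lam): sum of N iid exponentials with rate lam,
# i.e. Gamma with alpha = N.  Mean N/lam, variance N/lam^2.
erlang = rng.exponential(scale=1/lam, size=(M, N)).sum(axis=1)
print(erlang.mean(), N/lam)        # both near 2.5
print(erlang.var(), N/lam**2)      # both near 1.25

# Chi-squared with N dof: sum of squares of N iid N(0,1) rv's,
# i.e. Gamma with alpha = N/2, lam = 1/2.  Mean N, variance 2N.
chi2 = (rng.standard_normal((M, N))**2).sum(axis=1)
print(chi2.mean(), N)              # both near 5
print(chi2.var(), 2*N)             # both near 10
```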

\(\ds \bb{4}\bb{Rayleigh}\ \boxed{p_X(x)= \left\{ \begin{array}{c} \begin{align} &\frac{x}{\sigma^2}\exp{\frac{-x^2}{2\sigma^2}},\ x\geq 0 \\ &0,\hspace{114px} x<0 \end{align} \end{array} \right.}\)
\(\vplup \t{ Arises as the square root of the sum of the squares of two independent }N(0,1)\t{ rv's w/ an appropriate transformation}\)
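A quick empirical check (a sketch; \(\sigma\), the seed, and the sample size are arbitrary choices): \(R=\sqrt{X^2+Y^2}\) with \(X,Y\sim N(0,\sigma^2)\) should match the Rayleigh mean \(\sigma\sqrt{\pi/2}\).

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, M = 1.5, 200_000
x = sigma * rng.standard_normal(M)
y = sigma * rng.standard_normal(M)
r = np.sqrt(x**2 + y**2)           # Rayleigh(sigma) samples

# Rayleigh mean is sigma*sqrt(pi/2); compare against the sample mean.
print(r.mean(), sigma * np.sqrt(np.pi/2))
```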

\(\boxed{p_Y(y)=p_X(g^{-1}(y))\left| \frac{dg^{-1}(y)}{dy}\right|}\)
\(\vplup^{\vplup}\boxed{p_Y(y)=\sum_{i=1}^{M}p_X(g^{-1}_i(y))\left| \frac{dg^{-1}_i(y)}{dy}\right|}\)
\(\bb{5}\ Y=g(X)\ \) [PDF: one-to-one, many-to-one]\(\vplup\)
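The one-to-one formula can be sketched on \(Y=e^X\), \(X\sim N(0,1)\): here \(g^{-1}(y)=\ln y\) and \(|dg^{-1}/dy|=1/y\), giving the lognormal PDF. The grid bounds and resolution below are arbitrary choices; a valid PDF should integrate to \(\approx 1\).

```python
import numpy as np

def p_x(x):
    # N(0,1) PDF
    return np.exp(-x**2 / 2) / np.sqrt(2*np.pi)

# Transformed PDF of Y = exp(X): p_Y(y) = p_X(ln y) * (1/y)  (lognormal)
y = np.linspace(1e-6, 60.0, 400_000)
p_y = p_x(np.log(y)) / y

# Trapezoid rule: the transformed PDF must still integrate to ~1.
integral = np.sum(0.5 * (p_y[1:] + p_y[:-1]) * np.diff(y))
print(integral)
```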


\[\boxed{p_{X,Y}(x,y)=\frac{1}{2\pi\sqrt{1-\rho^2}}\t{exp}\left[-\frac{x^2-2\rho xy+y^2}{2(1-\rho^2)}\right],\ \ \begin{matrix} -\infty < x < \infty \\ -\infty < y < \infty \end{matrix}}\]
\(\bb{6}\bb{Standard\ bivariate\ Gaussian\ PDF}\)
  - \(\vplup \rho\ -\) correlation coefficient
  - \(\rho \in (-1,1)\)
  - \(\rho = 0:\) contours are circular, else, elliptical
  - Uncorrelated rv's imply independent rv's; this holds for any bivariate Gaussian PDF (not for arbitrary PDFs)
  - A linearly transformed bivar. Gaussian r-vector produces another bivar. Gaussian r-vector
  - The contours of constant PDF \((\rho\ \t{fixed})\) are given by the \((x,y)\) values for which \(x^2-2\rho xy+y^2=r^2\), \(r=\t{const.}\), where the PDF takes on the fixed value
\(\hspace{360px} \vplup p_{X,Y}(x,y)=\lfrac{1}{2\pi\sqrt{1-\rho^2}}\t{exp}\left[-\lfrac{r^2}{2(1-\rho^2)}\right]\)
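The role of \(\rho\) can be checked by sampling (a sketch; \(\rho\), the seed, and the sample size are arbitrary choices): drawing \(X=Z_1\), \(Y=\rho Z_1+\sqrt{1-\rho^2}\,Z_2\) from independent \(Z_1,Z_2\sim N(0,1)\) yields the standard bivariate Gaussian with the chosen correlation.

```python
import numpy as np

rng = np.random.default_rng(3)
rho, M = 0.7, 300_000
z1 = rng.standard_normal(M)
z2 = rng.standard_normal(M)

# Construct correlated pair with the target correlation coefficient rho.
x = z1
y = rho*z1 + np.sqrt(1 - rho**2)*z2

print(np.corrcoef(x, y)[0, 1])   # close to 0.7
```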


\(\ds \boxed{\t{cov}(X,Y)\equiv E[(X-E[X])(Y-E[Y])]}\)
\(\ds \vplup \boxed{\t{cov}(X,Y) = E_{X,Y}[XY]-E_X[X]E_Y[Y]}\) \(\ds \boxed{\var{X+Y}=\var{X}+\var{Y}+2\cov{X,Y}}\)
\(\ds \boxed{E_{X,Y}[XY]=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}xyp_{X,Y}(x,y)dxdy}\)

[7][Joint continuous rv covariance]
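These identities hold exactly for plug-in (ddof = 0) sample moments, which gives a quick sanity check (a sketch; the data and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(1000)
y = rng.standard_normal(1000)

# var(X+Y) = var(X) + var(Y) + 2 cov(X,Y) holds exactly for the
# plug-in estimates (np.var defaults to ddof=0; match it in np.cov).
lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2*np.cov(x, y, ddof=0)[0, 1]
print(np.isclose(lhs, rhs))
```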

\[\boxed{\begin{align} & E_{X,Y}[aX+bY]=aE_X[X]+bE_Y[Y] && E_{X,Y}[XY]=E_X[X]E_Y[Y] \\ & E_{X,Y}[ag(X)+bh(Y)]=aE_X[g(X)]+bE_Y[h(Y)] && E_{X,Y}[g(X)h(Y)]=E_X[g(X)]E_Y[h(Y)] \\ & E_X[E_{Y|X}(y|x)]=E_Y[Y] && \end{align}}\]
[8][Expected value properties](joint, continuous rv's)
asm
Uncorrelated & independent rv's in RHS eq's \(\vplup\)

\[\ds \boxed{\rho_{X,Y} = \frac{\t{cov}(X,Y)}{\sqrt{\t{var}(X)\t{var}(Y)}}}\]
[9][Correlation coefficient]

\[\ds \boxed{E[X^2]=\sigma^2 + \mu^2}\]
[10][Expected value properties] (continuous rv's)
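Like the covariance identity above, \(E[X^2]=\sigma^2+\mu^2\) holds exactly for plug-in (ddof = 0) sample moments (a sketch; the data and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)
x = 3.0 + 2.0*rng.standard_normal(1000)   # arbitrary mu=3, sigma=2

# mean(x^2) = var(x) + mean(x)^2 holds exactly for ddof=0 moments.
print(np.isclose(np.mean(x**2), np.var(x) + np.mean(x)**2))
```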

\[\ds \boxed{\begin{align} p_{Y|X}(y|x) &= \frac{\vphantom{A^2}p_{X,Y}(x,y)}{p_X(x)} \\ \end{align} }\]
[11][Conditional rv properties] (continuous rv's)

\[\ds \boxed{\begin{align} & p_{X,Y}(x,y) = \frac{\partial^2}{\partial x\partial y}F_{X,Y}(x,y) \\ & p_X(x) = \int_{-\infty}^{\infty}p_{X,Y}(x,y)dy;\quad p_Y(y)=\int_{-\infty}^{\infty}p_{X,Y}(x,y)dx \\ \end{align} }\]
[12][Joint PDF properties] (continuous rv's)
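The marginalization property can be sketched numerically on the standard bivariate Gaussian of \(\bb{6}\) (\(\rho=0.5\), grid bounds, and resolution are arbitrary choices): integrating the joint PDF over \(y\) should recover the \(N(0,1)\) marginal.

```python
import numpy as np

rho = 0.5
x = np.linspace(-6, 6, 601)
y = np.linspace(-8, 8, 1601)
X, Y = np.meshgrid(x, y, indexing="ij")

# Standard bivariate Gaussian PDF
p_xy = np.exp(-(X**2 - 2*rho*X*Y + Y**2) / (2*(1 - rho**2))) \
       / (2*np.pi*np.sqrt(1 - rho**2))

# Riemann-sum marginalization over y; compare to the N(0,1) PDF.
dy = y[1] - y[0]
p_x = p_xy.sum(axis=1) * dy
p_x_true = np.exp(-x**2 / 2) / np.sqrt(2*np.pi)
print(np.max(np.abs(p_x - p_x_true)))   # small discretization error
```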

\[\boxed{p_{X,Y}[x_i,y_j]=F_{X,Y}(x_i^+,y_j^+)-F_{X,Y}(x_i^+,y_j^-)-F_{X,Y}(x_i^-,y_j^+)+F_{X,Y}(x_i^-,y_j^-)}\]
[13][PMF-CDF relation]

\[\boxed{p_Y(y)=p_X(g^{-1}(y))\left|\frac{dg^{-1}(y)}{dy}\right|}\]
[14][Single variable PDF transformation]

\(\ds \boxed{p_{W,Z}(w,z)=p_{X,Y}\left(\mathrm{\bf{G^{-1}}}\left[\begin{matrix} w \\ z \end{matrix} \right] \right)|\t{det}(\mathrm{\bf{G^{-1}}})|}\)
\(\ds \boxed{p_{W,Z}(w,z)=p_{X,Y}(g^{-1}(w,z),h^{-1}(w,z))\left|\t{det}\left(\frac{\partial (x,y)}{\partial (w,z)}\right)\right|}\)
[15][Multivariable PDF transformation: linear & nonlinear]
  - \(\frac{\partial (x,y)}{\partial (w,z)} = \left[ \begin{matrix} \begin{align} \partial x/\partial w &\ \ \ \partial x/\partial z \\ \partial y/\partial w &\ \ \ \partial y/\partial z \end{align} \end{matrix} \right]\)
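A standard instance of the nonlinear formula is the polar transform \((W,Z)=(R,\Theta)\) of independent \(X,Y\sim N(0,1)\): the Jacobian determinant is \(r\), so \(p_{R,\Theta}(r,\theta)=\frac{1}{2\pi}r\,e^{-r^2/2}\). A numerical sketch (grid bounds and resolution are arbitrary choices) confirms it integrates to \(\approx 1\):

```python
import numpy as np

# p_{R,Theta}(r, theta) = p_{X,Y}(r cos(theta), r sin(theta)) * |det J| with
# |det d(x,y)/d(r,theta)| = r, for independent X, Y ~ N(0,1).
r = np.linspace(0, 10, 2001)
theta = np.linspace(0, 2*np.pi, 720, endpoint=False)
R, T = np.meshgrid(r, theta, indexing="ij")
p = np.exp(-R**2 / 2) * R / (2*np.pi)

# Riemann sum over the (r, theta) grid should be ~1.
dr, dt = r[1] - r[0], theta[1] - theta[0]
total = p.sum() * dr * dt
print(total)
```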

\(\ds \boxed{\Var{\sum_{i=1}^{N}X_i}=\sum_{i=1}^{N}\var{X_i}}\)
[16][Uncorrelated rv properties]
 \(\vplup\)
asm
Uncorrelated rv's

\(\ds \boxed{\bn{C}_X=\left[\mtxxxx {\var{X_1}}{\cov{X_1,X_2}}{...}{\cov{X_1,X_N}} {\cov{X_2,X_1}}{\var{X_2}}{...}{\cov{X_2,X_N}} {\vdots}{\vdots}{\ddots}{\vdots} {\cov{X_N,X_1}}{\cov{X_N,X_2}}{...}{\cov{X_N,X_N}} \right]}\)

\(\ds \boxed{\bn{C}_{X,Y}=\mtxx{\var{X}}{\cov{X,Y}}{\cov{Y,X}}{\var{Y}}}\)\(\ds\hspace{20px}\boxed{\bn{C}_{X,Y}=\mtxx{\sigma_X^2}{\rho_{X,Y}\sigma_X\sigma_Y}{\rho_{Y,X}\sigma_Y\sigma_X}{\sigma_Y^2}}\)
\(\ds\boxed{\widehat{\bn{C}_X}=\frac{1}{M}\sum_{m=1}^{M}\lrpar{\bn{x}_{m}-\widehat{E_{\bn{X}}\bb{X}}}\lrpar{\bn{x}_m-\widehat{E_{\bn{X}}\bb{X}}}^T}\)
[17][Covariance matrix] (\(N\) rv's, 2 rv's)
  \(\vplup\)- Diagonal \(\bn{C}\rightarrow\) uncorrelated rv's
  - \(\bb{17.4}\): \(\widehat{E_{\bn{X}}\bb{X}}=\frac{1}{M}\sum_{m=1}^{M}\bn{x}_m\)
  - \(\bb{17.4}\) useful in numerical approximations
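The boxed estimator \(\widehat{\bn{C}_X}\) is the plug-in sample covariance, so it should agree exactly with `np.cov` at `ddof=0` (a sketch; the data, seed, and dimensions are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(6)
M, N = 500, 3
x = rng.standard_normal((M, N))      # M observations of an N-vector

# Plug-in estimator: (1/M) * sum_m (x_m - mu_hat)(x_m - mu_hat)^T
mu_hat = x.mean(axis=0)
C_hat = (x - mu_hat).T @ (x - mu_hat) / M

# np.cov expects variables in rows, hence the transpose.
print(np.allclose(C_hat, np.cov(x.T, ddof=0)))
```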

\(\ds \boxed{\vph{\frac{a}{b}}p_{X[n_1+n_0],X[n_2+n_0],...,X[n_N+n_0]}=p_{X[n_1],X[n_2],...,X[n_N]}}\)
[18][Stationarity property]
  - all IID random processes are stationary; consequently their expected values do not depend on \(n\)

\(\ds \boxed{ p_{\bn{X}}(\bn{x})=\frac{1}{(2\pi)^{N/2}\t{det}^{1/2}(\bn{C})}\t{exp}\lrbra{-\frac{1}{2}\lrpar{\bn{x}-\bn{\mu}}^T\bn{C}^{-1}\lrpar{\bn{x}-\bn{\mu}}} }\)
\(\boxed{\bn{\mu}=[\mu_1\ \mu_2\ ...\ \mu_N]^T=E_{\bn{X}}[\bn{X}]=\lrbra{E_{X_1}[X_1]\ E_{X_2}[X_2]\ \cdots\ E_{X_N}[X_N]}^T}\)
\(\dsup\boxed{\bn{X}\sim\mathcal{N}(\bn{\mu},\bn{C})}\)
[19][Multivariate Gaussian PDF]
  - Diagonal \(\bn{C} \rightarrow \bn{X}\) are uncorrelated and independent. ⚠ Holds only for a multivariate Gaussian PDF; Gaussian marginal PDFs alone do not suffice
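\(\bn{X}\sim\mathcal{N}(\bn{\mu},\bn{C})\) can be sampled via the Cholesky factor \(\bn{C}=\bn{L}\bn{L}^T\), \(\bn{X}=\bn{\mu}+\bn{L}\bn{Z}\) with \(\bn{Z}\sim\mathcal{N}(\bn{0},\bn{I})\). A sketch (the chosen \(\bn{\mu}\), \(\bn{C}\), seed, and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
mu = np.array([1.0, -2.0])
C = np.array([[2.0, 0.6],
              [0.6, 1.0]])

# X = mu + L Z, Z ~ N(0, I), with C = L L^T (Cholesky factorization).
L = np.linalg.cholesky(C)
M = 300_000
z = rng.standard_normal((M, 2))
x = mu + z @ L.T

# Empirical mean and covariance should approach mu and C.
print(x.mean(axis=0))
print(np.cov(x.T, ddof=0))
```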

\(\ds \boxed{H(f)=\mathcal{H}(z)\vert_{z=\t{exp}(j2\pi f)}}\)
[20][\(H(f)\)-\(\mathcal{H(z)}\) relation]

\(\ds\boxed{\hat{S}[n_0]=\sum_{k=-\infty}^{\infty}h[k]X[n_0-k]}\)\(\ds \quad\boxed{H_{\t{opt}}(f)=\frac{P_S(f)}{P_S(f)+P_W(f)}}\)\(\quad\boxed{h_{\t{opt}}[n]=\mathcal{F}^{-1}(H_{\t{opt}}(f))}\)
\(\ds\boxed{\t{mse}_{\t{min}}=r_S[0]-\sum_{k=-\infty}^{\infty}h_{\t{opt}}[k]r_S[k]}\)\(\ds\quad\boxed{\t{mse}_{\t{min}}=\int_{-1/2}^{1/2}\frac{P_S(f)}{1+\rho(f)}df}\)
[21][Wiener smoothing filter]: Estimator, optimal frequency response, optimal impulse response, min. mean square error; \(\rho(f)=P_S(f)/P_W(f)\)
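A minimal sketch of the simplest special case (an assumption not treated above): white signal and white noise, so \(P_S(f)=\sigma_S^2\), \(P_W(f)=\sigma_W^2\) and \(H_{\t{opt}}(f)\) reduces to the constant gain \(\sigma_S^2/(\sigma_S^2+\sigma_W^2)\), i.e. \(h_{\t{opt}}[n]\) is a single scaled impulse. All parameters are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(8)
sig_s2, sig_w2, M = 1.0, 1.0, 200_000

# Observed X[n] = S[n] + W[n], both white and independent.
s = np.sqrt(sig_s2) * rng.standard_normal(M)
x = s + np.sqrt(sig_w2) * rng.standard_normal(M)

# H_opt(f) = P_S/(P_S + P_W) is constant here, so the Wiener smoother
# is just a pointwise gain on the observations.
gain = sig_s2 / (sig_s2 + sig_w2)
s_hat = gain * x

mse_raw = np.mean((x - s)**2)       # using X directly as the estimate
mse_opt = np.mean((s_hat - s)**2)   # Wiener estimate; = sig_s2*sig_w2/(sig_s2+sig_w2)
print(mse_raw, mse_opt)
```

With equal variances the optimal gain is 1/2 and the minimum MSE is half the raw MSE, matching the \(\t{mse}_{\t{min}}\) expression with \(\rho(f)=1\).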
[Figures: Wiener smoothing filter; Wiener smoothing example]


Dragon Notes,   Est. 2018

By OverLordGoldDragon