# Dragon Notes

# Nonlinear Dynamics & Chaos: Approximation Methods

Euler's Method

Each scheme advances the numerical solution of $$\dot{x} = f(x)$$ by one time step $$\Delta t$$; higher-order schemes trade extra evaluations of $$f$$ for faster error decay.

Regular (error $$\propto \Delta t$$):
$$x_{n+1} = x_n + f(x_n)\,\Delta t$$

Improved (error $$\propto (\Delta t)^2$$):
$$\widetilde{x}_{n+1} = x_n + f(x_n)\,\Delta t$$
$$x_{n+1} = x_n + \tfrac{1}{2}\left[f(x_n) + f(\widetilde{x}_{n+1})\right]\Delta t$$

Fourth-order Runge-Kutta (error $$\propto (\Delta t)^4$$):
$$k_1 = f(x_n)\,\Delta t$$
$$k_2 = f\!\left(x_n + \tfrac{1}{2}k_1\right)\Delta t$$
$$k_3 = f\!\left(x_n + \tfrac{1}{2}k_2\right)\Delta t$$
$$k_4 = f(x_n + k_3)\,\Delta t$$
$$x_{n+1} = x_n + \tfrac{1}{6}(k_1 + 2k_2 + 2k_3 + k_4)$$
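The three update rules above translate directly into code. A minimal Python sketch (the test system $$\dot{x} = -x$$ with exact solution $$e^{-t}$$ is an illustrative assumption, not from these notes):

```python
import math

# One step of each scheme for the 1-D system  x' = f(x)  with step size dt.

def euler_step(f, x, dt):
    # Regular Euler: follow the local slope, global error O(dt)
    return x + f(x) * dt

def improved_euler_step(f, x, dt):
    # Improved Euler: take a trial step, then average the two slopes,
    # global error O(dt^2)
    x_trial = x + f(x) * dt
    return x + 0.5 * (f(x) + f(x_trial)) * dt

def rk4_step(f, x, dt):
    # Fourth-order Runge-Kutta: weighted average of four slope samples,
    # global error O(dt^4)
    k1 = f(x) * dt
    k2 = f(x + 0.5 * k1) * dt
    k3 = f(x + 0.5 * k2) * dt
    k4 = f(x + k3) * dt
    return x + (k1 + 2 * k2 + 2 * k3 + k4) / 6

def integrate(step, f, x0, dt, n_steps):
    # Apply one of the schemes n_steps times starting from x0.
    x = x0
    for _ in range(n_steps):
        x = step(f, x, dt)
    return x

# Sanity check on x' = -x, x(0) = 1, whose exact solution is e^(-t).
f = lambda x: -x
exact = math.exp(-1.0)  # value at t = 1
for step in (euler_step, improved_euler_step, rk4_step):
    x1 = integrate(step, f, 1.0, 0.1, 10)
    print(f"{step.__name__}: error = {abs(x1 - exact):.2e}")
```

Running this shows the promised ordering: Euler's error is largest, improved Euler's is roughly two orders of magnitude smaller, and RK4's is smaller still at the same step size.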

Linearization [2D]

A 2D nonlinear system
$$\dot{x} = f(x,y), \qquad \dot{y} = g(x,y)$$
with a fixed point $$(x^*,y^*)$$ can be approximated by the linearized system
$$\begin{pmatrix} \dot{u} \\ \dot{v} \end{pmatrix} = \begin{pmatrix} \partial f/\partial x & \partial f/\partial y \\ \partial g/\partial x & \partial g/\partial y \end{pmatrix}_{(x^*,\,y^*)} \begin{pmatrix} u \\ v \end{pmatrix}$$

where $$u=x-x^*$$ and $$v=y-y^*$$ are the components of a small disturbance from the fixed point, and the partial derivatives are evaluated at the fixed point $$(x^*,y^*)$$; the resulting matrix is called the Jacobian matrix at the fixed point. See the illustrative example in $$\mathrm{[Pr2]}$$.
Assumption: the fixed point is a saddle, node, or spiral. For these types the linearization gives a qualitatively correct picture of the phase portrait near the fixed point. Centers, degenerate nodes, stars, and non-isolated fixed points are much more delicate, and can be altered by the neglected small nonlinear terms.
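The linearization recipe can be carried out numerically: approximate the Jacobian by finite differences, then read off the fixed-point type from its trace and determinant (saddle if $$\det J < 0$$; node vs. spiral by the sign of $$\tau^2 - 4\Delta$$). A minimal sketch, where the competition-type example system and its fixed point $$(1,1)$$ are illustrative assumptions, not from these notes:

```python
def jacobian(f, g, x, y, h=1e-6):
    # Central-difference approximation of the Jacobian of (f, g) at (x, y).
    return [[(f(x + h, y) - f(x - h, y)) / (2 * h),
             (f(x, y + h) - f(x, y - h)) / (2 * h)],
            [(g(x + h, y) - g(x - h, y)) / (2 * h),
             (g(x, y + h) - g(x, y - h)) / (2 * h)]]

def classify(J, tol=1e-9):
    # Classify a fixed point from the trace and determinant of its Jacobian.
    # Borderline cases (centers, stars/degenerate nodes, non-isolated fixed
    # points) are flagged: the neglected nonlinear terms can alter them.
    tr = J[0][0] + J[1][1]
    det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
    if det < -tol:
        return "saddle"
    if abs(det) <= tol:
        return "non-isolated fixed points (borderline)"
    if abs(tr) <= tol:
        return "center (borderline)"
    stability = "stable" if tr < 0 else "unstable"
    disc = tr * tr - 4 * det
    if disc > tol:
        return f"{stability} node"
    if disc < -tol:
        return f"{stability} spiral"
    return f"{stability} star/degenerate node (borderline)"

# Hypothetical example (not from the notes): a competition-type system
#   x' = x(3 - x - 2y),  y' = y(2 - x - y),  with a fixed point at (1, 1).
f = lambda x, y: x * (3 - x - 2 * y)
g = lambda x, y: y * (2 - x - y)
J = jacobian(f, g, 1.0, 1.0)   # analytically [[-1, -2], [-1, -1]]
print(classify(J))             # det = -1 < 0, so this prints "saddle"
```

Since $$\det J = -1 < 0$$ at $$(1,1)$$, the classification is a saddle, one of the robust types for which the linearized picture is trustworthy.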