## § Extended Euclidean algorithm

#### § Original definition

I feel like I only understand the extended Euclidean algorithm algebraically, so I've been trying to understand it from other perspectives. Note that we want to find $gcd(p, p')$. WLOG, we assume that $p > p'$. We now find a quotient $d$ such that $p = d p' + r$ where $0 \leq r < p'$. At this stage, we repeat the algorithm to find $gcd(p', r)$. We stop with the base case $gcd(p', 0) = p'$.
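A minimal Python sketch of this recursion (the function name `gcd` and the test values are my own choices, not from the notes):

```python
def gcd(p, p_prime):
    """Euclidean algorithm: gcd(p, p') = gcd(p', r), where p = d*p' + r."""
    if p_prime == 0:
        return p  # base case: gcd(p', 0) = p'
    d, r = divmod(p, p_prime)  # p = d*p' + r with 0 <= r < p'
    return gcd(p_prime, r)

print(gcd(252, 105))  # → 21
```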

#### § My choice of notation

I dislike the differentiation that occurs between $p$, $p'$, and $r$ in the notation. I will notate this as $gcd(p_0, p_1)$. WLOG, assume that $p_0 > p_1$. We now find a quotient $d_1$ such that $p_0 = p_1 d_1 + p_2$, where $0 \leq p_2 < p_1$. At this stage, we recurse to find $gcd(p_1, p_2)$. We stop with the base case $gcd(p_{n-1}, p_n) = p_n$, which holds iff $p_{n-1} = d_n p_n + 0$. That is, when we reach $p_{n+1} = 0$, we stop the process, taking the gcd to be $p_n$.
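In this indexed notation, the algorithm just extends the sequence $p_0, p_1, p_2, \ldots$ until it hits zero. A sketch (the helper name `remainder_sequence` is mine):

```python
def remainder_sequence(p0, p1):
    """Build p_0, p_1, ..., p_n, p_{n+1} = 0 via p_{k+1} = p_{k-1} mod p_k."""
    ps = [p0, p1]
    while ps[-1] != 0:
        ps.append(ps[-2] % ps[-1])  # p_{k-1} = d_k * p_k + p_{k+1}
    return ps  # the gcd is ps[-2], the last nonzero term

print(remainder_sequence(252, 105))  # → [252, 105, 42, 21, 0]
```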

#### § Matrices

We can write the equation $p_0 = p_1 d_1 + p_2$ as the equation $p_0 = \begin{bmatrix} d_1 & 1 \end{bmatrix} \begin{bmatrix} p_1 \\ p_2 \end{bmatrix}$. But now, we have lost some state: we used to have both $p_1$ and $p_2$, but we are left with only $p_0$. We should always strive to have "matching" inputs and outputs so we can iterate maps. This leads us to the natural generalization:
$\begin{bmatrix} p_0 \\ p_1 \end{bmatrix} = \begin{bmatrix} d_1 & 1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} p_1 \\ p_2 \end{bmatrix}$
So at the first step, we are trying to find a $(d_1, p_2)$ such that the matrix equation holds, and $0 \leq p_2 < p_1$.
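We can check this matrix form numerically. A sketch with a hand-rolled 2x2 multiply (the helper `matmul2` is my own name):

```python
def matmul2(A, v):
    """Multiply a 2x2 matrix A by a 2-vector v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

p0, p1 = 252, 105
d1, p2 = divmod(p0, p1)          # p0 = d1*p1 + p2, 0 <= p2 < p1
M = [[d1, 1], [1, 0]]
assert matmul2(M, [p1, p2]) == [p0, p1]  # [p0; p1] = M @ [p1; p2]
```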

#### § Extended Euclidean division

When we finish, we have $gcd(p_{n-1}, p_n) = p_n$. I'll call the GCD $g$ for clarity. We can now write the GCD as a linear combination of the inputs: $g = p_n = 0 \cdot p_{n-1} + 1 \cdot p_n$. We now want to backtrack to find out how to write $g = \omega_{n-2} p_{n-2} + \omega_{n-1} p_{n-1}$. And so on, all the way to $g = \omega_0 p_0 + \omega_1 p_1$.
• We can begin with the general case:
$g = \begin{bmatrix} \omega' & \omega'' \end{bmatrix} \begin{bmatrix} p' \\ p'' \end{bmatrix}$
• We have the relationship:
$\begin{bmatrix} p \\ p' \end{bmatrix} = \begin{bmatrix} d' & 1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} p' \\ p'' \end{bmatrix}$
• We can invert the matrix to get:
$\begin{bmatrix} p' \\ p'' \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ 1 & - d' \end{bmatrix} \begin{bmatrix} p \\ p' \end{bmatrix}$
• Now combine with the previous equation to get:
$\begin{aligned} g &= \begin{bmatrix} \omega' & \omega'' \end{bmatrix} \begin{bmatrix} p' \\ p'' \end{bmatrix} \\ &= \begin{bmatrix} \omega' & \omega'' \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & - d' \end{bmatrix} \begin{bmatrix} p \\ p' \end{bmatrix} \\ &= \begin{bmatrix} \omega'' & \omega' - d' \omega'' \end{bmatrix} \begin{bmatrix} p \\ p' \end{bmatrix} \end{aligned}$
• Reading off the coefficients, the back-substitution update is $(\omega', \omega'') \mapsto (\omega'', \omega' - d' \omega'')$.
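The recursion above translates directly into code: recurse down to the base case $g = 1 \cdot p_{n} + 0 \cdot p_{n+1}$, then apply the update $(\omega', \omega'') \mapsto (\omega'', \omega' - d\,\omega'')$ on the way back up. A sketch (the function name `extended_gcd` is my own):

```python
def extended_gcd(p0, p1):
    """Return (g, w0, w1) with g = gcd(p0, p1) = w0*p0 + w1*p1."""
    if p1 == 0:
        return p0, 1, 0                # base case: g = 1*p0 + 0*p1
    d, p2 = divmod(p0, p1)             # p0 = d*p1 + p2
    g, w1, w2 = extended_gcd(p1, p2)   # g = w1*p1 + w2*p2
    return g, w2, w1 - d * w2          # g = w2*p0 + (w1 - d*w2)*p1

g, w0, w1 = extended_gcd(252, 105)
print(g, w0, w1)                       # → 21 -2 5
assert w0 * 252 + w1 * 105 == g
```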

#### § Fractions

• GCD factorization equation: $p/p' = (d' p' + p'')/p' = d' + p''/p'$.
• Bezout equation: $\omega' p' + \omega'' p'' = g$.
• Bezout equation as fractions (divide through by $p'$): $\omega' + \omega'' p''/p' = g/p'$.
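These identities can be checked with exact rational arithmetic. A sketch using a worked example ($p' = 105$, $p'' = 42$, $g = 21$, with Bezout coefficients $\omega' = 1$, $\omega'' = -2$, chosen by me for illustration):

```python
from fractions import Fraction

pp, ppp, g = 105, 42, 21             # p', p'', and their gcd g
wp, wpp = 1, -2                      # Bezout: 1*105 + (-2)*42 = 21
assert wp * pp + wpp * ppp == g

# GCD factorization equation: p/p' = d' + p''/p', here with p = 252, d' = 2
assert Fraction(252, pp) == 2 + Fraction(ppp, pp)

# Bezout equation divided by p': w' + w'' * p''/p' = g/p'
assert wp + wpp * Fraction(ppp, pp) == Fraction(g, pp)
```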