$H_n \equiv \sum_{i=1}^n 1/i$. Then we show that:
$\begin{aligned}
\left| H_{n+1} - H_n \right|
&= \left| \sum_{i=1}^{n+1} \frac{1}{i} - \sum_{i=1}^{n} \frac{1}{i} \right| \\
&= \frac{1}{n+1} \xrightarrow{n \rightarrow \infty} 0
\end{aligned}$

while on the other hand,
$\begin{aligned}
\left| H_{2n} - H_n \right|
&= \sum_{i=n+1}^{2n} \frac{1}{i}
= \frac{1}{n+1} + \frac{1}{n+2} + \dots + \frac{1}{2n} \\
&\geq \frac{1}{2n} + \frac{1}{2n} + \dots + \frac{1}{2n}
= \frac{n}{2n} = \frac{1}{2} \neq 0 \text{ as } n \rightarrow \infty
\end{aligned}$
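A quick numerical sanity check of both limits, as a Python sketch (the `harmonic` helper is my own name for the partial sum): the successive gap shrinks to $0$, while the gap between $H_{2n}$ and $H_n$ stays above $1/2$.

```python
import math

def harmonic(n: int) -> float:
    """Partial sum H_n = 1/1 + 1/2 + ... + 1/n, summed accurately."""
    return math.fsum(1.0 / i for i in range(1, n + 1))

for n in (10, 100, 1000):
    succ = harmonic(n + 1) - harmonic(n)  # equals 1/(n+1): shrinks to 0
    gap = harmonic(2 * n) - harmonic(n)   # stays strictly above 1/2
    print(f"n={n:4d}  H_(n+1)-H_n={succ:.6f}  H_2n-H_n={gap:.6f}")
```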

#### § Memorable solution: logarithm

We can choose $a_n = \log(n)$ instead, which yields a much simpler
calculation:
$\begin{aligned}
a_{n+1} - a_n &= \log(n+1) - \log(n) \\
&= \log((n+1)/n) \\
&= \log(1 + 1/n) \xrightarrow{n \rightarrow \infty} \log(1) = 0
\end{aligned}$

while on the other hand,
$\begin{aligned}
a_{2n} - a_n
&= \log(2n) - \log(n) \\
&= \log(2) + \log(n) - \log(n) \\
&= \log 2 \neq 0
\end{aligned}$
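The same sanity check for $a_n = \log(n)$, again as a Python sketch: the successive gap $\log(1 + 1/n)$ vanishes, while the doubling gap is exactly $\log 2$ at every $n$.

```python
import math

for n in (10, 100, 1000):
    succ = math.log(n + 1) - math.log(n)    # log(1 + 1/n): shrinks to 0
    double = math.log(2 * n) - math.log(n)  # exactly log(2) for every n
    print(f"n={n:4d}  succ={succ:.6f}  double={double:.6f}")
```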

I find this far cleaner conceptually, since it's "obvious" to everyone
that $a_n = \log(n)$ diverges, while the corresponding fact for $H_n$
is far less apparent. We also get straightforward equalities everywhere,
instead of inequalities.
I still feel that I don't grok precisely what fails here: my intuition
insists that the local condition *ought* to imply the Cauchy condition:
if $a_n$ tells $a_{n+1}$ to not be too far, and $a_{n+1}$ tells $a_{n+2}$,
surely this *must* be transitive?
I have taught myself to not trust my instincts on analysis, which is a
shitty solution :) I hope to internalize this someday.
*EDIT:* After ruminating a bit, I feel I now understand precisely
what's happening.
The Cauchy convergence criterion allows us to drop a finite number
of terms, and then capture *everything after that point* in a ball
of radius $\epsilon$. As $\epsilon$ shrinks, *all* the terms in the
sequence are "squeezed together".
In the $a_{n+1} - a_n$ case, only successive terms must maintain
an $\epsilon$ distance. But as the $\log$ example shows, you can steadily
plod along, keeping $\epsilon$ ball next to $\epsilon$ ball, so that after
$n$ steps the total distance travelled is on the order of
$n \cdot \epsilon$,

which, as $n \rightarrow \infty$ and $\epsilon \rightarrow 0$ together, is an
indeterminate form: its behaviour depends on how fast $n$ grows as $\epsilon$
shrinks.
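To make the plodding concrete, here's a sketch with $a_n = \log(n)$: fix a step bound $\epsilon$, note that every step beyond a finite threshold is smaller than $\epsilon$, and yet the accumulated drift keeps growing without bound (the threshold formula below is my own back-of-envelope bound).

```python
import math

eps = 0.01  # bound on the size of each successive step
# For a_n = log(n), the step log(n+1) - log(n) = log(1 + 1/n) falls
# below eps once n exceeds 1 / (e^eps - 1), a finite threshold.
start = math.ceil(1.0 / (math.exp(eps) - 1.0))

for stop in (10 * start, 100 * start, 1000 * start):
    steps = [math.log(k + 1) - math.log(k) for k in range(start, stop)]
    assert all(s < eps for s in steps)  # every individual step is tiny
    drift = math.log(stop) - math.log(start)  # total distance travelled
    print(f"{stop - start:6d} tiny steps, total drift {drift:.3f}")
```

Each pass uses more steps of the same bounded size, so the total drift ($\log 10$, then $\log 100$, then $\log 1000$) grows without bound even though no single step ever exceeds $\epsilon$.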