Following the seminal work of Nesterov, accelerated optimization methods
(sometimes referred to as momentum methods) have been used to powerfully
boost the performance of first-order, gradient-based parameter
estimation in scenarios where second-order optimization strategies are
either inapplicable or impractical. Not only does accelerated gradient
descent converge considerably faster than traditional gradient descent,
but it performs a more robust local search of the parameter space by
initially overshooting and then oscillating back as it settles into a
final configuration, thereby selecting only local minimizers with an
attraction basin large enough to accommodate the initial overshoot (a
prototypical update scheme is recalled below). This
behavior has made accelerated search methods particularly popular within
the machine learning community where stochastic variants have been
proposed as well. So far, however, accelerated optimization methods
have been applied only to searches over finite dimensional parameter
spaces. We show how
a variational framework for these finite dimensional methods (recently
formulated by Wibisono, Wilson, and Jordan) can be extended to the
infinite dimensional setting and, in particular, to the manifold of
planar curves in order to yield a new class of accelerated geometric,
PDE-based active contours.
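
The overshoot-and-oscillation behavior described above can be made
concrete with one standard finite dimensional instance, Nesterov's
accelerated gradient method (recalled here purely for illustration; the
step size $s>0$ and the weight $k/(k+3)$ follow a common presentation of
the scheme rather than anything specific to this work):
\[
x_{k+1} = y_k - s\,\nabla f(y_k), \qquad
y_{k+1} = x_{k+1} + \frac{k}{k+3}\,\bigl(x_{k+1} - x_k\bigr),
\]
where the momentum term $\frac{k}{k+3}(x_{k+1}-x_k)$ carries the iterate
past the current descent step, producing the initial overshoot and the
subsequent oscillation into a sufficiently large attraction basin.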
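
For reference, the finite dimensional variational framework cited above
is built on the Bregman Lagrangian of Wibisono, Wilson, and Jordan; in
their notation (the weighting functions $\alpha_t$, $\beta_t$,
$\gamma_t$, the distance-generating function $h$, and the Bregman
divergence $D_h$ are theirs, not constructions introduced here) it reads
\[
\mathcal{L}(x,\dot{x},t) = e^{\alpha_t + \gamma_t}
\Bigl( D_h\bigl(x + e^{-\alpha_t}\dot{x},\, x\bigr) - e^{\beta_t} f(x) \Bigr),
\qquad
D_h(y,x) = h(y) - h(x) - \bigl\langle \nabla h(x),\, y - x \bigr\rangle,
\]
whose Euler-Lagrange equations, under their ideal scaling conditions
$\dot{\beta}_t \le e^{\alpha_t}$ and $\dot{\gamma}_t = e^{\alpha_t}$,
generate a family of accelerated gradient flows; the extension announced
above carries this construction to the infinite dimensional setting of
planar curves.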