Arnold diffusion

In applied mathematics, Arnold diffusion is the phenomenon of instability in nearly integrable Hamiltonian systems. It is named after Vladimir Arnold, who published the first result on the subject in 1964.[1][2] More precisely, Arnold diffusion refers to results asserting the existence of solutions of nearly integrable Hamiltonian systems whose action variables undergo a significant change, of order one independent of the size of the perturbation.
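In action-angle coordinates, such a system can be written schematically in the form below (a standard formulation; the precise smoothness and non-degeneracy hypotheses vary between results), where H_0 is the integrable part, εH_1 is the perturbation, and the actions I are conserved when ε = 0:

  H_\varepsilon(I, \theta) = H_0(I) + \varepsilon\, H_1(I, \theta),
      \qquad I \in \mathbb{R}^N, \quad \theta \in \mathbb{T}^N, \quad 0 < \varepsilon \ll 1.

Arnold diffusion then means the existence of orbits and (typically very long) times T with

  \lVert I(T) - I(0) \rVert \ge c,

for a constant c > 0 that does not shrink as ε tends to zero.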

Arnold diffusion describes the slow drift of trajectories through the portion of phase space that is not confined by any constraints, i.e. not bounded by the invariant Lagrangian tori arising from constants of motion. It occurs in systems with more than N = 2 degrees of freedom, because the N-dimensional invariant tori no longer separate the (2N − 1)-dimensional constant-energy hypersurface. Thus, an arbitrarily small perturbation may cause some trajectories to wander pseudo-randomly through the whole portion of phase space left by the destroyed tori.
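A commonly quoted concrete example (stated here up to notational variations) is the Hamiltonian of Arnold's 1964 paper,[1] which couples a pendulum and a rotator through a small, time-periodic perturbation governed by two parameters ε and μ:

  H(I_1, I_2, \theta_1, \theta_2, t) = \tfrac{1}{2}\bigl(I_1^2 + I_2^2\bigr)
      + \varepsilon\,(\cos\theta_1 - 1)\bigl(1 + \mu\,(\sin\theta_2 + \cos t)\bigr).

For suitable small ε and μ, Arnold proved that there are trajectories along which the action I_2 drifts by an amount of order one, even though I_2 is exactly conserved when μ = 0.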

  1. ^ Arnold, Vladimir I. (1964). "Instability of dynamical systems with several degrees of freedom". Soviet Mathematics. 5: 581–585.
  2. ^ Diacu, Florin; Holmes, Philip (1996). Celestial Encounters: The Origins of Chaos and Stability. Princeton University Press. p. 193. ISBN 0-691-00545-1.
