Hall cascades versus instabilities in neutron star magnetic fields
C. J. Wareing and R. Hollerbach
Department of Applied Mathematics, University of Leeds, Leeds, LS2 9JT, UK
Received 12 October 2009 / Accepted 2 December 2009
Context. The Hall effect is an important nonlinear mechanism affecting the evolution of magnetic fields in neutron stars. Studies of the governing equation, both theoretical and numerical, have shown that the Hall effect proceeds in a turbulent cascade of energy from large to small scales.
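For reference, the governing equation studied in such analyses can be written, in a common nondimensional form (a sketch; normalisation conventions vary between authors), as
\[
\frac{\partial \mathbf{B}}{\partial t}
= -\nabla \times \bigl[ (\nabla \times \mathbf{B}) \times \mathbf{B} \bigr]
+ R_B^{-1}\, \nabla^2 \mathbf{B},
\]
where the first term on the right is the nonlinear Hall term and the second is Ohmic dissipation, with $R_B$ a Hall parameter measuring the ratio of the Ohmic to the Hall timescale.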
Aims. We investigate the small-scale Hall instability conjectured to exist from the linear stability analysis of Rheinhardt and Geppert.
Methods. Linear stability analyses identical to those of Rheinhardt and Geppert are performed to find a suitable background field with which to model their ideas. The nonlinear evolution of this field is then modelled using a three-dimensional pseudospectral numerical MHD code. On top of the background field, energy is injected into the ten eigenmodes with the greatest positive eigenvalues, as identified by the linear stability analysis.
Results. Energy is transferred to different scales in the system, but not into small scales to any extent that could be interpreted as a Hall instability. Any instabilities are overwhelmed by a late-onset turbulent Hall cascade, initially avoided by the choice of background field but soon generated by nonlinear interactions between the growing eigenmodes. The Hall cascade is shown here, and by several authors elsewhere, to be the dominant mechanism in this system.
Key words: magnetohydrodynamics (MHD) -- turbulence -- stars: magnetic fields -- stars: neutron -- stars: evolution -- pulsars: general
© ESO 2009