The propagation of cosmic rays (CRs) in turbulent interstellar magnetic fields is typically described as a spatial diffusion process. This formalism predicts only a small deviation from an isotropic CR distribution in the form of a dipole in the direction of the CR density gradient or relative background flow. We show that the existence of a global CR dipole moment necessarily generates a spectrum of higher multipole moments in the local CR distribution. These anomalous anisotropies are a direct consequence of Liouville's theorem in the presence of a local turbulent magnetic field. We show that the predictions of this model are in excellent agreement with the observed power spectrum of multi-TeV CRs.
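The mechanism can be illustrated with a minimal toy model (a sketch under assumed simplifications, not the paper's actual computation): reduce the sky to the azimuthal angle, impose a pure dipole far from the observer, and stand in for backtracking through a realization of the local turbulent field with an arbitrary smooth nonlinear mapping of arrival directions. Liouville's theorem then fixes the locally observed flux as the asymptotic distribution evaluated at the mapped direction, and the nonlinearity of the mapping leaks power from the dipole into higher harmonics. The dipole amplitude, the mapping, and its parameters below are all hypothetical choices for illustration.

```python
import numpy as np

# Toy 1D (azimuthal) illustration: a global dipole f_inf(phi) = 1 + delta*cos(phi)
# far from the observer, and a nonlinear direction mapping phi -> phi_inf(phi)
# standing in for backtracking through a local turbulent magnetic field.
# Liouville's theorem conserves phase-space density along trajectories, so the
# locally observed flux is f_obs(phi) = f_inf(phi_inf(phi)).

delta = 1e-3             # dipole amplitude (order 10^-3, typical of TeV CRs)
eps, psi = 0.4, 0.7      # arbitrary parameters of the toy direction mapping

phi = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)

# Nonlinear mapping from observed arrival direction to asymptotic direction,
# mimicking deflections in one realization of the turbulence.
phi_inf = phi + eps * np.sin(3.0 * phi + psi)

f_obs = 1.0 + delta * np.cos(phi_inf)   # Liouville mapping of the dipole

# Harmonic ("multipole") power of the observed anisotropy.
coeffs = np.fft.rfft(f_obs - f_obs.mean()) / phi.size
power = np.abs(coeffs) ** 2

# A pure dipole puts all power at n = 1; the nonlinear mapping generates
# nonzero power at n >= 2 -- the anomalous higher multipoles.
print("dipole power:", power[1], "higher harmonics:", power[2:6])
```

The dipole term still dominates, but the n ≥ 2 harmonics are nonzero for any realization with a nonlinear mapping, which is the qualitative content of the abstract's claim; the full treatment works on the sphere and ensemble-averages over turbulent field realizations.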