Over 40 years ago, Orszag pointed out the importance of dealiasing in the pseudospectral method. However, the computational and storage cost of dealiasing, whether by zero padding or by phase shifts, is so great that it is still sometimes neglected in strongly damped turbulence simulations, typically justified by claiming that the high-wavenumber damping is strong enough that aliasing error contributes negligibly to the larger energy-containing scales.
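As an illustrative sketch (not part of the talk itself), the standard 2/3-rule zero padding can be demonstrated in a few lines of NumPy; the function name dealiased_product and the choice of test modes below are ours, for illustration only:

```python
import numpy as np

def dealiased_product(u_hat, v_hat, N):
    """Pointwise product of two real periodic fields, dealiased by the
    2/3 rule: pad the rfft spectra (length N//2+1) onto an M = 3N/2
    grid, multiply in physical space, then truncate back to N modes."""
    M = 3 * N // 2                       # padded grid size
    def pad(f_hat):
        g = np.zeros(M // 2 + 1, dtype=complex)
        g[: N // 2 + 1] = f_hat
        return g
    # Rescale so that physical values are unchanged by the padding.
    u = np.fft.irfft(pad(u_hat), n=M) * (M / N)
    v = np.fft.irfft(pad(v_hat), n=M) * (M / N)
    # Aliased sums land at |k| >= N/2 on the padded grid and are cut off.
    return np.fft.rfft(u * v)[: N // 2 + 1] * (N / M)

# Example: on an N = 16 grid, cos(6x)*cos(7x) = (cos(13x) + cos(x))/2,
# and the unresolvable mode 13 aliases onto mode 3 in a naive product.
N = 16
x = 2 * np.pi * np.arange(N) / N
u_hat = np.fft.rfft(np.cos(6 * x))
v_hat = np.fft.rfft(np.cos(7 * x))
w_hat = dealiased_product(u_hat, v_hat, N)
w_naive = np.fft.rfft(np.cos(6 * x) * np.cos(7 * x))
```

Here the naive product contains a spurious contribution in mode 3, while the padded product retains only the resolvable mode 1.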
On the other hand, Hou and Li demonstrated in 2007 that high-order Fourier smoothing captures nearly singular solutions of the 1D inviscid Burgers equation and the 3D Euler equations more accurately and efficiently than explicit dealiasing via 2/3 zero padding.
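For reference, the high-order smoothing of Hou and Li is usually quoted as the spectral filter exp(-36 (|k|/k_max)^36), which is essentially unity over the inner ~2/3 of the spectrum and decays to machine precision at the Nyquist mode; the sketch below (our own illustration, with grid size N assumed) applies it to the rfft modes of a real field:

```python
import numpy as np

def hou_li_filter(N):
    """High-order Fourier smoothing profile rho(k) = exp(-36 (k/k_max)^36)
    for the rfft modes k = 0..N/2 of a real field on N grid points.
    Multiply a spectrum by this profile at each time step."""
    k = np.arange(N // 2 + 1)
    kmax = N // 2
    return np.exp(-36.0 * (k / kmax) ** 36)

rho = hou_li_filter(256)   # k_max = 128; rho ~ 1 for k/k_max < 2/3
```

Because the filter only damps the outer modes smoothly instead of zeroing the top third outright, it retains somewhat more of the spectrum than the 2/3 rule at the cost of a small smoothing error.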
Given that high-Reynolds-number turbulence, with well-resolved inertial ranges, falls between these two limiting cases of strong damping and vanishing viscosity, it seems prudent to reconfirm the importance of properly dealiasing turbulence simulations. Moreover, the recent introduction of implicit dealiasing techniques, which in two and three dimensions are roughly twice as fast as explicit dealiasing, should more than offset the roughly 20% efficiency advantage claimed for smoothing with a Fourier filter over explicit dealiasing.
In this talk, we revisit the issue of dealiasing, beginning with a review of recent advances in the pseudospectral method. We emphasize that implicit dealiasing outperforms explicit zero padding by decoupling the data and temporary work arrays. We also discuss the parallel implementations, for distributed- and shared-memory architectures, now available in our open-source library FFTW++.