Particle-based simulations (such as N-body simulations) usually output data as “snapshots”: a list of the instantaneous values of some properties of all particles in the simulation (e.g. positions and velocities in the N-body case). The snapshot output frequency is therefore often a compromise between accurately recording the temporal behaviour on the one hand and storage capacity and I/O overhead on the other. The BTS storage scheme that we published in Cai et al. (2015) is an adaptive storage scheme for simulation data that addresses this issue. It minimizes data redundancy by assigning an individual output frequency to each particle as required. In other words, it provides fine-grained control over the resolution of scientifically interesting data while suppressing the uninteresting parts.
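To make the idea of per-particle output frequencies concrete, here is a minimal sketch, not the actual BTS implementation: it assumes each particle is assigned a power-of-two output interval (in units of some base step), and at every base step only the particles whose interval divides the current step number are written out. The function name and the interval scheme are illustrative assumptions.

```python
import numpy as np

def due_for_output(step_exponents, step_index):
    """Return indices of particles due for output at the given base step.

    step_exponents : array of k, where particle i is written every
                     2**k[i] base steps (an assumed power-of-two scheme).
    step_index     : the current base step number.
    """
    intervals = 2 ** np.asarray(step_exponents)
    # A particle is due when its interval divides the current step.
    return np.where(step_index % intervals == 0)[0]

# Three particles with output intervals 1, 4, and 2 base steps:
exponents = np.array([0, 2, 1])
print(due_for_output(exponents, 2))  # only intervals 1 and 2 divide step 2
```

Fast-moving particles get small intervals (dense output), slow ones large intervals (sparse output), which is the redundancy reduction described above.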
The video shows part of a gravitational N-body simulation done with the NBODY6++ code, stored using the BTS scheme and visualized with Blender. The simulation has 16,000 point particles, but only 50 of them are shown, as large spheres with random colours. The net gravitational force points toward the centre because the many particles are arranged in a roughly spherical configuration. The reason all the particles suddenly change direction at one point is this: if the position of a particle is unknown when a frame is rendered (as is often the case for “slow” particles in the BTS scheme), Blender guesses the position by interpolating between the stored values. To test that this interpolation works, I created a large time gap in the data. If particles move only a little between snapshots, the interpolation is fine. But after this large gap the positions are more or less randomized, so in the video all particles move in straight lines to their new and rather arbitrary positions.
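The interpolation described above amounts to a straight-line (linear) blend between the two snapshots that bracket the render time. Here is a minimal sketch of that idea, not Blender's actual keyframe code; the function name and arguments are illustrative:

```python
import numpy as np

def interp_position(t, t0, x0, t1, x1):
    """Linearly interpolate a particle position at time t between two
    stored snapshots (t0, x0) and (t1, x1), with t0 <= t <= t1."""
    f = (t - t0) / (t1 - t0)          # fractional distance through the gap
    return (1.0 - f) * np.asarray(x0) + f * np.asarray(x1)

# Halfway through the gap, the particle sits at the midpoint:
print(interp_position(0.5, 0.0, [0.0, 0.0, 0.0], 1.0, [2.0, 2.0, 2.0]))
```

This is why a large gap produces the straight-line motion seen in the video: whatever the true trajectory was, the interpolated path between the two bracketing snapshots is a straight line.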