Predictive simulations of complex systems are essential for applications ranging from weather forecasting to drug design. The veracity of these predictions hinges on their capacity to capture the effective system dynamics. Massively parallel simulations predict the system dynamics by resolving all spatiotemporal scales, often at a cost that prevents experimentation, while their findings may not generalize. Reduced-order models, on the other hand, are fast but limited by the frequently adopted linearization of the system dynamics and the utilization of heuristic closures. Here we present a systematic framework that bridges large-scale simulations and reduced-order models to learn the effective dynamics of diverse complex systems. The framework forms algorithmic alloys between nonlinear machine learning algorithms and the equation-free approach for modelling complex systems. Learning the effective dynamics deploys autoencoders to map between fine- and coarse-grained representations and evolves the latent-space dynamics using recurrent neural networks. The algorithm is validated on benchmark problems, where it outperforms state-of-the-art reduced-order models in terms of predictability and large-scale simulations in terms of cost. Learning the effective dynamics is applicable to systems ranging from chemistry to fluid mechanics and reduces the computational effort by up to two orders of magnitude while maintaining the prediction accuracy of the full system dynamics. We argue that learning the effective dynamics provides a potent novel modality for accurately predicting complex systems.
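The autoencoder-plus-recurrent-network pipeline described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the dimensions, the plain-NumPy formulation, the Elman-style recurrent cell, and all function names are assumptions made for clarity, and the randomly initialized weights stand in for parameters that would be trained on simulation data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the paper):
# fine-grained state x in R^64, coarse latent state z in R^8.
D_FINE, D_LATENT = 64, 8

# Randomly initialized weights stand in for trained parameters.
W_enc = rng.normal(scale=0.1, size=(D_LATENT, D_FINE))   # encoder
W_dec = rng.normal(scale=0.1, size=(D_FINE, D_LATENT))   # decoder
W_ih = rng.normal(scale=0.1, size=(D_LATENT, D_LATENT))  # RNN input weights
W_hh = rng.normal(scale=0.1, size=(D_LATENT, D_LATENT))  # RNN recurrent weights

def encode(x):
    """Autoencoder encoder: fine-grained state -> coarse latent state."""
    return np.tanh(W_enc @ x)

def decode(z):
    """Autoencoder decoder: coarse latent state -> fine-grained state."""
    return W_dec @ z

def rnn_step(z, h):
    """One step of a simple Elman-style recurrent cell in latent space."""
    h_new = np.tanh(W_ih @ z + W_hh @ h)
    return h_new, h_new  # next latent state, next hidden state

def rollout(x0, n_steps):
    """Encode once, evolve the latent dynamics, decode the final state."""
    z, h = encode(x0), np.zeros(D_LATENT)
    for _ in range(n_steps):
        z, h = rnn_step(z, h)
    return decode(z)

x0 = rng.normal(size=D_FINE)       # a fine-grained initial condition
x_pred = rollout(x0, n_steps=10)   # predicted fine-grained state
print(x_pred.shape)  # (64,)
```

The key cost saving suggested by this structure is that the expensive fine-grained representation is visited only at the encode and decode steps, while the time evolution happens entirely in the low-dimensional latent space.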