%0 Conference Paper
%T Consensus Control for Decentralized Deep Learning
%A Kong, Lingjing
%A Lin, Tao
%A Koloskova, Anastasia
%A Jaggi, Martin
%A Stich, Sebastian U
%B Proceedings of the 38th International Conference on Machine Learning
%S Proceedings of Machine Learning Research
%V 139
%I PMLR
%P 5686-5696
%D 2021
%G en
%W https://arxiv.org/abs/2102.04828
%U https://proceedings.mlr.press/v139/kong21a.html
%X Decentralized training of deep learning models enables on-device learning over networks, as well as efficient scaling to large compute clusters. Experiments in earlier works reveal that, even in a data-center setup, decentralized training often suffers from the degradation in the quality of the model: the training and test performance of models trained in a decentralized fashion is in general worse than that of models trained in a centralized fashion, and this performance drop is impacted by parameters such as network size, communication topology and data partitioning.