LEASGD: an Efficient and Privacy-Preserving Decentralized Algorithm for Distributed Learning.

2018
Distributed learning systems have enabled training large-scale models over large amounts of data in significantly shorter time. In this paper, we focus on decentralized distributed deep learning systems and aim to achieve differential privacy with a good convergence rate and low communication cost. To achieve this goal, we propose a new learning algorithm, LEASGD (Leader-Follower Elastic Averaging Stochastic Gradient Descent), which is driven by a novel Leader-Follower topology and a differential privacy model. We provide a theoretical analysis of the convergence rate and of the trade-off between performance and privacy in the private setting. The experimental results show that LEASGD outperforms the state-of-the-art decentralized learning algorithm DPSGD by achieving steadily lower loss within the same number of iterations and by reducing the communication cost by 30%. In addition, LEASGD spends less differential privacy budget and achieves higher final accuracy than DPSGD in the private setting.
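To make the abstract's description concrete, below is a minimal sketch of what one LEASGD-style round could look like: each worker takes a local SGD step, followers are pulled elastically toward a leader, and the communicated leader parameters carry Gaussian noise for differential privacy. The leader-selection rule (lowest local loss), the pulling coefficient `rho`, and the noise scale `sigma` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def leasgd_round(params, grads, losses, lr=0.1, rho=0.5, sigma=0.01, rng=None):
    """One synchronous round over n workers (a sketch, not the paper's method).

    params: (n, d) array of per-worker parameter vectors
    grads:  (n, d) array of per-worker stochastic gradients
    losses: (n,)   array of current local losses (used to pick the leader)
    """
    rng = rng or np.random.default_rng()
    leader = int(np.argmin(losses))  # assumed rule: the best-loss worker leads
    new_params = params.copy()
    # Leader parameters are shared with Gaussian noise for differential privacy.
    noisy_leader = params[leader] + rng.normal(0.0, sigma, size=params.shape[1])
    for i in range(params.shape[0]):
        step = params[i] - lr * grads[i]          # local SGD step
        if i != leader:
            # Elastic pull of each follower toward the (noisy) leader.
            step -= lr * rho * (step - noisy_leader)
        new_params[i] = step
    return new_params
```

The elastic pull term is what distinguishes this family from plain parameter averaging: followers move only partway toward the leader each round, which lets them keep exploring locally while still sharing progress, and only the noised leader parameters ever leave a worker.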