%A Data, Deepesh
%A Diggavi, Suhas
%D 2020
%J IEEE International Symposium on Information Theory (ISIT)
%M OSTI ID: 10313168
%T On Byzantine-Resilient High-Dimensional Stochastic Gradient Descent
%X We study stochastic gradient descent (SGD) in the master-worker architecture under Byzantine attacks. Building upon recent advances in algorithmic high-dimensional robust statistics, in each SGD iteration the master employs a non-trivial decoding to estimate the true gradient from the unbiased stochastic gradients received from workers, some of which may be corrupt. We provide convergence analyses for both strongly-convex and non-convex smooth objectives under standard SGD assumptions. In both settings, the approximation error of our solution is controlled by the mini-batch size of the stochastic gradients, and it can be made as small as desired provided that workers use a sufficiently large mini-batch size. Our algorithm can tolerate less than a 1/3 fraction of Byzantine workers. It approximately finds the optimal parameters in the strongly-convex setting exponentially fast, and reaches an approximate stationary point in the non-convex setting at a linear speedup, i.e., with a rate of 1/T, thus matching the convergence rates of vanilla SGD in the Byzantine-free setting.
%0 Journal Article
%R https://doi.org/10.1109/ISIT44484.2020.9174363
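
The abstract describes a master that robustly estimates the true gradient from worker reports, tolerating fewer than 1/3 Byzantine workers. The paper's actual decoder builds on high-dimensional robust statistics and is not specified in this record; the sketch below uses a coordinate-wise trimmed mean as a much simpler illustrative stand-in, with the worker count, corruption pattern, and `trim_frac` parameter all chosen for the example.

```python
import numpy as np

def robust_aggregate(gradients, trim_frac):
    """Coordinate-wise trimmed-mean aggregation: an illustrative stand-in
    for the paper's robust gradient decoder (not the authors' method).

    gradients: (m, d) array, one stochastic gradient per worker.
    trim_frac: fraction of extreme values discarded at each end, per coordinate.
    """
    g = np.sort(np.asarray(gradients, dtype=float), axis=0)
    m = g.shape[0]
    k = int(np.floor(trim_frac * m))  # drop k smallest and k largest per coordinate
    if 2 * k >= m:
        raise ValueError("trim fraction too large for the number of workers")
    return g[k:m - k].mean(axis=0)

# Toy run: 7 workers, of which 2 are Byzantine and report huge gradients.
rng = np.random.default_rng(0)
true_grad = np.array([1.0, -2.0, 0.5])
honest = true_grad + 0.01 * rng.standard_normal((5, 3))  # small stochastic noise
byzantine = np.full((2, 3), 1e6)                         # adversarial reports
est = robust_aggregate(np.vstack([honest, byzantine]), trim_frac=2 / 7)
# est stays close to true_grad despite the corrupt workers.
```

A plain mean over the same reports would be dominated by the 1e6 entries; trimming each coordinate's extremes removes the adversarial values as long as the corrupt fraction stays below the trim level, loosely mirroring the 1/3 tolerance stated in the abstract.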