%A Chen, Y.
%A Huang, W.
%A Nguyen, L.
%A Weng, T.-W.
%D 2021
%M OSTI ID: 10336943
%P Medium: X
%T On the Equivalence between Neural Network and Support Vector Machine
%X Recent research shows that the dynamics of an infinitely wide neural network (NN)
trained by gradient descent can be characterized by the Neural Tangent Kernel (NTK)
[27]. Under the squared loss, the infinite-width NN trained by gradient descent
with an infinitely small learning rate is equivalent to kernel regression with NTK
[4]. However, this equivalence is currently known only for ridge regression [6];
the equivalence between NNs and other kernel machines (KMs), e.g., the support
vector machine (SVM), remains unknown. In this work, we therefore establish the
equivalence between NN and SVM, specifically between the infinitely wide NN
trained by soft-margin loss and the standard soft-margin SVM with NTK trained by
subgradient descent. Our main theoretical results include establishing
the equivalence between NNs and a broad family of L2-regularized KMs with
finite-width bounds, which prior work cannot handle, and showing that every
finite-width NN trained by such a regularized loss function is approximately a KM.
Furthermore, we demonstrate that our theory enables three practical applications:
(i) a non-vacuous generalization bound for the NN via the corresponding KM;
(ii) a nontrivial robustness certificate for the infinite-width NN (where existing
robustness verification methods would provide vacuous bounds); and (iii) infinite-width
NNs that are intrinsically more robust than those obtained from previous kernel regression.