[ibayesclub.beta] seminar by Gergely Neu, Friday, 25th of June, from 4 pm
Szabo, B.T.
b.t.szabo at vu.nl
Wed Jun 23 07:28:44 CEST 2021
Dear all,
Before the summer really starts, we have a very interesting invited speaker in our ``thematic seminar'' this Friday (the 25th of June). Although the topic is not Bayesian, I am advertising it here nevertheless, as several of you might be interested in it.
Speaker: Gergely Neu (Universitat Pompeu Fabra, http://cs.bme.hu/~gergo/)
Time: Friday June 25, 16.00-17.00
Zoom link: https://uva-live.zoom.us/j/81805477265
Meeting ID: 818 0547 7265
Title: Information-Theoretic Generalization Bounds for Stochastic Gradient Descent
Abstract: We study the generalization properties of the popular stochastic gradient descent (SGD) method for optimizing general non-convex loss functions. Our main contribution is providing upper bounds on the generalization error that depend on local statistics of the stochastic gradients evaluated along the path of iterates calculated by SGD. The key factors our bounds depend on are the variance of the gradients (with respect to the data distribution), the local smoothness of the objective function along the SGD path, and the sensitivity of the loss function to perturbations of the final output. Our key technical tool is a combination of the information-theoretic generalization bounds previously used for analyzing randomized variants of SGD with a perturbation analysis of the iterates.
Please also join for online drinks after the talk.
Best wishes from the seminar organizers:
Tim van Erven
Botond Szabo