In this episode we talk about the paper "Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer, Azalia Mirhoseini, Krzysztof Maziarz, Andy Davis, Quoc Le, Geoffrey Hinton, and Jeff Dean.
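For listeners who want a concrete picture of the layer before diving in, below is a minimal sketch of top-k gated mixture-of-experts routing in PyTorch. The class name, hyperparameters, and the naive per-expert routing loop are our illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of a sparsely-gated mixture-of-experts layer (PyTorch).
# Structure and hyperparameters are illustrative, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, d_model=512, d_hidden=1024, num_experts=8, k=2):
        super().__init__()
        self.k = k
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )
        # The gating network scores every expert for each input.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x):                           # x: (batch, d_model)
        logits = self.gate(x)                       # (batch, num_experts)
        topv, topi = logits.topk(self.k, dim=-1)    # keep only top-k experts
        weights = F.softmax(topv, dim=-1)           # renormalize over kept ones
        out = torch.zeros_like(x)
        for slot in range(self.k):
            idx = topi[:, slot]
            w = weights[:, slot].unsqueeze(-1)
            # Route each input to its chosen expert (naive loop for clarity).
            for e, expert in enumerate(self.experts):
                mask = idx == e
                if mask.any():
                    out[mask] += w[mask] * expert(x[mask])
        return out

layer = SparseMoE()
y = layer(torch.randn(4, 512))  # same shape out as in: (4, 512)
```

The point of the sparsity is that only the k selected experts run for a given input, which is how the paper scales total parameter count far beyond what the per-example compute budget would otherwise allow.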
Today's paper: "Can Neural Nets Learn the Same Model Twice? Investigating Reproducibility and Double Descent from the Decision Boundary Perspective" (https://arxiv.org/pdf/2203.08124.pdf)

Summary: A discussion of reproducibility and double descent through visualizations of decision boundaries.

Highlights of the discussion:
- The relationship between model performance and reproducibility
- Which models are robust and reproducible
- How they calculate the various scores (a generic sketch of one such score follows below)
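As a rough illustration of the kind of score the episode touches on, one can quantify reproducibility by how often two independently trained models agree on a dense grid of inputs, i.e. how similar their decision regions are. This is a generic stand-in we sketch here, not the paper's exact formulation.

```python
# Illustrative sketch: decision-region agreement between two independently
# trained classifiers. A generic stand-in, not the paper's exact score.
import numpy as np

def agreement_score(model_a, model_b, points):
    """Fraction of sampled points on which both models predict the same
    class; 1.0 means the decision regions coincide on this sample."""
    pred_a = model_a(points)  # each model maps (n, d) -> (n,) class labels
    pred_b = model_b(points)
    return float(np.mean(pred_a == pred_b))

if __name__ == "__main__":
    # Two toy linear classifiers evaluated on a 2-D grid.
    grid = np.stack(np.meshgrid(np.linspace(-1, 1, 100),
                                np.linspace(-1, 1, 100)), -1).reshape(-1, 2)
    model_a = lambda p: (p[:, 0] + p[:, 1] > 0).astype(int)
    model_b = lambda p: (p[:, 0] + 0.9 * p[:, 1] > 0).astype(int)
    print(agreement_score(model_a, model_b, grid))  # close to, but below, 1.0
```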