Data Science Decoded
Mike E
33 episodes
2 weeks ago
We discuss seminal mathematical papers (sometimes really old 😎) that have shaped and established the fields of machine learning and data science as we know them today. The goal of the podcast is to introduce you to the evolution of these fields from a mathematical and slightly philosophical perspective. We discuss the contribution of these papers not just from a pure math aspect but also how they influenced the discourse in the field, which areas were opened up as a result, and so on. Our podcast episodes are also available on our YouTube channel: https://youtu.be/wThcXx_vXjQ?si=vnMfs
Mathematics
Science
Data Science #34 - The deep learning original paper review, Hinton, Rumelhart & Williams (1986)
Data Science Decoded
46 minutes 37 seconds
1 month ago
Data Science #34 - The deep learning original paper review, Hinton, Rumelhart & Williams (1986)

On the 34th episode, we review the 1986 paper "Learning representations by back-propagating errors," which was pivotal because it provided a clear, generalized framework for training neural networks with internal "hidden" units. The core of the procedure, back-propagation, repeatedly adjusts the weights of connections in the network to minimize the error between the actual and desired output vectors. Crucially, this process forces the hidden units, whose desired states aren't specified, to develop distributed internal representations of the important features of the task domain.

This capability to construct useful new features distinguishes back-propagation from earlier, simpler methods like the perceptron-convergence procedure. The authors demonstrate its power on non-trivial problems, such as detecting mirror symmetry in an input vector and storing information about isomorphic family trees. By showing how the network generalizes correctly from one family tree to its Italian equivalent, the paper illustrated the algorithm's ability to capture the underlying structure of the task domain.

Despite recognizing that the procedure was not guaranteed to find a global minimum due to local minima in the error surface, the paper's clear formulation (equations 1-9) and its successful demonstration of learning complex, non-linear representations served as a powerful catalyst. It fundamentally advanced the field of connectionism and became the standard, foundational algorithm used today to train multi-layered networks, or deep learning models, despite the earlier, lesser-known work by Werbos.
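The procedure described above can be sketched in a few lines of NumPy on the paper's mirror-symmetry task (label a 4-bit vector 1 if it reads the same forwards and backwards). This is a minimal illustration, not the paper's own formulation: the hidden-layer size, learning rate, and iteration count are arbitrary choices made here for the example.

```python
import numpy as np

# Minimal back-propagation sketch on the mirror-symmetry task from the paper.
# Hidden size (8), learning rate, and iteration count are illustrative
# choices, not values taken from the 1986 paper.

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# All 16 binary vectors of length 4; label 1 if the vector is symmetric.
X = np.array([[(i >> b) & 1 for b in range(4)] for i in range(16)], dtype=float)
y = np.array([[float(list(r) == list(r[::-1]))] for r in X])

W1 = rng.normal(0.0, 0.5, (4, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)   # hidden -> output

def forward(X):
    h = sigmoid(X @ W1 + b1)           # hidden ("internal") representations
    return h, sigmoid(h @ W2 + b2)     # actual output vector

_, out = forward(X)
initial_error = np.mean((out - y) ** 2)

lr, n = 1.0, len(X)
for _ in range(10000):
    h, out = forward(X)
    # Backward pass: propagate the output error back through the sigmoids.
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    # Repeatedly adjust the weights to reduce the error (gradient descent).
    W2 -= lr * (h.T @ d_out) / n; b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_h) / n;   b1 -= lr * d_h.mean(axis=0)

_, out = forward(X)
final_error = np.mean((out - y) ** 2)
```

The hidden units receive no target values of their own; their weights change only through the error signal propagated back from the output, which is what lets them develop useful internal features for the symmetry detection.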
