Deep Learning - Plain Version 2020 (QHD 1920)
Prof. Dr. Andreas Maier
65 episodes
9 months ago

 

Deep Learning (DL) has attracted much interest in a wide range of applications, such as image recognition, speech recognition, and artificial intelligence, from both academia and industry. This lecture introduces the core elements of neural networks and deep learning. It comprises:
  • (multilayer) perceptrons, backpropagation, and fully connected neural networks
  • loss functions and optimization strategies
  • convolutional neural networks (CNNs)
  • activation functions
  • regularization strategies
  • common practices for training and evaluating neural networks
  • visualization of networks and results
  • common architectures, such as LeNet, AlexNet, VGG, and GoogLeNet
  • recurrent neural networks (RNN, TBPTT, LSTM, GRU)
  • deep reinforcement learning
  • unsupervised learning (autoencoders, RBM, DBM, VAE)
  • generative adversarial networks (GANs)
  • weakly supervised learning
  • applications of deep learning (segmentation, object detection, speech recognition, ...)

The accompanying exercises provide a deeper understanding of the workings and architecture of neural networks.
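To illustrate the first two bullet points above, here is a minimal NumPy sketch of a two-layer (multilayer) perceptron trained with backpropagation and a mean-squared-error loss on the XOR task. All names, the architecture, and the hyperparameters are illustrative choices, not taken from the lecture materials:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic task a single-layer perceptron cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two fully connected layers: 2 inputs -> 4 hidden units -> 1 output.
W1 = rng.normal(0.0, 1.0, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0            # learning rate (arbitrary choice for this toy problem)
losses = []
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))  # mean squared error loss

    # backward pass: chain rule through the sigmoid and linear layers
    d_out = 2.0 * (out - y) / len(X) * out * (1.0 - out)
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

    # gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The hidden layer is what lets the network represent XOR at all; the backward pass simply applies the chain rule layer by layer, which is the core idea the lecture's backpropagation topic covers.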

 

Education
All content for Deep Learning - Plain Version 2020 (QHD 1920) is the property of Prof. Dr. Andreas Maier and is served directly from their servers with no modification, redirects, or rehosting. The podcast is not affiliated with or endorsed by PodJoint in any way.

 

49 - Deep Learning - Unsupervised Learning Part 4 2020
Deep Learning - Plain Version 2020 (QHD 1920)
9 minutes 5 seconds
5 years ago

 
