MIT Compbio Lecture 12 - Deep Learning (Fall ’19)
1. Supervised learning with neural networks
- Perceptron, layers, activation units (sigmoid, softplus, ReLU)
- Learning: gradient descent, back-propagation, learning rate, dropout, overfitting
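A minimal sketch of the back-propagation and learning-rate ideas above (my own illustrative example, not the lecture's code; the network size, toy data, and learning rate are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(0.0, z)

# Toy data: 4 examples, 3 features, binary labels (all made up)
X = rng.normal(size=(4, 3))
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# Parameters of a 3 -> 5 -> 1 network
W1, b1 = rng.normal(scale=0.1, size=(3, 5)), np.zeros(5)
W2, b2 = rng.normal(scale=0.1, size=(5, 1)), np.zeros(1)
lr = 0.1  # learning rate

# Forward pass
h_pre = X @ W1 + b1
h = relu(h_pre)                      # hidden activations
p = sigmoid(h @ W2 + b2)             # predicted probability

# Back-propagation of the cross-entropy loss
d_out = (p - y) / len(X)             # dLoss/d(pre-sigmoid output)
dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
d_h = (d_out @ W2.T) * (h_pre > 0)   # gradient through the ReLU
dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

# One gradient-descent step
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```

Dropout would additionally zero out a random fraction of the hidden activations `h` on each training pass, one of the standard ways to limit the overfitting listed above.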
2. Unsupervised learning with deep belief networks & autoencoders
- Boltzmann machines, Restricted BMs (RBMs), Deep belief networks
- Learning: energy functions, Gibbs sampling, simulated annealing, wake-sleep
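A minimal sketch of RBM training with one step of block Gibbs sampling (contrastive divergence, CD-1); the sizes, names, and learning rate are illustrative assumptions, not the lecture's code:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_visible, n_hidden, lr = 6, 3, 0.05
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)

# One binary "data" vector for the visible layer (made up)
v0 = rng.integers(0, 2, size=(1, n_visible)).astype(float)

# Positive phase: sample hidden units conditioned on the data
p_h0 = sigmoid(v0 @ W + b_h)
h0 = (rng.random(p_h0.shape) < p_h0).astype(float)

# Negative phase: one Gibbs step back to the visible layer, then to hidden
p_v1 = sigmoid(h0 @ W.T + b_v)
v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
p_h1 = sigmoid(v1 @ W + b_h)

# CD-1 update: lower the energy of the data, raise it for the model's own sample
W += lr * (v0.T @ p_h0 - v1.T @ p_h1)
b_v += lr * (v0 - v1).sum(axis=0)
b_h += lr * (p_h0 - p_h1).sum(axis=0)
```

Stacking RBMs trained this way, layer on layer, is how a deep belief network is pre-trained.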
3. Modern deep learning architectures
- Auto-encoders: self-training (the target is the input itself), representation learning, RBM pre-training (sketch below)
- Convolutional neural networks
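For the auto-encoder bullet, a minimal NumPy sketch of self-training on a reconstruction objective (sizes, names, and learning rate are illustrative assumptions, not the lecture's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_code, lr = 8, 2, 0.1
W_enc = rng.normal(scale=0.1, size=(n_in, n_code))   # encoder weights
W_dec = rng.normal(scale=0.1, size=(n_code, n_in))   # decoder weights
X = rng.random((10, n_in))                           # toy data in [0, 1]

for _ in range(100):
    h = sigmoid(X @ W_enc)            # encode: the learned representation
    X_hat = sigmoid(h @ W_dec)        # decode: reconstruction of the input
    err = X_hat - X                   # self-training: the target is the input itself

    # Gradients of the squared reconstruction error (biases omitted for brevity)
    d_dec = err * X_hat * (1 - X_hat)
    dW_dec = h.T @ d_dec / len(X)
    d_enc = (d_dec @ W_dec.T) * h * (1 - h)
    dW_enc = X.T @ d_enc / len(X)

    W_dec -= lr * dW_dec
    W_enc -= lr * dW_enc
```

The code vector `h` is the learned low-dimensional representation; initializing such layers from RBMs trained as in the previous sketch is the RBM pre-training listed above.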