Deep Learning with PyTorch
7. Interpreting Sequences (RNNs)
1. Memory
Standard feedforward networks have amnesia: they process each input independently, with no memory of what came before. RNNs (Recurrent Neural Networks) fix this by carrying a "hidden state" (memory) from one time step to the next, so each step can see a summary of the past.
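A minimal sketch of that idea, using PyTorch's `nn.RNNCell` applied one step at a time (the sizes `input_size=4`, `hidden_size=8`, and the 5-step random sequence are illustrative assumptions, not from the lesson):

```python
import torch
import torch.nn as nn

# One recurrent cell, reused at every time step.
cell = nn.RNNCell(input_size=4, hidden_size=8)

seq = torch.randn(5, 1, 4)   # 5 time steps, batch of 1, 4 features each
h = torch.zeros(1, 8)        # hidden state starts empty: "no memory yet"

for x_t in seq:              # the SAME cell sees each step...
    h = cell(x_t, h)         # ...and the hidden state carries memory forward

print(h.shape)               # torch.Size([1, 8]) -- the final memory vector
```

Note that `h` is both an output of each step and an input to the next; that feedback loop is the entire "recurrent" trick.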
2. LSTM (Long Short-Term Memory)
Standard RNNs forget quickly: gradients shrink as they flow back through many time steps (the vanishing gradient problem). LSTMs add gates (Forget, Input, Output) that control the flow of information over time, deciding what to discard, what to store, and what to expose at each step.
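A minimal sketch with PyTorch's `nn.LSTM`, which implements those gates internally (again, the sizes and the random 20-step input are illustrative assumptions). The visible difference from a plain RNN is a second state tensor, the cell state `c`, which acts as the long-term memory the gates protect:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=4, hidden_size=8, batch_first=True)

x = torch.randn(1, 20, 4)        # batch of 1, 20 time steps, 4 features
output, (h_n, c_n) = lstm(x)     # processes the whole sequence at once

print(output.shape)              # torch.Size([1, 20, 8]) -- hidden state at every step
print(h_n.shape, c_n.shape)      # torch.Size([1, 1, 8]) each -- final h and c
```

Because the cell state is updated mostly by addition (gated in and out) rather than repeated multiplication, gradients survive across far more time steps than in a standard RNN.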