TensorLearn
Neural Networks: From Scratch
Module 9 of 12

9. Overfitting

1. Memorization vs Learning

If your model gets 100% accuracy on training data but only 50% on new data, it has overfit. It memorized the training examples instead of learning the underlying pattern.
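The memorization-vs-learning gap can be seen even without a neural network. The sketch below (hypothetical toy data, not from the lesson) fits two polynomials to 8 noisy points: a degree-7 model has one coefficient per data point, so it interpolates the training set perfectly yet does worse on fresh points than its near-zero training error suggests.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: quadratic ground truth plus noise (assumed for illustration).
x_train = np.linspace(0, 1, 8)
y_train = x_train**2 + rng.normal(0, 0.1, size=8)
x_test = np.linspace(0.03, 0.97, 8)  # unseen points between the training ones
y_test = x_test**2 + rng.normal(0, 0.1, size=8)

def mse(coeffs, x, y):
    """Mean squared error of a np.polyfit model on (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# Degree 7 = one parameter per training point: the model can memorize.
overfit = np.polyfit(x_train, y_train, deg=7)

print(mse(overfit, x_train, y_train))  # essentially zero: memorized
print(mse(overfit, x_test, y_test))    # much larger: fails to generalize
```

The large train/test gap is the signature of overfitting described above; a degree-2 fit on the same data would score worse on the training set but better on the test set.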

2. Dropout (Brain Damage)

A counterintuitive idea that works: randomly zero out a fraction of neurons (commonly 50%) on every training step. This forces the network to learn redundant, robust representations, because it cannot rely on any single neuron (a so-called "grandmother neuron" that alone encodes a concept).

```python
import numpy as np

keep_prob = 0.5  # probability that a neuron survives
mask = np.random.binomial(1, keep_prob, size=layer.shape)
out = layer * mask / keep_prob  # inverted dropout: rescale so the expected activation is unchanged
```
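Wrapped as a function, the same idea looks like the sketch below (a minimal illustration; the `dropout` helper and its `training` flag are assumptions, not part of the lesson). At inference time dropout is a no-op, and the `/ keep_prob` rescaling during training keeps the expected activation the same in both modes.

```python
import numpy as np

def dropout(layer, keep_prob=0.5, training=True, rng=None):
    """Inverted dropout (hypothetical helper for illustration)."""
    if not training:
        return layer  # inference: use all neurons, no rescaling needed
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.binomial(1, keep_prob, size=layer.shape)
    return layer * mask / keep_prob

rng = np.random.default_rng(0)
layer = np.ones(10_000)
out = dropout(layer, keep_prob=0.5, rng=rng)

# Roughly half the activations are zeroed, survivors are doubled,
# so the mean stays close to the original activation of 1.0.
print(out.mean())
```

Because the rescaling happens at training time, the forward pass at inference needs no special handling, which is why the inverted variant is the common formulation.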

