Overfitting is like being on a treadmill while staring at the same spot on the wall: plenty of effort, a perfect memory of that one view, and zero ability to navigate anywhere new.
Early stopping, on the other hand, is like getting off the treadmill because you're convinced you've already achieved your New Year's resolutions — and, annoyingly, sometimes quitting early really is the healthiest move.
Regularization is the therapy session in between: it teaches your model to let go when it's clinging too tightly to the training data.
In a world where neural networks are the only thing that matters, overfitting is the ultimate sin, and early stopping is the ultimate cop-out (even if it happens to work).
But hey, at least you're not stuck in an infinite training loop.
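If you do want to step off the treadmill at the right moment, the usual trick is patience-based early stopping: quit once the validation loss hasn't improved for a few epochs in a row. Here's a minimal sketch — `train_step` and `eval_loss` are placeholders for your own training code, not functions from any particular library:

```python
def train_with_early_stopping(train_step, eval_loss, max_epochs=100, patience=3):
    """Stop training once validation loss stops improving for `patience` epochs."""
    best_loss = float("inf")
    epochs_without_improvement = 0
    for epoch in range(max_epochs):
        train_step()          # one pass over the training data
        loss = eval_loss()    # loss on held-out validation data
        if loss < best_loss:
            best_loss = loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            break             # get off the treadmill
    return best_loss, epoch
```

The `patience` knob is the whole joke made rigorous: it's how many bad epochs you tolerate before admitting the resolutions are as achieved as they're going to get.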