In the world of neural networks, words are just as important as synapses.
Activation Function: ReLU (Rectified Linear Unit) - A function that lets positive signals through unchanged and silences everything else: f(x) = max(0, x).
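
To make that concrete, here is a minimal sketch of ReLU in plain NumPy (an illustrative implementation, not tied to any particular framework):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): positive inputs pass through
    # unchanged; zero and negative inputs become zero.
    return np.maximum(0, x)

# Example: mixed negative, zero, and positive activations.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# -> [0.  0.  0.  1.5 3. ]
```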