Recurrent Neural Networks
Where the Past and Future are Always Present
Meet some of the pioneers behind this technology:
- Yann LeCun: French-born deep learning pioneer, best known for convolutional networks and early practical applications of backpropagation.
- Yoshua Bengio: Montreal-based researcher whose work on sequence modeling identified the vanishing-gradient problem that makes plain RNNs hard to train.
- Jeffrey Elman: American cognitive scientist who introduced the Elman network, one of the first simple recurrent networks.
How it works
An RNN processes a sequence one element at a time while maintaining a hidden state that summarizes everything seen so far. At each step, the current input and the previous hidden state are combined through a shared set of weights and biases, then passed through a nonlinearity to produce the new hidden state. Because the same weights are reused at every step, the network can handle sequences of any length.
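The recurrent update described above can be sketched as a few lines of NumPy. This is a minimal illustration, not a production implementation; the weight shapes and the `tanh` nonlinearity are common conventions, and all variable names here are my own.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One recurrent step: combine the current input with the
    previous hidden state through shared weights and a tanh."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

# Tiny example: input size 2, hidden size 4, sequence length 3.
rng = np.random.default_rng(0)
W_xh = rng.normal(size=(2, 4)) * 0.1  # input-to-hidden weights
W_hh = rng.normal(size=(4, 4)) * 0.1  # hidden-to-hidden (recurrent) weights
b_h = np.zeros(4)

h = np.zeros(4)                       # initial hidden state
for x_t in rng.normal(size=(3, 2)):   # iterate over the sequence
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)

print(h.shape)  # the final hidden state summarizes the whole sequence
```

Note that the same `W_xh`, `W_hh`, and `b_h` are applied at every time step; only the hidden state changes.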
Types of RNNs:
- Gated RNNs: architectures that use learned gates to regulate how information flows into and out of the hidden state.
- LSTMs (Long Short-Term Memory networks): use input, forget, and output gates around a memory cell, letting information persist across long sequences.
- GRUs (Gated Recurrent Units): a lighter-weight gated variant with just reset and update gates, applying the same recurrent update at every step.
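To make the gating idea concrete, here is a sketch of a single GRU step, following the standard reset/update-gate formulation. The parameter names and sizes are illustrative assumptions, and bias terms are omitted for brevity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x_t, h_prev, P):
    """One GRU step: the reset gate r controls how much past state
    feeds the candidate, and the update gate z blends old and new."""
    r = sigmoid(x_t @ P["W_xr"] + h_prev @ P["W_hr"])             # reset gate
    z = sigmoid(x_t @ P["W_xz"] + h_prev @ P["W_hz"])             # update gate
    h_cand = np.tanh(x_t @ P["W_xh"] + (r * h_prev) @ P["W_hh"])  # candidate state
    return (1 - z) * h_prev + z * h_cand                          # gated blend

# Illustrative parameters: input size 2, hidden size 4.
rng = np.random.default_rng(1)
shapes = {"W_xr": (2, 4), "W_hr": (4, 4),
          "W_xz": (2, 4), "W_hz": (4, 4),
          "W_xh": (2, 4), "W_hh": (4, 4)}
P = {name: rng.normal(size=s) * 0.1 for name, s in shapes.items()}

h = np.zeros(4)
for x_t in rng.normal(size=(5, 2)):  # run over a short sequence
    h = gru_step(x_t, h, P)
print(h.shape)
```

When the update gate `z` is near zero, the old hidden state passes through almost unchanged, which is how gated units keep information alive over long sequences.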