Here, we're tracing the connections within our neural network: activations flow forward layer by layer, and backpropagation carries error gradients back to adjust the weights. Read our latest on backpropagation to learn more.
Layer 1: Input Layer
Layer 2: Hidden Layer 1
Layer 3: Output Layer
Activation Function: Sigmoid
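To make that layout concrete, here's a minimal sketch in Python/NumPy of a network with exactly this structure: an input layer, one hidden layer, and an output layer, with sigmoid activations throughout, trained by plain backpropagation. The layer sizes, learning rate, and training data below are illustrative assumptions; the post doesn't specify them.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_deriv(a):
    # Derivative of the sigmoid, expressed in terms of its output a.
    return a * (1.0 - a)

# Hypothetical sizes: 4 inputs, 5 hidden units, 1 output.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(4, 5))  # input -> hidden layer 1
W2 = rng.normal(scale=0.5, size=(5, 1))  # hidden layer 1 -> output

def forward(x):
    a1 = sigmoid(x @ W1)   # hidden activations
    a2 = sigmoid(a1 @ W2)  # output activation
    return a1, a2

def backprop_step(x, y, lr=0.1):
    # One backpropagation step on a single (x, y) pair, squared-error loss.
    global W1, W2
    a1, a2 = forward(x)
    delta2 = (a2 - y) * sigmoid_deriv(a2)       # output-layer error signal
    delta1 = (delta2 @ W2.T) * sigmoid_deriv(a1)  # error pushed back through W2
    W2 -= lr * np.outer(a1, delta2)
    W1 -= lr * np.outer(x, delta1)

# Toy usage: fit one made-up example and watch the output move toward y.
x = np.array([0.1, 0.9, 0.3, 0.5])
y = np.array([1.0])
for _ in range(200):
    backprop_step(x, y)
print(forward(x)[1])  # should be close to 1.0 after training
```

The error signals (delta2, delta1) are exactly what "connections" carry backward: each weight update is the outer product of the activations flowing into a layer and the error flowing out of it.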
Why Bother?
Because, let's face it, you want more: more neurons, more connections, more backpropagation. We're synthesizing the obvious, and you're invited.
Deep Thought: Our Thoughts on Deep Learning