One doc tagged with "activation-functions"

Activation Functions

Why we need non-linearity and a deep dive into Sigmoid, Tanh, ReLU, and Softmax.
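
The tagged doc names four activation functions. As a quick standalone reference (a minimal NumPy sketch, not code taken from the doc itself), the four can be written as:

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1); common for binary probabilities.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs into (-1, 1); a zero-centered relative of sigmoid.
    return np.tanh(x)

def relu(x):
    # Passes positive inputs through unchanged, zeroes out negatives.
    return np.maximum(0.0, x)

def softmax(x):
    # Turns a vector of scores into a probability distribution.
    # Subtracting the max first is a standard numerical-stability trick.
    e = np.exp(x - np.max(x))
    return e / e.sum()
```

Without a non-linearity like these between layers, a stack of linear layers collapses into a single linear map, which is the motivation the doc's summary alludes to.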