One doc tagged with "relu"

Activation Functions

Why we need non-linearity and a deep dive into Sigmoid, Tanh, ReLU, and Softmax.