Activation Functions
Why we need non-linearity and a deep dive into Sigmoid, Tanh, ReLU, and Softmax.
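As a quick preview of the functions covered here, below is a minimal NumPy sketch of the four activations. The exact formulations in the article may differ (for example, whether softmax uses the max-subtraction trick for numerical stability); this is just an illustrative implementation.

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1); common for binary outputs.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered squashing into (-1, 1).
    return np.tanh(x)

def relu(x):
    # Passes positive values through, zeroes out negatives.
    return np.maximum(0.0, x)

def softmax(x):
    # Turns a vector of scores into a probability distribution.
    # Subtracting the max before exponentiating improves numerical stability.
    z = np.exp(x - np.max(x))
    return z / z.sum()

x = np.array([-2.0, 0.0, 3.0])
print(sigmoid(x), tanh(x), relu(x), softmax(x))
```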