Basics of Probability
An intuitive introduction to probability theory, sample spaces, events, and the fundamental axioms that govern uncertainty in Machine Learning.
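As a minimal sketch of these ideas, the fair-die example below (a hypothetical choice) builds a sample space, checks the normalization axiom, and computes the probability of an event as a subset of outcomes:

```python
from fractions import Fraction

# Hypothetical sample space: a fair six-sided die, all outcomes equally likely.
sample_space = {1, 2, 3, 4, 5, 6}
p = {outcome: Fraction(1, 6) for outcome in sample_space}

# Normalization axiom: probabilities over the whole sample space sum to 1.
assert sum(p.values()) == 1

# An event is a subset of the sample space, e.g. "the roll is even".
even = {2, 4, 6}
p_even = sum(p[o] for o in even)
print(p_even)  # 1/2
```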
A deep dive into Bayes' Theorem: the formula for updating probabilities based on new evidence, and its massive impact on Machine Learning.
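A small sketch of the update rule, using hypothetical diagnostic-test numbers (prior, sensitivity, and false-positive rate are made up for illustration):

```python
# Hypothetical numbers for a diagnostic-test example.
prior = 0.01          # P(disease)
sensitivity = 0.95    # P(positive | disease)
false_pos = 0.05      # P(positive | no disease)

# Law of total probability: P(positive) over both hypotheses.
p_positive = sensitivity * prior + false_pos * (1 - prior)

# Bayes' Theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
posterior = sensitivity * prior / p_positive
print(round(posterior, 3))  # ~0.161: still unlikely despite a positive test
```

Note how a rare condition stays fairly unlikely even after strong evidence, because the prior pulls the posterior down.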
Understanding the foundations of binary outcomes: The Bernoulli trial and the Binomial distribution, essential for classification models.
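A quick stdlib-only sketch of the Binomial PMF, treating each coin flip as a Bernoulli trial (the coin-flip setup is a hypothetical example):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): k successes in n independent Bernoulli(p) trials."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Probability of exactly 7 heads in 10 fair coin flips: C(10, 7) / 2**10.
print(binomial_pmf(7, 10, 0.5))  # 120/1024 = 0.1171875
```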
Understanding how the probability of an event changes given the occurrence of another event, and its role in predictive modeling.
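The definition P(A | B) = P(A ∩ B) / P(B) can be checked by counting over a finite sample space; the two-dice events below are a hypothetical illustration:

```python
from itertools import product

# Sample space: all ordered rolls of two fair dice (36 equally likely outcomes).
outcomes = list(product(range(1, 7), repeat=2))

A = {o for o in outcomes if sum(o) == 8}   # event: the dice sum to 8
B = {o for o in outcomes if o[0] > 3}      # event: the first die shows more than 3

# P(A | B) = P(A ∩ B) / P(B); with equally likely outcomes this reduces to counting.
p_A_given_B = len(A & B) / len(B)
print(p_A_given_B)  # 3/18 = 1/6: knowing B occurred changes the odds of A
```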
Probabilistic clustering using Expectation-Maximization and the Normal distribution.
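A minimal, hypothetical sketch of EM for a two-component 1-D Gaussian mixture (the data, initialization, and iteration count are all illustrative choices, not a production implementation):

```python
import math

def em_gmm_1d(data, iters=50):
    """Fit a two-component 1-D Gaussian mixture by Expectation-Maximization."""
    # Crude initialization: split the sorted data in half.
    xs = sorted(data)
    half = len(xs) // 2
    mu = [sum(xs[:half]) / half, sum(xs[half:]) / (len(xs) - half)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]

    def pdf(x, m, v):
        return math.exp(-((x - m) ** 2) / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            w = [pi[k] * pdf(x, mu[k], var[k]) for k in range(2)]
            total = sum(w)
            resp.append([wk / total for wk in w])
        # M-step: re-estimate weights, means, and variances from responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return pi, mu, var

# Two well-separated hypothetical clusters, around 0 and around 10.
data = [-0.5, 0.0, 0.3, 0.6, 9.5, 10.0, 10.2, 10.6]
pi, mu, var = em_gmm_1d(data)
print(mu)  # means converge near the two cluster centers
```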
Understanding cross-entropy loss and why it is the gold standard for evaluating probability-based classifiers.
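A small sketch of binary cross-entropy (log loss) from scratch; the labels and predicted probabilities are hypothetical:

```python
from math import log

def cross_entropy(y_true, p_pred, eps=1e-12):
    """Binary cross-entropy (log loss), averaged over examples.
    Clipping with eps avoids log(0) on overconfident predictions."""
    total = 0.0
    for y, p in zip(y_true, p_pred):
        p = min(max(p, eps), 1 - eps)
        total += -(y * log(p) + (1 - y) * log(1 - p))
    return total / len(y_true)

# Confident correct predictions incur low loss; confident wrong ones are punished hard.
print(cross_entropy([1, 0, 1], [0.9, 0.1, 0.8]))  # low
print(cross_entropy([1, 0, 1], [0.1, 0.9, 0.2]))  # high
```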
How to use trained Scikit-Learn estimators to generate point predictions and probability estimates.
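A minimal sketch, assuming scikit-learn is installed; the one-feature toy dataset is hypothetical. `predict` returns hard labels while `predict_proba` returns per-class probability estimates:

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical 1-feature dataset: class flips from 0 to 1 as x grows.
X = [[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]]
y = [0, 0, 0, 1, 1, 1]

clf = LogisticRegression().fit(X, y)

print(clf.predict([[1.5], [3.5]]))        # point predictions: hard class labels
print(clf.predict_proba([[1.5], [3.5]]))  # probability estimates; each row sums to 1
```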
A deep dive into Probability Mass Functions (PMF) for discrete data and Probability Density Functions (PDF) for continuous data.
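The PMF/PDF contrast can be sketched with a fair die (discrete) and a Gaussian density (continuous); both examples are hypothetical illustrations:

```python
from math import exp, pi, sqrt

# PMF: discrete — each value P(X = k) is a genuine probability.
def die_pmf(k):
    return 1 / 6 if k in {1, 2, 3, 4, 5, 6} else 0.0

# PDF: continuous — f(x) is a density, not a probability; only areas under f are probabilities.
def normal_pdf(x, mu=0.0, sigma=1.0):
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

print(sum(die_pmf(k) for k in range(1, 7)))  # PMF values sum to 1
print(normal_pdf(0.0, sigma=0.1))            # a density can exceed 1, unlike a probability
```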
Understanding the Poisson distribution: modeling the number of events occurring within a fixed interval of time or space.
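A short stdlib sketch of the Poisson PMF; the server-request rate is a hypothetical example of events per fixed interval:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam): k events in an interval with mean rate lam."""
    return lam**k * exp(-lam) / factorial(k)

# Hypothetical rate: a server averages 3 requests per second.
print(poisson_pmf(0, 3))  # chance of a completely silent second
print(poisson_pmf(3, 3))  # chance of exactly the average count
```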
Understanding Discrete and Continuous Random Variables, Probability Mass Functions (PMF), and Probability Density Functions (PDF).
A deep dive into the Normal Distribution, the Central Limit Theorem, and why Gaussian assumptions are the backbone of many Machine Learning algorithms.
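The Central Limit Theorem can be seen in a small simulation (seed and sample sizes are hypothetical choices): individual Uniform(0, 1) draws are flat, yet their sample means cluster in a Gaussian shape around 0.5:

```python
import random
import statistics

random.seed(0)  # reproducible, hypothetical simulation

# Average n flat Uniform(0, 1) draws, many times over.
n, trials = 30, 2000
means = [statistics.fmean(random.random() for _ in range(n)) for _ in range(trials)]

# CLT prediction: mean 0.5 and standard deviation sqrt(1/12) / sqrt(n) ~ 0.0527 for n = 30.
print(statistics.fmean(means))
print(statistics.stdev(means))
```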
Exploring the Discrete and Continuous Uniform distributions: the foundation of random sampling and model initialization.
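Both flavors of uniform sampling are one-liners in the stdlib; the weight-initialization range and seed below are hypothetical:

```python
import random

random.seed(42)  # hypothetical seed, for reproducibility

# Continuous Uniform(a, b): every real value in [a, b] is equally likely —
# e.g. a small random initial model weight.
weight = random.uniform(-0.05, 0.05)

# Discrete uniform: every integer in {1, ..., 6} is equally likely.
roll = random.randint(1, 6)

print(weight, roll)
```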