Sets and Relations
Discrete Mathematics deals with distinct, separated values rather than continuous ranges. Set Theory is the language we use to group these values, and Relations describe how these groups interact. In Machine Learning, these concepts are vital for everything from defining probability spaces to building database schemas for training data.
1. Set Theory Fundamentals
A Set is an unordered collection of distinct objects, called elements.
Notation
- {1, 2, 3}: A set containing the numbers 1, 2, and 3.
- x ∈ A: x is an element of set A.
- ∅ (or {}): The empty set.
- ℝ, ℤ, ℕ: The sets of Real numbers, Integers, and Natural numbers.
- ℝ (Real Numbers): Used for continuous features like height, price, or weight.
- ℤ (Integers): Used for count-based data (e.g., number of clicks).
- {0, 1} (Binary Set): The standard output set for binary classification.
- {1, 2, …, K} (Categorical Set): The labels for multi-class classification.
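These notations map directly onto Python's built-in set type; a minimal sketch (the values are arbitrary examples):

```python
# A set literal: unordered, and duplicates collapse automatically.
A = {1, 2, 3, 3}
print(A)            # -> {1, 2, 3}
print(2 in A)       # membership test, x ∈ A -> True

empty = set()       # the empty set ∅ ({} would create a dict, not a set)
print(len(empty))   # -> 0

# Binary label space for classification.
labels = {0, 1}
print(labels)
```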
Key Operations
The interaction between sets is often visualized using Venn Diagrams.
- Union (A ∪ B): Elements in A, or B, or both. (Equivalent to a logical OR.)
- Intersection (A ∩ B): Elements present in both A and B. (Equivalent to a logical AND.)
- Difference (A \ B): Elements in A that are not in B.
- Complement (Aᶜ): Everything in the universal set U that is not in A.
In classification tasks, the Label Space is a set. For a cat/dog classifier, the set of possible outputs is {cat, dog}. When evaluating models, we often look at the Intersection of predicted labels and true labels to calculate accuracy.
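A minimal Python sketch of these operations; the sets, predictions, and true labels below are invented for illustration:

```python
# Built-in Python sets support the operations above directly.
A = {1, 2, 3, 4}
B = {3, 4, 5}
U = {1, 2, 3, 4, 5, 6}          # a small universal set for the complement

print(A | B)   # Union        -> {1, 2, 3, 4, 5}
print(A & B)   # Intersection -> {3, 4}
print(A - B)   # Difference   -> {1, 2}
print(U - A)   # Complement of A relative to U -> {5, 6}

# Toy evaluation: intersect (index, label) pairs of predictions and truth
# to find the correctly classified positions.
truth = ["cat", "dog", "cat", "dog"]
preds = ["cat", "cat", "cat", "dog"]
correct = set(enumerate(preds)) & set(enumerate(truth))
accuracy = len(correct) / len(truth)
print(accuracy)  # -> 0.75
```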
2. Cartesian Products
The Cartesian Product of two sets A and B, denoted A × B, is the set of all possible ordered pairs (a, b) with a ∈ A and b ∈ B.
If A represents "Users" and B represents "Movies," A × B represents every possible interaction between every user and every movie. This is the foundation of Utility Matrices in Recommender Systems.
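A small sketch using Python's itertools.product; the user and movie names are made-up placeholders:

```python
from itertools import product

users = {"alice", "bob"}
movies = {"Inception", "Up"}

# Cartesian product: every (user, movie) pair that could appear
# as a cell in a utility (ratings) matrix.
pairs = set(product(users, movies))
print(len(pairs))  # -> 4, i.e. |users| * |movies|
for user, movie in sorted(pairs):
    print(user, movie)
```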
3. Relations
A Relation R from set A to set B is simply a subset of the Cartesian product A × B. It defines a relationship between elements of the two sets.
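As a sketch with hypothetical data, a "has watched" relation between the users and movies from the previous example is just a set of pairs drawn from the product:

```python
from itertools import product

users = {"alice", "bob"}
movies = {"Inception", "Up"}

# A relation R ⊆ users × movies: which user has watched which movie.
watched = {("alice", "Inception"), ("bob", "Up"), ("bob", "Inception")}

# Sanity check: every pair in the relation comes from the Cartesian product.
assert watched <= set(product(users, movies))
print(("alice", "Up") in watched)  # -> False
```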
Types of Relations
In ML, we specifically look for certain properties in relations (checked in the sketch after this list):
- Reflexive: Every element is related to itself, i.e., aRa for all a.
- Symmetric: If a is related to b, then b is related to a (e.g., "Similarity" in clustering).
- Transitive: If aRb and bRc, then aRc.
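A minimal sketch of these checks, using a relation stored as a set of pairs over a small, made-up element set:

```python
def is_reflexive(R, elements):
    """Every element is related to itself: (a, a) ∈ R for all a."""
    return all((a, a) in R for a in elements)

def is_symmetric(R):
    """If (a, b) ∈ R then (b, a) ∈ R."""
    return all((b, a) in R for (a, b) in R)

def is_transitive(R):
    """If (a, b) ∈ R and (b, c) ∈ R then (a, c) ∈ R."""
    return all((a, d) in R for (a, b) in R for (c, d) in R if b == c)

# A toy "similarity" relation on three items.
items = {1, 2, 3}
R = {(1, 1), (2, 2), (3, 3), (1, 2), (2, 1)}

print(is_reflexive(R, items))  # -> True
print(is_symmetric(R))         # -> True
print(is_transitive(R))        # -> True
```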
Binary Relations and Graphs
Relations are often represented as Directed Graphs. If (a, b) ∈ R, we draw an arrow from node a to node b.
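A sketch of that representation as an adjacency list (the relation below reuses the toy pairs from the previous example):

```python
from collections import defaultdict

# The relation viewed as a set of directed edges.
R = {(1, 2), (2, 1), (1, 1), (3, 3)}

# Build an adjacency list: each node maps to the nodes it points to.
graph = defaultdict(set)
for a, b in R:
    graph[a].add(b)

for node in sorted(graph):
    print(node, "->", sorted(graph[node]))
# 1 -> [1, 2]
# 2 -> [1]
# 3 -> [3]
```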
4. Why this matters in Machine Learning
A. Data Preprocessing
When we perform "One-Hot Encoding" or handle categorical variables, we are mapping elements from a discrete set of categories into a numerical space.
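For instance, a minimal one-hot encoding sketch (the category set here is invented for illustration):

```python
# Map a discrete category set to one-hot vectors.
categories = sorted({"cat", "dog", "bird"})          # fix an order on the set
index = {label: i for i, label in enumerate(categories)}

def one_hot(label):
    """Return the one-hot vector for a label in the category set."""
    vec = [0] * len(categories)
    vec[index[label]] = 1
    return vec

print(one_hot("dog"))  # -> [0, 0, 1]  (order: bird, cat, dog)
```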
B. Knowledge Graphs
Modern AI often uses Knowledge Graphs (like those powering Google Search). These are massive sets of entities connected by relations (e.g., (Paris, is_capital_of, France)).
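At their core, such graphs can be modeled as a set of (subject, relation, object) triples; a toy sketch:

```python
# A tiny knowledge graph as a set of (subject, relation, object) triples.
kg = {
    ("Paris", "is_capital_of", "France"),
    ("France", "is_in", "Europe"),
    ("Berlin", "is_capital_of", "Germany"),
}

# Query: which entities are capitals, and of what?
capitals = {(s, o) for (s, r, o) in kg if r == "is_capital_of"}
print(capitals)  # -> {('Paris', 'France'), ('Berlin', 'Germany')} (order may vary)
```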
C. Formal Logic in AI
Sets and relations form the basis of predicate logic, which is used in "Symbolic AI" and for defining constraints in optimization problems.
Now that we can group objects into sets and relate them, we need to understand the logic that allows us to make valid inferences from these groups.