Introduction to Time Complexity
What is Time Complexity?
Time complexity is a way to measure how the runtime of an algorithm changes as the size of the input changes. It helps us understand and compare the efficiency of different algorithms.
Big O Notation
Big O notation is a mathematical notation used to describe the upper bound of an algorithm's runtime. It provides a worst-case scenario for the time complexity of an algorithm.
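Big O keeps only the dominant term and drops constant factors. A small illustrative sketch (the function name and the 3n + 5 operation count are made-up numbers for demonstration):

```python
def step_count(n):
    # Suppose an algorithm performs exactly 3n + 5 basic operations.
    return 3 * n + 5

# For large n, the constant factor 3 and the +5 are dwarfed by the growth
# of n itself, so the runtime is written as O(n), not O(3n + 5).
print(step_count(10))     # 35 operations
print(step_count(10**6))  # about 3 million operations
```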
Example
Q. Imagine a classroom of 100 students in which you gave your pen to one person. You have to find that pen without knowing to whom you gave it.
Here are some ways to find the pen and what the order is.
- O(n^2): You go and ask the first person in the class if they have the pen. Then you ask that person about each of the other 99 people in the classroom, and repeat this for every student. This is what we call O(n^2).
- O(n): Going and asking each student individually is O(n).
- O(log n): Now I divide the class into two groups, then ask: "Is it on the left side, or the right side of the classroom?" Then I take that group, divide it into two, and ask again, and so on. Repeat the process till you are left with one student who has your pen. This is what we mean by O(log n).
- The O(n^2) search works if only one student knows on which student the pen is hidden.
- The O(n) search works if one student had the pen and only they knew it.
- The O(log n) search works if all the students knew, but would only tell me if I guessed the right side.
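The difference in the number of questions can be sketched in a few lines of Python, using the example's class of 100 students:

```python
import math

class_size = 100  # the 100 students from the example

# O(n): asking each student individually takes up to 100 questions.
linear_questions = class_size

# O(log n): halving the class each time takes about log2(100) questions.
binary_questions = math.ceil(math.log2(class_size))

print(linear_questions)  # 100
print(binary_questions)  # 7
```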
Use Cases
| Input Length | Worst Accepted Time Complexity | Usual Type of Solutions |
|---|---|---|
| 10-12 | O(N!) | Recursion and backtracking |
| 15-18 | O(2^N * N) | Recursion, backtracking, and bit manipulation |
| 18-22 | O(2^N * N) | Recursion, backtracking, and bit manipulation |
| 30-40 | O(2^(N/2) * N) | Meet in the middle, Divide and Conquer |
| 100 | O(N^4) | Dynamic programming, Constructive |
| 400 | O(N^3) | Dynamic programming, Constructive |
| 2K | O(N^2 * log N) | Dynamic programming, Binary Search, Sorting, Divide and Conquer |
| 10K | O(N^2) | Dynamic programming, Graph, Trees, Constructive |
| 1M | O(N * log N) | Sorting, Binary Search, Divide and Conquer |
| 100M | O(N), O(log N), O(1) | Constructive, Mathematical, Greedy Algorithms |
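A rough way to read this table: a common rule of thumb (an assumption about typical time limits, not a guarantee) is that around 10^8 simple operations run in about a second. A quick sketch:

```python
# Rule-of-thumb feasibility check: ~10**8 simple operations per second.
# The budget value is an assumption; real limits depend on constant factors.
def fits_in_budget(operation_count, budget=10**8):
    return operation_count <= budget

n = 2000
print(fits_in_budget(n * n))    # O(N^2) at N = 2K: 4 million ops, fits
print(fits_in_budget(2 ** 40))  # O(2^N) at N = 40: ~10^12 ops, far too many
```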
Common Time Complexities
a) O(1) - Constant Time Complexity
The runtime does not change with the input size.
```python
def get_first_element(arr):
    return arr[0]
```
The time it takes to get the first element is constant, regardless of the size of the array.
b) O(log n) - Logarithmic Time Complexity
The runtime grows logarithmically with the input size.
```python
def binary_search(arr, target):
    low = 0
    high = len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1
```
The time it takes to find the target grows logarithmically with the size of the array.
c) O(n) - Linear Time Complexity
The runtime grows linearly with the input size.
```python
def print_elements(arr):
    for element in arr:
        print(element)
```
The time it takes to print all elements grows linearly with the size of the array.
d) O(n^2) - Quadratic Time Complexity
The runtime grows quadratically with the input size.
```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
```
The time it takes to sort the array grows quadratically with the size of the array.
e) O(2^n) - Exponential Time Complexity
The runtime doubles with each additional element in the input.
```python
def fibonacci(n):
    if n <= 1:
        return n
    else:
        return fibonacci(n - 1) + fibonacci(n - 2)
```
The time it takes to calculate the nth Fibonacci number doubles with each additional element.
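For contrast, caching results (memoization) eliminates the repeated subcalls and brings the runtime down to O(n); a minimal sketch using Python's `functools.lru_cache`:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fibonacci_memo(n):
    # Each value of n is computed only once, so the total work is O(n).
    if n <= 1:
        return n
    return fibonacci_memo(n - 1) + fibonacci_memo(n - 2)

print(fibonacci_memo(50))  # returns instantly; the plain version would not
```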
f) O(n!) - Factorial Time Complexity
The runtime grows factorially with the input size.
```python
def permutations(arr, l, r):
    if l == r:
        print(''.join(arr))
    else:
        for i in range(l, r + 1):
            arr[l], arr[i] = arr[i], arr[l]
            permutations(arr, l + 1, r)
            arr[l], arr[i] = arr[i], arr[l]  # backtrack
```
The time it takes to generate all permutations of the array grows factorially with the size of the array.
How to Calculate Time Complexity?
- Identify the basic operations in your algorithm (e.g., comparisons, assignments).
- Count the number of times these operations are executed in terms of the input size n.
- Express this count using Big O notation to simplify and generalize the complexity.
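As a worked example of these three steps (the function is illustrative), take counting the pairs in a list that sum to zero:

```python
def count_zero_pairs(arr):
    count = 0
    for i in range(len(arr)):          # outer loop: n iterations
        for j in range(len(arr)):      # inner loop: n iterations each time
            if arr[i] + arr[j] == 0:   # basic operation: one comparison
                count += 1
    return count

# The comparison executes n * n times, so the time complexity is O(n^2).
print(count_zero_pairs([1, -1, 2]))  # 2
```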
Time Complexity of Different Data Structures
- Array:
  - Access: O(1)
  - Search: O(n)
  - Insertion: O(n)
  - Deletion: O(n)
- Linked List:
  - Access: O(n)
  - Search: O(n)
  - Insertion: O(1)
  - Deletion: O(1)
- Hash Table:
  - Access: O(1) on average
  - Search: O(1) on average
  - Insertion: O(1) on average
  - Deletion: O(1) on average
- Binary Search Tree:
  - Access: O(log n) on average
  - Search: O(log n) on average
  - Insertion: O(log n) on average
  - Deletion: O(log n) on average
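These differences are easy to observe in Python, where a list search is O(n) but a set (hash-table) lookup is O(1) on average:

```python
data_list = list(range(1_000_000))
data_set = set(data_list)

# Both checks return True, but the list is scanned element by element (O(n)),
# while the set hashes the key and jumps to its bucket (O(1) on average).
print(999_999 in data_list)  # True, after scanning ~a million elements
print(999_999 in data_set)   # True, after one hash lookup on average
```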
Analyzing Algorithms with Time Complexity
- Linear Search:

```python
def linear_search(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1
```

Time Complexity: O(n)
- Binary Search:

```python
def binary_search(arr, target):
    low = 0
    high = len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1
```

Time Complexity: O(log n)
- Bubble Sort:

```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
```

Time Complexity: O(n^2)
- Merge Sort:

```python
def merge_sort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        left_half = arr[:mid]
        right_half = arr[mid:]
        merge_sort(left_half)
        merge_sort(right_half)
        i = j = k = 0
        while i < len(left_half) and j < len(right_half):
            if left_half[i] < right_half[j]:
                arr[k] = left_half[i]
                i += 1
            else:
                arr[k] = right_half[j]
                j += 1
            k += 1
        while i < len(left_half):
            arr[k] = left_half[i]
            i += 1
            k += 1
        while j < len(right_half):
            arr[k] = right_half[j]
            j += 1
            k += 1
```

Time Complexity: O(n log n)
- Fibonacci (Recursive):

```python
def fibonacci(n):
    if n <= 1:
        return n
    else:
        return fibonacci(n - 1) + fibonacci(n - 2)
```

Time Complexity: O(2^n)
Practical Tips
- Optimize Your Code: Aim to reduce the time complexity for better performance, especially for large inputs.
- Consider Both Time and Space Complexity: While time complexity is important, space complexity (the amount of memory used) is also crucial.
- Know Your Problem Domain: Some problems inherently require higher time complexity. Understanding the problem domain helps in choosing the right algorithm.
Resources to Learn More
- Books:
  - Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein
  - The Algorithm Design Manual by Steven S. Skiena
Understanding time complexity is essential for writing efficient algorithms and is a fundamental skill in computer science and software development. By analyzing and optimizing time complexity, you can ensure your programs run efficiently, even with large inputs.