Sprint 2 - Algorithmic Efficiency (Python)
Lesson on Algorithmic Efficiency in Python
- Example 1: O(1) – Constant Time
- YOUR TURN: Test O(1) Complexity
- Example 2: O(n) – Linear Time
- 🎯 YOUR TURN: Practice with find_max
- ⚙️ INTERACTIVE: Generate Random Lists
- Example 3: O(n²) – Quadratic Time
- ⚙️ Optimization: O(n) vs O(n²)
- # 🧠 Memory Complexity in Python
Algorithmic efficiency refers to how well an algorithm uses computational resources, primarily time and memory. Writing efficient code is crucial for handling large datasets and creating responsive applications.
Big O Notation
Big O notation describes how an algorithm's runtime or space requirements grow as the input size increases. It focuses on the worst-case scenario.
- O(1) – Constant time
- O(log n) – Logarithmic time
- O(n) – Linear time
- O(n log n) – Linearithmic time
- O(n²) – Quadratic time
- O(2ⁿ) – Exponential time
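To build intuition for how fast these classes diverge, here is a quick sketch that prints rough operation counts for each class as n grows (illustrative numbers, not measured runtimes):

```python
import math

# Rough operation counts per complexity class as n grows.
# These are illustrative values, not benchmarks.
for n in (10, 100, 1000):
    print(f"n={n:>5}: "
          f"O(1)=1, "
          f"O(log n)={math.log2(n):.1f}, "
          f"O(n)={n}, "
          f"O(n log n)={n * math.log2(n):.0f}, "
          f"O(n²)={n**2}")
```

Notice that at n = 1000, an O(n²) algorithm already does about a million "operations" while O(n log n) does about ten thousand.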
Example 1: O(1) – Constant Time
Operations that take the same amount of time regardless of the input size.
def get_first_element(my_list):
    """O(1) - Always takes the same time"""
    if len(my_list) > 0:
        return my_list[0]
    return None
# Test it
numbers = [1, 2, 3, 4, 5]
print(get_first_element(numbers))
large_list = list(range(1000000))
print(get_first_element(large_list))
YOUR TURN: Test O(1) Complexity
# Create your own lists of different sizes and test the function
# Does it take longer with bigger lists?
my_small_list = [] # Add some numbers here
my_large_list = [] # Add lots of numbers here
# Uncomment and test:
# print(get_first_element(my_small_list))
# print(get_first_element(my_large_list))
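If you want to check this empirically, the standard-library timeit module can compare indexing on a small and a very large list. This is a rough sketch; the exact timings vary by machine, but the two numbers should be about the same size:

```python
import timeit

small = list(range(10))
large = list(range(1_000_000))

# Indexing the first element is O(1): the timings should be similar
# regardless of list size (absolute numbers vary between machines).
t_small = timeit.timeit(lambda: small[0], number=100_000)
t_large = timeit.timeit(lambda: large[0], number=100_000)
print(f"small: {t_small:.4f}s, large: {t_large:.4f}s")
```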
Example 2: O(n) – Linear Time
The time grows proportionally with the input size.
def find_max(numbers):
    """O(n) - Must check every element"""
    if not numbers:
        return None
    max_val = numbers[0]
    for num in numbers:
        if num > max_val:
            max_val = num
    return max_val
# Test it
test_list = [3, 7, 2, 9, 1, 5]
print(f"Max value: {find_max(test_list)}")
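Note that Python's built-in max() performs the same O(n) scan (implemented in C), so in real code it is the idiomatic choice:

```python
test_list = [3, 7, 2, 9, 1, 5]

# max() also checks every element once -- O(n), same as find_max above,
# but it runs in C and is what you'd normally write.
print(f"Max value: {max(test_list)}")  # Max value: 9
```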
🎯 YOUR TURN: Practice with find_max
# Create 3 different lists and find their maximum values
list_1 = [] # Add your numbers
list_2 = [] # Add your numbers
list_3 = [] # Add your numbers
# Test your lists:
# print(f"Max of list 1: {find_max(list_1)}")
# print(f"Max of list 2: {find_max(list_2)}")
# print(f"Max of list 3: {find_max(list_3)}")
⚙️ INTERACTIVE: Generate Random Lists
import random
# Modify these parameters to create different test lists
list_size = int(input("Enter list size: "))
min_value = int(input("Enter minimum value: "))
max_value = int(input("Enter maximum value: "))
random_list = [random.randint(min_value, max_value) for _ in range(list_size)]
print(f"Generated list: {random_list[:10]}...") # Show first 10
print(f"Maximum value: {find_max(random_list)}")
O(n log n) typically shows up when sorting lists:
nums = [1, 2, 3, 4, 5]
nums.sort() # O(n log n)
print(nums)
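A quick note on the two sorting tools: list.sort() sorts in place and returns None, while sorted() returns a new list and accepts any iterable. Both use Timsort, which is O(n log n) in the worst case:

```python
nums = [5, 3, 1, 4, 2]

# sorted() returns a new list and leaves the original untouched.
new_list = sorted(nums)
print(new_list)   # [1, 2, 3, 4, 5]
print(nums)       # [5, 3, 1, 4, 2] -- unchanged

# list.sort() sorts in place and returns None.
nums.sort()
print(nums)       # [1, 2, 3, 4, 5]
```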
Example 3: O(n²) – Quadratic Time
Nested loops often result in quadratic time complexity.
def find_duplicates_slow(numbers):
    """O(n²) - Nested loop checks every pair"""
    duplicates = []
    for i in range(len(numbers)):
        for j in range(i + 1, len(numbers)):
            if numbers[i] == numbers[j] and numbers[i] not in duplicates:
                duplicates.append(numbers[i])
    return duplicates
# Test it
test_list = [1, 2, 3, 2, 4, 5, 3, 6]
print(f"Duplicates (slow): {find_duplicates_slow(test_list)}")
⚙️ Optimization: O(n) vs O(n²)
Let's optimize the duplicate finder using a set (hash table).
def find_duplicates_fast(numbers):
    """O(n) - Single pass with set"""
    seen = set()
    duplicates = set()
    for num in numbers:
        if num in seen:
            duplicates.add(num)
        else:
            seen.add(num)
    return list(duplicates)
# Test both versions
test_list = [1, 2, 3, 2, 4, 5, 3, 6]
print(f"Duplicates (slow): {find_duplicates_slow(test_list)}")
print(f"Duplicates (fast): {find_duplicates_fast(test_list)}")
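To see the difference in practice, you can time both versions on a larger input. This is a rough benchmark sketch (absolute numbers depend on your machine); the two functions are repeated here so the cell runs on its own:

```python
import random
import time

def find_duplicates_slow(numbers):
    """O(n²) - Nested loop checks every pair (same as above)."""
    duplicates = []
    for i in range(len(numbers)):
        for j in range(i + 1, len(numbers)):
            if numbers[i] == numbers[j] and numbers[i] not in duplicates:
                duplicates.append(numbers[i])
    return duplicates

def find_duplicates_fast(numbers):
    """O(n) - Single pass with a set (same as above)."""
    seen, duplicates = set(), set()
    for num in numbers:
        if num in seen:
            duplicates.add(num)
        else:
            seen.add(num)
    return list(duplicates)

# 2000 draws from only 501 possible values guarantees duplicates.
data = [random.randint(0, 500) for _ in range(2000)]

start = time.perf_counter()
slow_result = find_duplicates_slow(data)
print(f"O(n²): {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
fast_result = find_duplicates_fast(data)
print(f"O(n):  {time.perf_counter() - start:.3f}s")
```

Both versions find the same duplicates; the fast one just finds them in a single pass.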
# 🧠 Memory Complexity in Python
Now, we’ll explore how memory usage scales with data structures and algorithms in Python.
You’ll learn:
- What memory complexity means
- How to estimate it using Big-O notation
- How Python’s data structures affect memory
- How to measure memory usage practically
All through mini exercises and examples 🚀
📘 What is Memory Complexity?
Memory complexity (or space complexity) measures how much extra memory an algorithm uses as the input size increases.
For example:
- An algorithm that uses a single variable → O(1) memory
- An algorithm that stores all elements in a list → O(n) memory
We’ll compare examples and see how this works in practice.
# Example 1: O(1) Memory
def constant_space_sum(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total
print(constant_space_sum(10))
55
Explanation:
The function constant_space_sum only uses a few variables (total and i), no matter how large n is.
Thus, it uses O(1) additional memory.
Now, let’s look at a function that grows in memory with the input size.
# Example 2: O(n) Memory
def list_sum(n):
    nums = [i for i in range(1, n + 1)]  # creates a list of size n
    return sum(nums)
print(list_sum(10))
Explanation:
Here, we store all numbers in a list, which grows linearly with n.
This is O(n) memory complexity.
Try changing n to 1_000_000 and see if you notice a delay or increased memory use.
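You can measure this directly with the standard-library tracemalloc module. The sketch below compares building the full list against summing a lazy range(); peak numbers vary by Python version and platform, but the list version should peak far higher:

```python
import tracemalloc

n = 100_000

# O(n) memory: the whole list exists at once.
tracemalloc.start()
nums = [i for i in range(1, n + 1)]
list_total = sum(nums)
_, list_peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

# O(1) memory: range() yields values lazily, nothing is stored.
tracemalloc.start()
lazy_total = sum(range(1, n + 1))
_, lazy_peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"list peak: {list_peak:,} bytes")
print(f"lazy peak: {lazy_peak:,} bytes")
```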
O(n!) Run-Time Complexity
import itertools
def factorial_time(n):
    perms = list(itertools.permutations(range(n)))
    print(f"Total permutations: {len(perms)}")
factorial_time(5)
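Factorial growth explodes very quickly, which is why materializing all permutations becomes impractical even for modest n. math.factorial makes the growth visible without building anything:

```python
import math

# The number of permutations of n items is n!.
for n in (5, 10, 15):
    print(f"{n}! = {math.factorial(n):,}")
```

Already at n = 15 there are over a trillion permutations, far too many to list.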
O(2ⁿ) Run-Time Complexity
def exponential_time(n):
    if n == 0 or n == 1:
        return 1
    return exponential_time(n - 1) + exponential_time(n - 2)
print(exponential_time(10))
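The recursion above recomputes the same subproblems over and over. Caching results, for example with functools.lru_cache, brings the same recurrence down to O(n) time (a sketch, using the same base cases as exponential_time):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_fast(n):
    """Same recurrence as exponential_time, but each n is computed
    only once thanks to the cache -> O(n) time."""
    if n == 0 or n == 1:
        return 1
    return fib_fast(n - 1) + fib_fast(n - 2)

print(fib_fast(10))   # 89, same answer as exponential_time(10)
print(fib_fast(100))  # instant; the naive version would never finish
```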
O(n log n) Run-Time Complexity
def linearithmic_time(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = linearithmic_time(arr[:mid])
    right = linearithmic_time(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] < right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

print(linearithmic_time([5, 3, 8, 4, 2, 7, 1, 6]))
O(n²) Run-Time Complexity
def quadratic_time(n):
    for i in range(n):
        for j in range(n):
            print(i, j)
quadratic_time(5)
🧩 Mini Exercise 1
Write a function that returns the squares of numbers from 1 to n.
Try two versions:
- Using a list (higher memory usage)
- Using a generator (lower memory usage) (Hint: generators use the yield keyword to produce values one at a time as you iterate, instead of building the whole result; call me over for help if needed)
Compare their behavior.
💡 Key Insight:
- The list version stores all values at once in memory.
- The generator version produces values one at a time — much more memory efficient.
👉 Use generators when working with large datasets.
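One possible sketch of the two versions (try writing your own first). sys.getsizeof shows the size of the container object itself, which makes the difference visible:

```python
import sys

def squares_list(n):
    """Stores all n squares at once -> O(n) memory."""
    return [i * i for i in range(1, n + 1)]

def squares_gen(n):
    """Yields one square at a time -> O(1) memory."""
    for i in range(1, n + 1):
        yield i * i

n = 100_000
lst = squares_list(n)
gen = squares_gen(n)

# The list already holds every value; the generator object holds
# almost nothing until you iterate over it.
print(f"list object:      {sys.getsizeof(lst):,} bytes")
print(f"generator object: {sys.getsizeof(gen):,} bytes")
```

Both produce exactly the same values; only the memory behavior differs.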
✅ Observation: Different data structures have different memory footprints.
- Lists are generally larger due to dynamic resizing.
- Tuples are smaller and immutable.
- Sets have overhead for hashing.
# ⚡ Python Data Structures — Complexities & Examples
print("List (Dynamic Array) : Access O(1), Search O(n), Append O(1)*, Memory O(n)")
my_list = [1,2,3]; my_list.append(4)
print("Tuple (Immutable Sequence) : Access O(1), Search O(n), Memory O(n)")
my_tuple = (1,2,3); _ = my_tuple[0]
print("Dict (Hash Map) : Access O(1), Insert O(1), Delete O(1), Memory O(n)")
my_dict = {"a":1,"b":2}; my_dict["c"]=3
print("Set (Unique Collection) : Add/Search/Remove O(1), Memory O(n)")
my_set = {1,2,3}; my_set.add(4)
print("String (Immutable) : Access O(1), Concatenation O(n), Memory O(n)")
text = "hi"; text += " there"
print("Deque (Queue) : Append/Pop O(1), Search O(n), Memory O(n)")
from collections import deque
queue = deque([1,2,3]); queue.append(4); queue.popleft()
print("Stack (List) : Push/Pop O(1), Memory O(n)")
stack = []; stack.append(1); stack.pop()
print("Heap (Priority Queue) : Insert O(log n), Extract O(log n), Memory O(n)")
import heapq
heap = [3,1,4]; heapq.heapify(heap); heapq.heappush(heap,0)
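The practical payoff of these numbers is membership testing: the in operator is O(n) on a list but O(1) on average for a set. A quick comparison sketch (timings vary by machine, but the gap should be dramatic):

```python
import timeit

n = 100_000
as_list = list(range(n))
as_set = set(as_list)
target = n - 1  # worst case for the list: the last element

# List membership scans every element -> O(n).
t_list = timeit.timeit(lambda: target in as_list, number=1_000)
# Set membership hashes once -> O(1) on average.
t_set = timeit.timeit(lambda: target in as_set, number=1_000)

print(f"list 'in': {t_list:.4f}s")
print(f"set  'in': {t_set:.4f}s")
```

This is why the optimized duplicate finder earlier uses a set rather than a list for its seen collection.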