✅ Popcorn Hack – The Even/Odd Check Challenge

Best Two Strategies:

  1. Use the modulus operator (%) to check if the remainder when divided by 2 is 0
  2. Check if the last digit is 0, 2, 4, 6, or 8 manually

Explanation:
These methods are the most efficient because they have O(1) time complexity — they execute in constant time regardless of the input size. The modulus operator gives a direct mathematical check, and checking the last digit works because even numbers always end in even digits.
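As a quick sketch, both checks fit in a line or two (function names here are just illustrative):

```python
def is_even_mod(n: int) -> bool:
    # O(1): one modulo operation, regardless of how large n is
    return n % 2 == 0

def is_even_last_digit(n: int) -> bool:
    # O(1): inspect only the last decimal digit
    return abs(n) % 10 in (0, 2, 4, 6, 8)

print(is_even_mod(42), is_even_last_digit(42))  # both report even
print(is_even_mod(7), is_even_last_digit(7))    # both report odd
```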

🚀 Algorithm Efficiency – Full Notes

💡 Why Efficiency Matters

Efficient algorithms aren’t just nice-to-haves—they’re essential.

  • Performance & UX: Faster apps feel smoother. Like typing in a search bar and getting instant results.
  • Resource Constraints: Mobile devices and servers need to conserve memory, CPU, and energy.
  • Scalability: Algorithms must scale with data. What works for 10 elements might lag with 10,000.
  • Real-World Impact: Streaming services, real-time systems, and financial data all depend on efficient computation.

Efficiency Comparison Table:

| Aspect            | Benefit                 | Example             |
|-------------------|-------------------------|---------------------|
| Performance       | Faster execution        | Instant search      |
| Resource Usage    | Lower memory/CPU        | Mobile apps         |
| Scalability       | Handles large datasets  | Big data pipelines  |
| Real-World Impact | Cost savings, better UX | Streaming platforms |

⚙️ Understanding Algorithmic Efficiency

Algorithmic efficiency measures how well an algorithm uses time, space, and sometimes energy as input size grows.

  • Time Efficiency: Number of operations performed.
  • Space Efficiency: Extra memory needed.
  • Energy Efficiency: Less computation = longer battery life (important for mobile!).

Analogy: Choosing between a toll road (fast, expensive) and a free road (slow, cheap) mirrors algorithm trade-offs.


📈 Big O Notation – The Heart of Algorithm Analysis

Big O describes how runtime or memory grows with input size.

🔍 Key Concepts

  • Worst-Case Focus: Prepares for the most demanding scenario.
  • Constants Don’t Matter: O(n + 5) simplifies to O(n).
  • Comparison Tool: Compares algorithms regardless of hardware/language.

🧠 Common Time Complexities

| Big O      | Description       | Example           |
|------------|-------------------|-------------------|
| O(1)       | Constant time     | Array lookup      |
| O(log n)   | Logarithmic time  | Binary search     |
| O(n)       | Linear time       | Loop through list |
| O(n log n) | Linearithmic time | Merge sort        |
| O(n²)      | Quadratic time    | Bubble sort       |
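A rough back-of-the-envelope way to feel these differences is to count steps instead of timing (these are approximate step counts, not a benchmark):

```python
import math

n = 1000
approx_steps = {
    "O(1)":       1,                               # constant: one step
    "O(log n)":   math.ceil(math.log2(n)),         # ~10 for n = 1000
    "O(n)":       n,
    "O(n log n)": n * math.ceil(math.log2(n)),
    "O(n^2)":     n * n,                           # one million steps
}
for name, steps in approx_steps.items():
    print(f"{name:>10}: ~{steps:,} steps")
```

At n = 1000 the gap between O(log n) and O(n²) is already five orders of magnitude.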

🍿 Popcorn Hack – Even/Odd Check Challenge

Task: Check if a number is even or odd. Which TWO methods are most efficient?

Options:

  1. Divide by 2 and check if the result is whole.
  2. Check last digit manually.
  3. Use % (modulo) to check num % 2 == 0.
  4. Convert to string and check last char.
  5. Subtract 2 until 0 or 1.
  6. Generate all even numbers and check membership.

Best Options:

  • Use the modulus operator (%) – num % 2 == 0
  • Divide by 2 and check whether the result is a whole number
    ✅ Both are O(1) operations – constant time, independent of the size of the number.

✍️ Mini Challenge: Explain in two sentences why your two picks are efficient.


🔁 String Reversal: Speed vs. Memory

Method 1: Speed-Optimized

def speed_optimized_method(s):
    return s[::-1]

Method 2: Memory-Optimized

def memory_optimized_method(s):
    r = []
    for c in s:
        r.insert(0, c)
    return ''.join(r)

Benchmarking Code

import time, tracemalloc

def measure_time_and_memory(func, s):
    tracemalloc.start()
    start = time.perf_counter()
    func(s)
    end = time.perf_counter()
    current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return end - start, peak / (1024*1024)
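A possible way to drive this benchmark (a self-contained sketch; the input length of 10,000 characters is just an example):

```python
import random
import string
import time
import tracemalloc

def speed_optimized_method(s):
    return s[::-1]

def memory_optimized_method(s):
    r = []
    for c in s:
        r.insert(0, c)
    return ''.join(r)

def measure_time_and_memory(func, s):
    # Track peak allocations while the function runs
    tracemalloc.start()
    start = time.perf_counter()
    func(s)
    end = time.perf_counter()
    current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return end - start, peak / (1024 * 1024)

s = ''.join(random.choices(string.ascii_lowercase, k=10_000))
for f in (speed_optimized_method, memory_optimized_method):
    elapsed, peak_mb = measure_time_and_memory(f, s)
    print(f"{f.__name__}: {elapsed:.6f}s, peak {peak_mb:.3f} MiB")
```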

🧠 Takeaway:

  • Speed-Optimized: O(n) time; slicing allocates a full reversed copy, so peak memory is higher.
  • Memory-Optimized: avoids the up-front copy, but each insert(0, c) shifts every element already in the list, making the loop O(n²) overall – noticeably slower on long strings.

🔍 Searching Algorithms

Linear Search – O(n)

def linear_search(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1

Binary Search – O(log n)

def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1
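For reference, Python's standard library offers the same O(log n) lookup through the `bisect` module (an alternative to, not a replacement for, the hand-written version above; the wrapper name is just illustrative):

```python
import bisect

def binary_search_bisect(arr, target):
    # bisect_left returns the leftmost insertion point for target in sorted arr
    i = bisect.bisect_left(arr, target)
    if i < len(arr) and arr[i] == target:
        return i
    return -1

data = [1, 3, 5, 7, 9, 11]
print(binary_search_bisect(data, 7))   # -> 3
print(binary_search_bisect(data, 4))   # -> -1 (not found)
```

Like the manual version, it requires the input list to be sorted.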

🔃 Sorting Algorithms – A Comparative Analysis

Bubble Sort – O(n²)

def bubble_sort(arr):
    for i in range(len(arr)):
        for j in range(0, len(arr) - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]

Merge Sort – O(n log n)

def merge_sort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        left = arr[:mid]
        right = arr[mid:]
        merge_sort(left)
        merge_sort(right)

        i = j = k = 0
        while i < len(left) and j < len(right):
            if left[i] < right[j]:
                arr[k] = left[i]
                i += 1
            else:
                arr[k] = right[j]
                j += 1
            k += 1
        while i < len(left):
            arr[k] = left[i]
            i += 1
            k += 1
        while j < len(right):
            arr[k] = right[j]
            j += 1
            k += 1

🧠 Why Merge Sort > Bubble Sort:
Merge sort reduces redundant comparisons by sorting sublists and merging them. Bubble sort checks every adjacent pair repeatedly.
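One way to make this concrete is to count comparisons directly. Below is a sketch that instruments both sorts with a counter (the counter is added here for illustration; the sorting logic mirrors the versions above):

```python
import random

def bubble_sort_count(arr):
    comparisons = 0
    n = len(arr)
    for i in range(n):
        for j in range(n - i - 1):
            comparisons += 1
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return comparisons

def merge_sort_count(arr):
    if len(arr) <= 1:
        return 0
    mid = len(arr) // 2
    left, right = arr[:mid], arr[mid:]
    comparisons = merge_sort_count(left) + merge_sort_count(right)
    i = j = k = 0
    while i < len(left) and j < len(right):
        comparisons += 1
        if left[i] < right[j]:
            arr[k] = left[i]
            i += 1
        else:
            arr[k] = right[j]
            j += 1
        k += 1
    arr[k:] = left[i:] + right[j:]  # copy whichever half has leftovers
    return comparisons

data = random.sample(range(10_000), 500)
print("bubble:", bubble_sort_count(data.copy()))  # ~n(n-1)/2 comparisons
print("merge: ", merge_sort_count(data.copy()))   # ~n log n comparisons
```

For 500 elements, bubble sort always makes 124,750 comparisons, while merge sort stays in the low thousands.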


🧪 Diagram: Algorithm Efficiency Overview

+----------------------+       +------------------------+
|  Input Data Size     | --->  |   Algorithm Efficiency |
+----------------------+       +------------------------+
          |                                |
          v                                v
    [Time Complexity]              [Space Complexity]
          |                                |
          v                                v
   e.g., O(n), O(log n)             e.g., O(1), O(n)

Use this diagram as a mental map when evaluating or designing algorithms.

🍿 Popcorn Hack 2

import time
import random

# Generate a large sorted list
data_size = 20000000
sorted_data = sorted(random.sample(range(100000000), data_size))

# Target to find (worst case for linear search)
target = sorted_data[-1]  # Last element

# O(n) - Linear Search
def linear_search(arr, target):
    for i, element in enumerate(arr):
        if element == target:
            return i
    return -1

# O(log n) - Binary Search
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    
    return -1

# Compare performance
print("Testing with data size:", data_size)

start = time.time()
linear_result = linear_search(sorted_data, target)
linear_time = time.time() - start
print(f"Linear search: {linear_time:.6f} seconds")

start = time.time()
binary_result = binary_search(sorted_data, target)
binary_time = time.time() - start
print(f"Binary search: {binary_time:.6f} seconds")

print(f"Binary search is approximately {linear_time/binary_time:.0f}x faster")
Output:

Testing with data size: 20000000
Linear search: 3.420149 seconds
Binary search: 0.000094 seconds
Binary search is approximately 36409x faster

Homework Hack 1

Explanation

In this homework, we’re comparing the efficiency of two sorting algorithms: Bubble Sort and Merge Sort.

Bubble Sort

Bubble Sort is a simple algorithm where each element is compared with its neighbor, and if they are in the wrong order, they are swapped. This process repeats until the entire list is sorted. The time complexity of Bubble Sort is O(n²), where n is the number of elements in the list. As a result, it becomes inefficient for larger datasets because it performs a lot of unnecessary comparisons.

Merge Sort

Merge Sort is a divide-and-conquer algorithm. It divides the list into two halves, recursively sorts each half, and then merges the sorted halves back together. The time complexity of Merge Sort is O(n log n), which makes it much more efficient than Bubble Sort for larger datasets. The recursive nature of Merge Sort allows it to perform fewer comparisons by breaking the problem into smaller parts.

Comparison of Performance

In the code:

  1. We generate a random list of 100 integers between 1 and 1000.
  2. We time how long each algorithm takes to sort the list using time.time().
  3. After sorting, we output the time taken by each algorithm and compare them:
    • Bubble Sort is slower because of its O(n²) time complexity.
    • Merge Sort is faster due to its O(n log n) time complexity.

Why Merge Sort is Faster than Bubble Sort

  • Merge Sort consistently outperforms Bubble Sort because it reduces the number of comparisons and swaps needed to sort a list. The O(n log n) time complexity means that as the size of the dataset grows, the increase in time is much slower compared to the O(n²) complexity of Bubble Sort.
  • Bubble Sort, on the other hand, performs many unnecessary comparisons, making it inefficient for larger lists.

By running this code, you’ll see that Merge Sort is faster, especially with larger datasets, due to its more efficient algorithmic design.

import time
import random

# Bubble Sort Algorithm
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n-i-1):
            if arr[j] > arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]

# Merge Sort Algorithm
def merge_sort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        left = arr[:mid]
        right = arr[mid:]

        merge_sort(left)
        merge_sort(right)

        i = j = k = 0
        while i < len(left) and j < len(right):
            if left[i] < right[j]:
                arr[k] = left[i]
                i += 1
            else:
                arr[k] = right[j]
                j += 1
            k += 1
        while i < len(left):
            arr[k] = left[i]
            i += 1
            k += 1
        while j < len(right):
            arr[k] = right[j]
            j += 1
            k += 1

# Generate a random list of numbers
arr = [random.randint(1, 1000) for _ in range(100)]

# Time Bubble Sort
start = time.time()
bubble_sort(arr.copy())  # Using copy to avoid modifying original list
bubble_sort_time = time.time() - start

# Time Merge Sort
start = time.time()
merge_sort(arr.copy())  # Using copy to avoid modifying original list
merge_sort_time = time.time() - start

# Output the results
print(f"Bubble Sort took {bubble_sort_time:.6f} seconds.")
print(f"Merge Sort took {merge_sort_time:.6f} seconds.")

# Compare and output which is faster
if bubble_sort_time < merge_sort_time:
    print("Bubble Sort is faster.")
else:
    print("Merge Sort is faster.")
Output:

Bubble Sort took 0.000436 seconds.
Merge Sort took 0.000191 seconds.
Merge Sort is faster.

Homework Hack 2

import random

# Generate a sorted list of 100,000 numbers
sorted_list = list(range(1, 100001))

# Pick a random target number from the list
target = random.choice(sorted_list)

# Linear Search
def linear_search(arr, target):
    count = 0
    for i in range(len(arr)):
        count += 1
        if arr[i] == target:
            return count  # Return the number of comparisons made
    return -1

# Binary Search
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    count = 0
    while left <= right:
        count += 1
        mid = (left + right) // 2
        if arr[mid] == target:
            return count  # Return the number of comparisons made
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

# Perform searches and print results
linear_comparisons = linear_search(sorted_list, target)
binary_comparisons = binary_search(sorted_list, target)

print(f"Linear search made {linear_comparisons} comparisons.")
print(f"Binary search made {binary_comparisons} comparisons.")
Output:

Linear search made 29887 comparisons.
Binary search made 17 comparisons.

Homework Hack #2: Search Race – Code Edition

Explanation:

  • Linear Search checks each element one-by-one, resulting in a time complexity of O(n), meaning it takes longer as the list size grows. In contrast, Binary Search divides the list in half each time, achieving O(log n) time complexity, making it much faster for large datasets, especially when the list is sorted.
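To see the O(log n) growth directly, one can run the comparison-counting binary search from the homework code across a few list sizes (redefined here so the snippet runs on its own):

```python
def binary_search_count(arr, target):
    left, right = 0, len(arr) - 1
    count = 0
    while left <= right:
        count += 1
        mid = (left + right) // 2
        if arr[mid] == target:
            return count
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return count  # target absent: still only ~log2(n) comparisons

# Worst case for binary search: searching near the end of the list
for n in (1_000, 100_000, 10_000_000):
    arr = list(range(n))
    print(f"n = {n:>10,}: {binary_search_count(arr, n - 1)} comparisons")
```

Multiplying the list size by 100 adds only a handful of comparisons, which is exactly the logarithmic behavior the explanation above describes.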