Competitive programming in Python

The very basics for LeetCode success using Python.
Author

Oleguer Canal

Published

August 1, 2024

“One LeetCode a day keeps unemployment away”

Someone wise

When I became jobless I decided that I’d do a couple of competitive programming problems a day to keep my brain from rotting 🧠.

1 Despite internet experts recommending around 27 hours/day (source)

2 I used to do them with C++ but now I wanna do them with Python.

I was initially sampling problems at random, without much thought. Until today, when I thought: “wait, what if I plan this minimally?” So I decided to structure and update my knowledge a bit :)

Yes, yes. I know what you’re thinking: LeetCode is useless, it’s not real life, it’s a hiring-process red flag, blah blah.

I think training logical thinking is important. And I have a lot of fun doing them (for me, it is analogous to doing crosswords or similar): the fulfillment I get from completing one is usually greater than what I get from watching a TV show episode.

Data structures

These are the main data structures needed along with their main O(1) operations.

List

Figure 1: List

They call it list, but it is implemented as a dynamic array. This means the append operation runs in amortized constant time: when the underlying array needs to be relocated, appending costs O(n), but it averages out to constant over many appends.

This data structure is recommended for random-access and fixed-length operations.
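A minimal sketch of the typical costs (the specific values are just illustrative):

```python
nums = [10, 20, 30]

nums.append(40)    # amortized O(1): append at the end
last = nums.pop()  # O(1): pop from the end
third = nums[2]    # O(1): random access by index

nums.insert(0, 5)  # O(n): every element shifts one slot to the right
```

Inserting or popping anywhere but the end forces that shift, which is why lists shine for stack-like and random-access workloads.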

Set & Map

Figure 2: Set

They are implemented as hash tables. Meaning that there is a collision risk and a worst-case scenario of O(n) access.

Similar to dynamic arrays, the table starts with ~8 buckets and dynamically grows when needed to keep the load factor around 2/3. When this happens, all elements are re-allocated, costing O(n). The idea is to balance the number of collisions against the amount of memory allocated for buckets.

3 There is no need to re-compute the hashes when resizing, as they are cached.
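A minimal sketch of the average-O(1) operations both structures offer (the strings and counts are just illustrative):

```python
seen = set()
seen.add("a")            # average O(1) insert
seen.add("b")
print("a" in seen)       # average O(1) membership test

counts = {}              # dict: hash table mapping keys to values
for ch in "banana":
    counts[ch] = counts.get(ch, 0) + 1  # average O(1) get/set
print(counts)            # {'b': 1, 'a': 3, 'n': 2}
```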

Deque

Figure 3: Deque

It is implemented as a doubly-linked list (of fixed-size blocks, in CPython). Its main strengths are the constant-time appends and pops at both ends, at the cost of linear-time access to arbitrary positions.

It is useful to implement a stack (LIFO) and a queue (FIFO). Those are key for BFS and DFS algorithms.

from collections import deque

window = deque(maxlen=3)  # keep last 3 items
for x in [1, 2, 3, 4, 5]:
    window.append(x)
print(list(window)) # [3, 4, 5]
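The stack/queue duality mentioned above can be sketched with the same type, differing only in which end you pop from:

```python
from collections import deque

stack = deque()             # LIFO: push and pop on the same end
stack.append(1)
stack.append(2)
print(stack.pop())          # 2 (last in, first out)

queue = deque()             # FIFO: push on one end, pop on the other
queue.append(1)
queue.append(2)
print(queue.popleft())      # 1 (first in, first out)
```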

Heap

Figure 4: Heap

It is implemented as a binary min-heap laid out in a flat list: a binary tree in which every parent is smaller than (or equal to) its children.

Heaps show up in search problems such as Dijkstra’s shortest-path algorithm (as a priority queue) and in sorting, as in heap-sort.

import heapq

data = [7, 2, 5, 1, 9]
heapq.heapify(data)      # linear-time, constant-space (Floyd's algorithm)
heapq.heappush(data, 3)  # O(log n) insert
print(heapq.heappop(data))  # 1 (the smallest element)

Algorithmic paradigms

Brute Force

Enumerate all possibilities until one satisfies the given condition. Simple to reason about, usually inefficient, and serves as a baseline or fallback when input sizes are small or constraints are tight enough to allow it.
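As a toy illustration (the `two_sum_brute` name and inputs are mine, not from any particular problem statement), here is two-sum solved by checking every pair:

```python
def two_sum_brute(nums, target):
    """Try every pair until one sums to target. O(n^2) time."""
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return (i, j)
    return None  # no pair found

print(two_sum_brute([3, 1, 4, 2], 6))  # (2, 3)
```

On small inputs this baseline is perfectly fine, and it doubles as a checker for a cleverer O(n) hash-table solution.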

Divide and Conquer / Recursion

Break the problem into smaller instances of the same problem, solve each recursively, and combine their results. It often yields logarithmic or n·log(n) behavior when the combination is efficient.

  • Merge sort / quicksort (sorting by splitting and merging or partitioning)
  • Binary search (recursively halving the search space)
  • Computing large exponentiation via exponentiation by squaring
  • Closest pair of points in plane (divide space, solve subproblems, merge)
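A minimal sketch of the second bullet, binary search written recursively (the argument defaults are just one way to set it up):

```python
def binary_search(arr, target, lo=0, hi=None):
    """Recursively halve the sorted search space. O(log n) time."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1  # empty range: not found
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)  # recurse right
    return binary_search(arr, target, lo, mid - 1)      # recurse left

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```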

Dynamic Programming / Memoization

Solve problems with overlapping subproblems by storing previously computed results to avoid redundant work; can be top-down (memoization) or bottom-up.

  • Fibonacci numbers with caching
  • Longest common subsequence / longest increasing subsequence
  • Knapsack problem (0/1 knapsack)
  • Edit distance between strings
  • Counting number of ways to climb stairs or unique grid paths
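The first bullet in one sketch: top-down Fibonacci, where `functools.lru_cache` does the memoization for us:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Each subproblem is computed once and cached: O(n) instead of O(2^n)."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
```

The bottom-up equivalent would fill an array from `fib(0)` upward; same work, no recursion.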

Backtracking / Branch and Bound

Explore solution space incrementally, reverting (“backtracking”) when a partial solution violates constraints. Branch and bound adds pruning by estimating bounds to skip entire subtrees that can’t beat current best.

  • N-queens placement
  • Sudoku solver
  • Generating all valid parentheses combinations
  • Constraint satisfaction (e.g., coloring a graph with limited colors)
  • Traveling Salesman with bounding to prune suboptimal tours
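A sketch of the valid-parentheses bullet: grow a partial string and only branch into states that can still lead to a valid answer (the pruning conditions are the whole trick):

```python
def valid_parens(n):
    """Backtracking: extend a partial string, pruning invalid branches."""
    out = []

    def build(s, opened, closed):
        if len(s) == 2 * n:
            out.append(s)          # complete, valid string
            return
        if opened < n:             # can still open a parenthesis
            build(s + "(", opened + 1, closed)
        if closed < opened:        # can only close what is open
            build(s + ")", opened, closed + 1)

    build("", 0, 0)
    return out

print(valid_parens(3))
# ['((()))', '(()())', '(())()', '()(())', '()()()']
```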

Graph/Tree Traversal

Systematic visiting of nodes in a graph or tree, foundational for exploring structure, connectivity, and enabling many higher-level algorithms.

  • Depth-first search for connected components or cycle detection
  • Breadth-first search for shortest path in unweighted graphs
  • Topological sort of a DAG
  • Checking if a graph is bipartite
  • Tree traversals (in-order, pre-order, post-order)
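A sketch of BFS computing shortest edge-count distances in an unweighted graph, tying back to the deque section (the adjacency-list graph below is made up for illustration):

```python
from collections import deque

def bfs_dist(graph, start):
    """BFS from start: dist[v] = fewest edges from start to v."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()          # FIFO order = level by level
        for nxt in graph[node]:
            if nxt not in dist:         # first visit is the shortest path
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return dist

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs_dist(graph, "A"))  # {'A': 0, 'B': 1, 'C': 1, 'D': 2}
```

Swapping `popleft()` for `pop()` turns the same skeleton into an iterative DFS.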

Common algorithms

Coming soon

Tips & tricks

Coming soon