Elevating Python with 7 Functional Programming Techniques
Python sits comfortably in the multi-paradigm camp. While it’s famously object-oriented, it borrows heavily from functional programming. Embracing these functional patterns isn't just about writing shorter code; it’s about making your data transformations predictable and your logic easier to test. Let’s break down how to apply these concepts effectively.
Recursion and Structural Pattern Matching
To make recursive logic cleaner, use Structural Pattern Matching (introduced in Python 3.10). Instead of writing if-else blocks to check for empty lists or single elements, you use match and case to match the shape of your data:

def quicksort(data: list[int]) -> list[int]:
    match data:
        case []:
            return []
        case [pivot, *rest]:
            left = quicksort([x for x in rest if x <= pivot])
            right = quicksort([x for x in rest if x > pivot])
            return left + [pivot] + right
Immutability and Pure Functions
In functional programming, data is immutable. You don't change a list; you create a new one. This eliminates side effects where a function accidentally modifies a variable used elsewhere. A Pure Function always returns the same output for the same input and never touches the outside world (like global variables or console output).
When writing functions like bubble_sort, avoid in-place modification. Create a copy of the input data first using data.copy() or slicing (data[:]). This ensures that the original dataset remains untouched, making your code thread-safe and much easier to debug.
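As a minimal sketch of this idea (the implementation details here are illustrative), a pure bubble_sort copies the input before sorting:

```python
def bubble_sort(data: list[int]) -> list[int]:
    # Copy the input so the caller's list is never mutated
    result = data.copy()
    n = len(result)
    for i in range(n):
        for j in range(n - i - 1):
            if result[j] > result[j + 1]:
                # Swap adjacent out-of-order elements
                result[j], result[j + 1] = result[j + 1], result[j]
    return result
```

Because the function never touches its argument, calling it twice with the same list always yields the same result, and the original data survives intact.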
Higher-Order Functions and Partial Application
A higher-order function either takes a function as an argument or returns one. You likely already use map or filter, but functools.partial deserves just as much attention. It allows you to "pre-fill" arguments of a function to create a new, specialized function.
from functools import partial

def multiply(x: int, y: int) -> int:
    return x * y

double = partial(multiply, 2)
print(double(10))  # Prints 20
Function Composition and Lazy Evaluation
When you have multiple transformations, nesting them—sort(add_ten(multiply_by_two(data)))—becomes unreadable. Function Composition allows you to combine these into a single pipeline. By using reduce from functools, you can apply a sequence of functions to a data point systematically.
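A minimal compose helper built on reduce might look like this (the helper name and the sample functions are illustrative, not a standard library API):

```python
from functools import reduce
from typing import Callable

def compose(*funcs: Callable) -> Callable:
    # Apply the functions left to right: compose(f, g)(x) == g(f(x))
    return lambda x: reduce(lambda acc, f: f(acc), funcs, x)

def multiply_by_two(n: int) -> int:
    return n * 2

def add_ten(n: int) -> int:
    return n + 10

pipeline = compose(multiply_by_two, add_ten)
print(pipeline(5))  # (5 * 2) + 10 = 20
```

Reading the pipeline left to right mirrors the order the data actually flows through, which is much easier to follow than deeply nested calls.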
Finally, use Lazy Evaluation to handle large datasets. Instead of calculating every value immediately and storing it in memory, use Generators and the yield keyword. This defers the computation until the exact moment the value is needed, which is a massive performance win for heavy data processing.
from typing import Iterator

def lazy_multiplier(data: Iterator[int]) -> Iterator[int]:
    for item in data:
        yield item * 2
Syntax Notes and Best Practices
When working with functional Python, lean on the typing module. Use Callable to define function signatures and Iterator for lazy sequences. Always group your side effects—like print() or database writes—together at the edges of your program (like the main block). This keeps the core logic "pure" and your software architecture robust.
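A minimal sketch of that layout (the function names here are hypothetical) keeps the transformation pure and confines printing to the program's entry point:

```python
from typing import Callable, Iterable

# Pure core: no I/O, always the same output for the same input
def apply_all(func: Callable[[int], int], values: Iterable[int]) -> list[int]:
    return [func(v) for v in values]

def main() -> None:
    # Side effects (printing) live only at the edge of the program
    results = apply_all(lambda n: n * n, [1, 2, 3])
    print(results)  # [1, 4, 9]

if __name__ == "__main__":
    main()
```

Because apply_all never prints or mutates anything, it can be unit-tested with a plain assertion, while main remains the single place where output happens.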
