Stop Hardcoding Everything: Master Dependency Injection in Python

Breaking Free from Fragile Code

Hardcoded logic is the silent killer of maintainable software. When you bake specific behaviors directly into a class, you create a rigid structure that resists change. If your data pipeline only knows how to load from a CSV file because the pd.read_csv call is buried inside a method, you are stuck. The moment a requirement shifts—say, you need to pull from a SQL database or an S3 bucket—you have to perform surgery on the class itself. This violates the Open-Closed Principle and makes unit testing a nightmare. You cannot test the pipeline logic in isolation because the database connection or file system dependency is "baked in."

Dependency injection (DI) solves this by shifting the responsibility of creating dependencies from the object that uses them to the code that calls it. Instead of a class looking for its tools, you provide the tools upon initialization. This simple shift in perspective turns brittle, monolithic blocks of code into a collection of swappable, modular components.

Prerequisites and Toolkit


To implement these patterns effectively, you should be comfortable with Python 3.10+ fundamentals, specifically classes and type hinting. Familiarity with functional programming concepts like first-class functions and closures will help when we move into manual injection techniques.

Key Libraries & Tools

  • Typing Module: Provides Callable, Protocol, and Any for defining interfaces.
  • FastAPI: A modern web framework that includes a built-in Dependency Injection system.
  • Thesys C1: A generative UI API (featured sponsor) that demonstrates how external services are integrated into modern backends.

Refactoring to Manual Injection

We start by extracting hardcoded methods into standalone functions or objects. By passing these functions as arguments, we transform standard methods into higher-order functions.

from typing import Callable

def load_data_from_csv() -> list[dict]:
    return [{"name": "Arjan", "id": 1}]

class DataPipeline:
    def run(self, loader: Callable[[], list[dict]]):
        data = loader()
        print(f"Processing {data}")

# Usage
pipeline = DataPipeline()
pipeline.run(loader=load_data_from_csv)
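The payoff is that swapping the data source requires no changes to DataPipeline at all. A minimal sketch—the load_data_from_api function below is an illustrative stand-in, not from the original:

```python
from typing import Callable

def load_data_from_csv() -> list[dict]:
    return [{"name": "Arjan", "id": 1}]

# Hypothetical second source: same signature, different origin.
def load_data_from_api() -> list[dict]:
    return [{"name": "Api", "id": 2}]

class DataPipeline:
    def run(self, loader: Callable[[], list[dict]]) -> list[dict]:
        data = loader()
        print(f"Processing {data}")
        return data

pipeline = DataPipeline()
pipeline.run(loader=load_data_from_csv)
pipeline.run(loader=load_data_from_api)  # swapped without touching the class
```

Because both loaders share the `Callable[[], list[dict]]` shape, the pipeline never needs to know which one it received.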

While functional injection is elegant for simple scripts, a class-based approach using Protocols offers more robust architectural guardrails. Protocols allow for structural subtyping—you define the shape of an object (e.g., it must have a .load() method) without requiring it to inherit from a specific base class. This keeps your pipeline decoupled from the concrete implementation of the loader.

from typing import Protocol

class Loader(Protocol):
    def load(self) -> list[dict]: ...

class CSVLoader:
    def load(self) -> list[dict]:
        return [{"data": "from_csv"}]

class DataPipeline:
    def __init__(self, loader: Loader):
        self.loader = loader

    def run(self):
        data = self.loader.load()
        # process data...
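This structural decoupling is exactly what makes unit testing painless: any object with a matching .load() method satisfies the protocol, with no inheritance required. A sketch—the FakeLoader name is illustrative:

```python
from typing import Protocol

class Loader(Protocol):
    def load(self) -> list[dict]: ...

class DataPipeline:
    def __init__(self, loader: Loader):
        self.loader = loader

    def run(self) -> list[dict]:
        return self.loader.load()

# Satisfies Loader structurally -- it never imports or subclasses it.
class FakeLoader:
    def load(self) -> list[dict]:
        return [{"data": "fake"}]

pipeline = DataPipeline(FakeLoader())
assert pipeline.run() == [{"data": "fake"}]
```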

Building a Custom DI Container

In larger systems, manual wiring in the main() function becomes verbose. A DI container acts as a registry for your dependencies. It manages the lifecycle of objects, deciding whether to return a new instance or a cached singleton.

class Container:
    def __init__(self):
        self.providers = {}
        self.singletons = {}

    def register(self, name, provider, is_singleton=False):
        self.providers[name] = (provider, is_singleton)

    def resolve(self, name):
        if name in self.singletons:
            return self.singletons[name]
        
        provider, is_singleton = self.providers[name]
        instance = provider()
        
        if is_singleton:
            self.singletons[name] = instance
        return instance

# Wiring it up
container = Container()
container.register("loader", CSVLoader, is_singleton=True)
container.register("pipeline", lambda: DataPipeline(container.resolve("loader")))

pipeline = container.resolve("pipeline")
pipeline.run()

This container allows you to centralize your configuration. You could even swap providers based on environment variables or a JSON config file, allowing the application to change behavior without changing a single line of business logic code.
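As one sketch of environment-driven wiring, the provider can be chosen from an environment variable at registration time. The APP_ENV variable and SQLLoader class below are illustrative assumptions, not part of the original:

```python
import os

class CSVLoader:
    def load(self) -> list[dict]:
        return [{"source": "csv"}]

class SQLLoader:  # hypothetical alternative backend
    def load(self) -> list[dict]:
        return [{"source": "sql"}]

class Container:
    def __init__(self):
        self.providers = {}
        self.singletons = {}

    def register(self, name, provider, is_singleton=False):
        self.providers[name] = (provider, is_singleton)

    def resolve(self, name):
        if name in self.singletons:
            return self.singletons[name]
        provider, is_singleton = self.providers[name]
        instance = provider()
        if is_singleton:
            self.singletons[name] = instance
        return instance

container = Container()
# Pick the provider from the environment; the business logic never changes.
loader_cls = SQLLoader if os.environ.get("APP_ENV") == "prod" else CSVLoader
container.register("loader", loader_cls, is_singleton=True)

print(container.resolve("loader").load())
```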

Syntax Notes and Conventions

Python is uniquely suited for DI because functions are first-class objects. You don't always need a heavy framework. Using lambda functions for delayed execution is a common pattern when a dependency requires runtime arguments (like a filename) that the container doesn't know about yet. Additionally, the use of typing.Protocol is preferred over abc.ABC because it promotes loose coupling; any class that happens to have the right method names satisfies the protocol.
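The lambda pattern looks like this in practice—the path-taking CSVLoader variant here is a hypothetical example of a dependency needing a runtime argument:

```python
class CSVLoader:
    # Hypothetical variant that needs a runtime argument.
    def __init__(self, path: str):
        self.path = path

    def load(self) -> list[dict]:
        return [{"path": self.path}]

providers = {
    # The lambda defers construction until resolve time,
    # closing over the path the container itself never sees.
    "loader": lambda: CSVLoader("data.csv"),
}

loader = providers["loader"]()  # constructed only here, not at registration
print(loader.load())
```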

Practical Examples and Frameworks

FastAPI demonstrates the peak of DI utility. It uses its Depends() function to handle database sessions. This ensures that every route gets a fresh session that is automatically closed after the request, keeping the endpoint code clean and focused only on the logic of the API.
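FastAPI's yield-based dependencies follow a setup/teardown shape that can be sketched in plain Python with a context manager—the Session class here is a stand-in for a real database session, not FastAPI code:

```python
from contextlib import contextmanager

class Session:
    # Illustrative stand-in for a database session.
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

@contextmanager
def get_session():
    # Same shape as a FastAPI yield dependency:
    # setup before yield, teardown after the request finishes.
    session = Session()
    try:
        yield session
    finally:
        session.close()

with get_session() as session:
    assert session.closed is False  # fresh session inside the "request"
# teardown ran automatically on exit
assert session.closed is True
```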

DI is also essential in machine learning pipelines. You might want to swap an IncompleteDataTransformer for a StandardScaler during an experiment. By injecting these as components, you can run multiple versions of a pipeline simply by changing the injection script.
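A sketch of that experiment setup—the transformer classes below are simplified stand-ins for real ones like StandardScaler, not from the original:

```python
from typing import Protocol

class Transformer(Protocol):
    def transform(self, rows: list[float]) -> list[float]: ...

# Illustrative stand-ins for real transformers.
class IdentityTransformer:
    def transform(self, rows: list[float]) -> list[float]:
        return rows

class MinMaxTransformer:
    def transform(self, rows: list[float]) -> list[float]:
        lo, hi = min(rows), max(rows)
        return [(r - lo) / (hi - lo) for r in rows]

class MLPipeline:
    def __init__(self, transformer: Transformer):
        self.transformer = transformer

    def run(self, rows: list[float]) -> list[float]:
        return self.transformer.transform(rows)

# Swap experiments by changing only the injection, never the pipeline.
print(MLPipeline(MinMaxTransformer()).run([0.0, 5.0, 10.0]))  # → [0.0, 0.5, 1.0]
```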

Tips and Gotchas

Avoid over-engineering. If you are writing a 50-line script, a DI container is overkill. Just pass the function. A common mistake is "Interface Bloat," where you define protocols for everything even when there is only ever one implementation. Only introduce abstraction when you actually need to swap the behavior—usually for testing or supporting different storage backends. Finally, remember that Python does not enforce type hints at runtime. If you inject the wrong object, it will only fail when the method is called, so back your DI architecture with a solid suite of unit tests.
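To make that failure mode concrete, here is a small sketch showing that a badly typed dependency passes construction silently and only blows up at call time:

```python
class DataPipeline:
    def __init__(self, loader):
        self.loader = loader

    def run(self):
        return self.loader.load()

# Construction succeeds even though the dependency is wrong...
pipeline = DataPipeline(loader="not a loader")

try:
    pipeline.run()  # ...the failure only surfaces here, at call time
except AttributeError as exc:
    print(f"Caught late: {exc}")
```

A unit test that actually invokes run() with each real implementation is what catches this class of bug.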
