Understanding FastAPI middleware
Supercharge Your FastAPI with Middleware: Practical Use Cases and Examples
Middleware sits between an API router and its routes, acting as a layer where you can run code before and after a request is handled. In this article we'll explore two key use cases of middleware in FastAPI, demonstrating both how it works and why it's useful. Let's code!
A. Setting up
To begin, let's create a simple API that serves as a base for our middleware examples. The app below has only one route, /test, which simulates actual work by sleeping for a random fraction of a second before returning "OK".
import random
import time

from fastapi import FastAPI

app = FastAPI(title="My API")

@app.get('/test')
def test_route() -> str:
    sleep_seconds: float = random.randrange(10, 100) / 100
    time.sleep(sleep_seconds)
    return "OK"
What is middleware?
Middleware acts as a filter between the incoming HTTP request and the processing done by your application. Think of it like air travel: every passenger passes through checks both before boarding and again after landing. Similarly, every API request passes through middleware: both before being handled and after the response is created.
To illustrate, we'll create middleware that:
- Measures how long a request takes to process
- Adds a unique ID to the request's state
Minimal middleware
decorator. Below is the simplest implementation:
from typing import Callable

from fastapi import Request

@app.middleware("http")
async def add_middleware(request: Request, call_next: Callable):
    return await call_next(request)
Here's what happens:
- FastAPI listens to ASGI 'http' events
- The request object and a call_next callable are injected into the middleware function
- The call_next function represents the next step in the request-handling pipeline, which could be another middleware or the actual route handler
Currently, the middleware simply passes the request along to the next step. In the next part, however, we'll see that we can execute code before and after calling call_next to add functionality.
B. Timer (before and after request)
We can use middleware to measure how long a request takes to process. Here's how:
from fastapi.responses import StreamingResponse

@app.middleware("http")
async def time_request(request: Request, call_next: Callable):
    # Start timing
    start = time.perf_counter()

    # Handle request
    response: StreamingResponse = await call_next(request)

    # Finish timing the request
    duration_ms = (time.perf_counter() - start) * 1000
    print(f"RESP {response.status_code} ({duration_ms:.0f}ms) @{request.url.path} {request.url.query}")

    # Add the timing information to the headers and return the response
    response.headers["X-Process-Time"] = str(duration_ms)
    return response
How it works:
- Start timing: record time.perf_counter before calling call_next
- Handle request: the middleware calls call_next, allowing the request to proceed to the next step in the pipeline
- Finish timing: after call_next returns, we measure and log the elapsed time
- Modify response: add the timing information to the response headers to make the duration accessible to clients
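To see the header from the client's side, here's a quick sketch using FastAPI's TestClient (which relies on httpx being installed) against the app defined above:

from fastapi.testclient import TestClient

client = TestClient(app)

response = client.get("/test")
# The middleware attached the duration (in milliseconds) to the response headers
print(response.headers["X-Process-Time"])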
Use cases
- Security: verify that the request contains the right headers before allowing access to a route, or block requests from suspicious IPs
- Rate limiting: prevent abuse of your API by controlling the number of requests per IP address
- Content compression: gzip API responses, for example (see the sketch after this list)
- Logging: capture all requests and responses for debugging, or record user activity and suspicious patterns
- Metrics: collect data on request counts, error rates, payload sizes, etc.
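The content-compression case, for instance, doesn't even require custom middleware: FastAPI ships with GZipMiddleware, which you register directly on the app. The minimum_size threshold below (in bytes) is just an example value:

from fastapi.middleware.gzip import GZipMiddleware

# Compress responses larger than ~1 KB; smaller payloads are sent uncompressed
app.add_middleware(GZipMiddleware, minimum_size=1000)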
C. Request ID (adding data to request state)
In the previous part we saw that we can execute code before our request is handled. Next we'll use the same technique to enrich the request's state by adding custom data. One example is adding a unique request ID, which allows you to track and relate function calls throughout the request's life cycle.
Adding data to the request state
Adding data to the request is straightforward: we simply store it on the request's state object:
import uuid

@app.middleware("http")
async def add_correlation_id(request: Request, call_next: Callable) -> StreamingResponse:
    request.state.correlation_id = uuid.uuid4().hex
    return await call_next(request)
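As a small optional extension (not needed for the rest of this section), the same middleware could also echo the ID back to the client, which makes it easy to match client-side reports with server-side logs. The X-Correlation-ID header name below is just a common convention, not something FastAPI mandates:

# Variant of the middleware above; register this instead of the previous definition
@app.middleware("http")
async def add_correlation_id(request: Request, call_next: Callable) -> StreamingResponse:
    request.state.correlation_id = uuid.uuid4().hex
    response = await call_next(request)
    # Echo the ID back so callers can reference it when reporting issues
    response.headers["X-Correlation-ID"] = request.state.correlation_id
    return response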
Using request state data in our route
Once the request contains a correlation_id, you can access it in your route handler. Let's update our route so that it receives the request object:
@app.get('/test')
def test_route(request: Request) -> str:
    print(request.state.correlation_id)
    # ...rest of the function stays the same...
This works because the middleware generates a unique correlation ID and stores it in the request's state. Then, in the route, the request object is injected, allowing you to access the stored correlation_id.
Use cases:
- Authentication and authorization: attach user information (user ID or role) to the request.
- Localization: add language or region preferences to the request so responses can be rendered in the appropriate language.
- Caching: attach frequently accessed data such as configuration settings.
- Dependency injection: provide utilities like loggers or API clients (see the sketch after this list).
- A/B testing: add feature flags to enable or disable specific functionality.
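As a sketch of the dependency-injection case: a FastAPI dependency can read the state that the middleware populated, so route functions never have to touch request.state directly. The get_correlation_id helper and the /whoami route below are made-up names for illustration:

from fastapi import Depends

def get_correlation_id(request: Request) -> str:
    # The middleware stored this on the request state before the route ran
    return request.state.correlation_id

@app.get('/whoami')
def whoami(correlation_id: str = Depends(get_correlation_id)) -> dict:
    # The route receives the ID as a plain argument via Depends
    return {"correlation_id": correlation_id}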
Conclusion
Middleware is a powerful tool for enhancing your FastAPI app. It provides a convenient way to handle cross-cutting concerns like timing, logging, and data enrichment. By using the examples and use cases in this article, I hope you are better equipped to build more robust and feature-rich APIs.
I hope this article was as clear as I intended it to be, but if it isn't, please let me know what I can do to clarify further. In the meantime, check out my other articles on all kinds of programming-related topics.
Happy coding!
— Mike
P.S. Like what I'm doing? Follow me!