Stack: FastAPI · Docker Compose · fastapi-cache2 · Redis 7

Without caching, every request hits the database even if the data has not changed. With Redis, the first call queries the DB and stores the result; subsequent requests are served directly from Redis, without touching the DB.
| Scenario | No Redis | With Redis |
|---|---|---|
| GET /books (First request) | ~80ms (Database) | ~80ms (Database) |
| GET /books (Second request) | ~80ms (Database) | ~2ms (Cache) |
| GET /categories x 100 | ~8,000ms total | ~280ms total (1 DB hit + 99 cache hits) |
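The numbers in the table can be sanity-checked with simple arithmetic, assuming ~80 ms per database-backed response and ~2 ms per cache hit (only the first of N identical requests hits the database):

```python
DB_MS = 80     # approximate DB-backed response time
CACHE_MS = 2   # approximate Redis-backed response time

def total_ms(requests: int, cached: bool) -> int:
    """Rough total latency for N identical GET requests."""
    if not cached:
        return requests * DB_MS
    # First request warms the cache; the rest are cache hits.
    return DB_MS + (requests - 1) * CACHE_MS

print(total_ms(100, cached=False))  # 8000
print(total_ms(100, cached=True))   # 278
```

So for 100 calls, caching turns ~8 seconds of cumulative DB work into well under a third of a second.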
1. Redis Service in Docker Compose
Add the Redis service and update your API service to depend on it.
```yaml
redis_bb:
  image: redis:7-alpine
  restart: unless-stopped
  container_name: redis_bb
  networks:
    - docker-network
```
Update your API service:
```yaml
bb_back_api:
  ...
  depends_on:
    mariadb:
      condition: service_healthy
    redis_bb:                    # ← add this
      condition: service_started
```
Start the containers:
```bash
docker compose up -d
```
Verify that Redis is running:
```bash
docker ps | grep redis
```
2. Python Dependencies
Add to requirements.txt:
```
redis==5.2.1
fastapi-cache2==0.2.2
```
Rebuild the container:
```bash
docker compose build
docker compose up -d
```
3. Configuration in main.py
Initialize the connection to Redis at startup:
```python
from fastapi import FastAPI
from fastapi.responses import ORJSONResponse
from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from redis import asyncio as aioredis

app = FastAPI(
    title="API",
    description="API REST",
    default_response_class=ORJSONResponse,
)

@app.on_event("startup")
async def startup():
    redis = aioredis.from_url("redis://redis_bb:6379")
    FastAPICache.init(RedisBackend(redis), prefix="bb-cache")
```
> Note: The hostname redis_bb is the service name in docker-compose.yml, not localhost.
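Hard-coding the Docker hostname makes the app awkward to run outside Compose. A common pattern is to read the URL from an environment variable; here `REDIS_URL` is a hypothetical variable for this project, not something fastapi-cache2 reads automatically:

```python
import os

def redis_url(default: str = "redis://redis_bb:6379") -> str:
    """Hypothetical helper: let REDIS_URL override the Docker default,
    so the same code works in Compose (redis_bb) and locally (localhost)."""
    return os.getenv("REDIS_URL", default)
```

In `startup()` you would then call `aioredis.from_url(redis_url())` instead of passing the literal string.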
4. Caching Strategies
Case A — Automatic expiration with TTL (simpler)
Ideal for data that rarely changes. The cache expires on its own, with no extra code.
```python
from fastapi_cache.decorator import cache

@app.get("/categorias")
@cache(expire=300)  # cache expires after 5 minutes
async def get_categorias():
    return await db.fetch_all("SELECT * FROM categorias")
```
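To build intuition for what `@cache(expire=...)` does, here is a minimal synchronous sketch of a TTL cache. This is not the fastapi-cache2 implementation (which is async and stores results in Redis), just the underlying idea: keep each result with a timestamp and reuse it until `expire` seconds have passed.

```python
import time
from functools import wraps

def ttl_cache(expire: float):
    """Toy model of @cache(expire=...): in-memory, per-function store."""
    def decorator(fn):
        store = {}  # args -> (result, stored_at)
        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            if args in store:
                result, stored_at = store[args]
                if now - stored_at < expire:
                    return result          # cache hit
            result = fn(*args)             # cache miss: recompute
            store[args] = (result, now)
            return result
        return wrapper
    return decorator

calls = 0

@ttl_cache(expire=300)
def get_categorias():
    global calls
    calls += 1  # counts how often the "database" is actually queried
    return ["novel", "essay"]

get_categorias()
get_categorias()
print(calls)  # 1 — the second call was served from the cache
```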
Case B — Manual invalidation (more precise)
Ideal for data that changes and must be reflected immediately.
Mark the endpoint with a namespace:
```python
@app.get("/books")
@cache(expire=3600, namespace="books")  # 1 hour, but can be cleared early
async def get_books():
    return await db.fetch_all("SELECT * FROM books")
```
Clear the cache when adding or editing a book:
```python
from fastapi_cache import FastAPICache

@app.post("/books")
async def create_book(book: BookSchema):
    await db.execute(insert_query, book.dict())
    # Invalidate the cache after modifying data
    await FastAPICache.clear(namespace="books")
    return {"status": "ok"}
```
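Conceptually, namespace invalidation works because cached keys carry a namespace segment, so "clearing" a namespace deletes only the keys under it. The sketch below is a toy in-memory model with hypothetical key names; the exact key format fastapi-cache2 writes to Redis may differ:

```python
# Toy model: keys look like "<prefix>:<namespace>:<rest>".
store = {
    "bb-cache:books:all": ["book1", "book2"],
    "bb-cache:books:page1": ["book1"],
    "bb-cache:categorias:all": ["novel"],
}

def clear_namespace(store: dict, namespace: str, prefix: str = "bb-cache") -> None:
    """Delete only the keys belonging to one namespace."""
    scoped = f"{prefix}:{namespace}:"
    for key in [k for k in store if k.startswith(scoped)]:
        del store[key]

clear_namespace(store, "books")
print(sorted(store))  # ['bb-cache:categorias:all']
```

This is why clearing `namespace="books"` after a POST does not disturb the cached `/categorias` responses.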
5. Which strategy to use?
| Situation | Recommended Strategy |
|---|---|
| Categories, genres, publishers | Long TTL (1 hour+) |
| General books list | Short TTL (1-5 min) or manual invalidation |
| Searches and filters | Very short TTL (30-60 sec) |
| User profile | No caching (personal data) |
| Real-time data | No caching |
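One way to keep these decisions in a single place is a small TTL policy table. The names and values below are hypothetical, derived from the guidance above; `None` means "do not cache":

```python
# Hypothetical per-endpoint TTL policy (seconds); None = no caching.
TTL_SECONDS = {
    "categorias": 3600,  # changes rarely: long TTL
    "books": 300,        # general listing: short TTL (or manual clear)
    "search": 60,        # searches/filters: very short TTL
}

def ttl_for(endpoint: str):
    """Look up the TTL for an endpoint; unknown/personal data is uncached."""
    return TTL_SECONDS.get(endpoint)
```

Each route can then pass `ttl_for(...)` to its `@cache(expire=...)` decorator instead of scattering magic numbers through the codebase.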
6. Check that it works
Make two requests to the same endpoint and compare the response times in the logs. The second should be noticeably faster.
You can also connect to Redis and view the saved keys:
```
docker exec -it redis_bb redis-cli
> KEYS *
> TTL bb-cache:libros
```
