Building Scalable APIs with FastAPI: Lessons from FPL Banter

January 15, 2025

8 min read

Python · FastAPI · API Development · Performance


When I started building FPL Banter, a social platform for Fantasy Premier League managers, I knew performance would be critical. With thousands of users expecting real-time data from the official FPL API, I needed a solution that was fast, scalable, and developer-friendly.

Enter FastAPI - and it exceeded every expectation.

Why FastAPI?

Coming from a Django background, I was initially hesitant to switch frameworks. But FastAPI offered three killer features:

  1. Speed: Built on Starlette and Pydantic, it's one of the fastest Python frameworks available
  2. Automatic Documentation: Swagger UI and ReDoc out of the box
  3. Type Safety: Pydantic models catch errors before they reach production
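To make the type-safety point concrete, here's a minimal sketch of Pydantic validation (the `Player` model is hypothetical, not from the FPL Banter codebase):

```python
from pydantic import BaseModel, ValidationError

class Player(BaseModel):
    id: int
    name: str
    total_points: int = 0

# Valid payload: compatible types are coerced and validated automatically.
player = Player(id="42", name="Salah", total_points=211)
print(player.id)  # the string "42" is coerced to the int 42

# Invalid payload: Pydantic raises before bad data reaches your handlers.
try:
    Player(id="not-a-number", name="Oops")
except ValidationError:
    print("rejected")
```

The same models double as request/response schemas, which is what powers the auto-generated Swagger docs.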

The Challenge: Rate Limits & Data Freshness

The FPL API has strict rate limits. My challenge was:

  • Serve fresh data to thousands of users
  • Avoid hitting rate limits
  • Keep response times under 200ms

Solution: Intelligent Caching

```python
from datetime import datetime, timedelta

class FPLCache:
    def __init__(self, ttl_seconds=300):  # 5-minute cache by default
        self.cache = {}
        self.ttl = timedelta(seconds=ttl_seconds)

    def get(self, key):
        if key in self.cache:
            data, timestamp = self.cache[key]
            if datetime.now() - timestamp < self.ttl:
                return data
        return None

    def set(self, key, value):
        self.cache[key] = (value, datetime.now())
```

This simple caching strategy reduced our API calls by 95% while keeping data fresh.
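To see where that reduction comes from, here's a standalone toy version of the cache fronting a simulated FPL call (`fake_fpl_call` is a stand-in for the real HTTP client, not actual code from the project):

```python
import asyncio
from datetime import datetime, timedelta

_cache = {}
TTL = timedelta(seconds=300)
call_count = {"n": 0}

async def fake_fpl_call(endpoint):
    call_count["n"] += 1  # stands in for a real HTTP request
    return {"endpoint": endpoint, "players": 700}

async def cached_fetch(endpoint):
    hit = _cache.get(endpoint)
    if hit and datetime.now() - hit[1] < TTL:
        return hit[0]  # fresh: serve from cache, no upstream call
    data = await fake_fpl_call(endpoint)
    _cache[endpoint] = (data, datetime.now())
    return data

async def main():
    for _ in range(20):  # 20 user requests within the TTL window...
        await cached_fetch("bootstrap-static")
    return call_count["n"]

upstream_calls = asyncio.run(main())
print(upstream_calls)  # ...cost a single upstream API call
```

Twenty requests, one upstream call: that's the 95% in miniature.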

Performance Optimizations

1. Async Everything

```python
import asyncio
import aiohttp

@app.get("/leagues/{league_id}/standings")
async def get_standings(league_id: int):
    # Non-blocking database call
    league = await db.leagues.find_one({"_id": league_id})

    # Parallel API calls for every league member
    async with aiohttp.ClientSession() as session:
        tasks = [
            fetch_player_data(session, player_id)
            for player_id in league["members"]
        ]
        results = await asyncio.gather(*tasks)

    return process_standings(results)
```

Result: Response times dropped from 2s to 180ms
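The win comes from `asyncio.gather` running the fetches concurrently rather than one after another. A toy sketch with simulated network latency shows the effect:

```python
import asyncio
import time

async def fetch_player_data(player_id):
    await asyncio.sleep(0.1)  # simulated 100ms network latency
    return {"id": player_id, "points": player_id * 2}

async def main():
    start = time.perf_counter()
    # All 10 fetches run concurrently, so total time is ~0.1s, not ~1s.
    results = await asyncio.gather(
        *(fetch_player_data(i) for i in range(10))
    )
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
print(len(results), round(elapsed, 2))
```

Sequential awaits would have cost roughly the sum of all latencies; gathered awaits cost roughly the slowest one.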

2. Background Tasks

For heavy operations, use FastAPI's background tasks:

```python
from fastapi import BackgroundTasks

@app.post("/leagues/{league_id}/refresh")
async def refresh_league(league_id: int, background_tasks: BackgroundTasks):
    background_tasks.add_task(update_league_data, league_id)
    return {"status": "Refresh started"}
```

3. Pagination

Never return unbounded lists:

```python
@app.get("/banter-stats")
async def get_banter_stats(skip: int = 0, limit: int = 20):
    limit = min(limit, 100)  # cap page size at 100
    cursor = db.stats.find().skip(skip).limit(limit)
    # Motor's find() returns a cursor; materialize it before returning
    return await cursor.to_list(length=limit)
```

Database Design with MongoDB

MongoDB's flexibility was perfect for rapidly evolving banter statistics:

```python
# Schema-less banter stats document
{
    "user_id": "12345",
    "tournament_id": "fpl_banter",
    "stats": {
        "benched_beast": 156,
        "captain_catastrophe": 89,
        "differential_genius": 23
    },
    "updated_at": "2025-01-15T10:30:00Z"
}
```

No migrations needed when adding new stat types!
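For example, a brand-new stat can be introduced with a single `$inc` upsert. This is a hedged sketch (the `bump_stat` helper and `wildcard_wizard` stat are hypothetical, shown here just to build the update document):

```python
def bump_stat(user_id, stat_name, amount=1):
    """Build a MongoDB filter/update pair that increments a
    (possibly brand-new) stat field via dotted-path $inc."""
    filter_doc = {"user_id": user_id, "tournament_id": "fpl_banter"}
    update_doc = {
        "$inc": {f"stats.{stat_name}": amount},
        "$currentDate": {"updated_at": True},
    }
    return filter_doc, update_doc

filter_doc, update_doc = bump_stat("12345", "wildcard_wizard")
print(update_doc["$inc"])
```

With Motor you'd pass the pair to `db.stats.update_one(filter_doc, update_doc, upsert=True)`; MongoDB creates the `stats.wildcard_wizard` field on first write.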

Monitoring & Error Handling

```python
import logging

from fastapi import HTTPException

logger = logging.getLogger(__name__)

@app.get("/api/data")
async def get_data():
    try:
        return await fetch_from_fpl()
    except RateLimitError:
        logger.warning("FPL rate limit hit")
        # Degrade gracefully: serve cached data instead of failing
        return get_cached_data()
    except Exception as e:
        logger.error(f"Unexpected error: {e}")
        raise HTTPException(status_code=500, detail="Internal server error")
```

Results

After 6 months in production:

  • 99.9% uptime
  • <200ms average response time
  • Zero rate limit violations
  • Thousands of daily active users

Key Takeaways

  1. Use async/await everywhere - it's a game changer
  2. Cache aggressively - but keep it fresh
  3. Monitor everything - you can't improve what you don't measure
  4. Plan for failure - the FPL API will go down, be ready
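Point 4 in practice looks something like retry-then-fallback. A minimal sketch (the helper names are hypothetical, and the flaky fetch simulates an FPL outage):

```python
import asyncio

async def fetch_with_fallback(fetch, fallback, retries=2, delay=0.05):
    """Try the live fetch a few times; on repeated failure,
    fall back to a local source such as cached data."""
    for attempt in range(retries + 1):
        try:
            return await fetch()
        except Exception:
            if attempt < retries:
                await asyncio.sleep(delay)  # brief backoff between retries
    return fallback()

calls = {"n": 0}

async def flaky_fetch():
    calls["n"] += 1
    raise ConnectionError("FPL API down")

result = asyncio.run(
    fetch_with_fallback(flaky_fetch, lambda: {"cached": True})
)
print(result, calls["n"])  # falls back after 3 failed attempts
```

In production you'd also want jittered backoff and alerting, but the shape is the same: never let an upstream outage become your outage.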

What's Next?

I'm exploring:

  • WebSocket support for real-time updates
  • GraphQL layer for flexible querying
  • Redis for distributed caching

Have you built APIs with FastAPI? I'd love to hear about your experience! Connect with me on LinkedIn or check out the FPL Banter project.
