Helper Modules: Clean Routes with the OpenAI API
Frederick Tubiermont
As apps grow, routes get messy. The fix is simple: move logic into utils/ helper files and import what you need. Your routes stay readable; your helpers stay testable.
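Concretely, the split looks something like this (these are the filenames used throughout this tutorial; an app.py entry point is assumed):

app.py
routes/
    tools.py
utils/
    ai.py
    email.py
    storage.py
    formatting.py
templates/
    summarize.html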
The Problem: Fat Routes
@app.route("/summarize", methods=["POST"])
def summarize():
    text = request.form.get("text", "")
    # 30 lines of OpenAI API logic directly in the route...
    import os
    import openai
    client = openai.OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))
    try:
        response = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": "Summarize the following text concisely."},
                {"role": "user", "content": text}
            ],
            max_tokens=200
        )
        summary = response.choices[0].message.content
    except openai.RateLimitError:
        return "Rate limit exceeded", 429
    except openai.APIError as e:
        return f"API error: {e}", 500
    return render_template("summary.html", summary=summary)
Hard to reuse, hard to test, hard to read.
The Solution: A Helper Module
# utils/ai.py
import os

import openai

# Single shared client, created lazily on first use
_client = None

def get_openai_client():
    global _client
    if _client is None:
        api_key = os.environ.get("OPENAI_API_KEY")
        if not api_key:
            raise ValueError("OPENAI_API_KEY not set in environment")
        _client = openai.OpenAI(api_key=api_key)
    return _client

def summarize_text(text: str, max_words: int = 100) -> str:
    """
    Summarize text using the OpenAI API.

    Args:
        text: The text to summarize
        max_words: Approximate maximum words in the summary

    Returns:
        The summarized text as a string

    Raises:
        ValueError: If text is empty
        RuntimeError: If the API call fails
    """
    if not text or not text.strip():
        raise ValueError("Text cannot be empty")
    client = get_openai_client()
    try:
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {
                    "role": "system",
                    "content": f"Summarize the following text in approximately {max_words} words. Be concise and accurate."
                },
                {
                    "role": "user",
                    "content": text
                }
            ],
            max_tokens=max_words * 2  # Rough token estimate
        )
        return response.choices[0].message.content.strip()
    except openai.RateLimitError:
        raise RuntimeError("OpenAI rate limit reached. Try again in a moment.")
    except openai.AuthenticationError:
        raise RuntimeError("Invalid OpenAI API key.")
    except openai.APIError as e:
        raise RuntimeError(f"OpenAI API error: {e}")
The Route Is Now Thin and Readable
# routes/tools.py
from flask import request, render_template

from utils.ai import summarize_text

@app.route("/summarize", methods=["POST"])
def summarize():
    text = request.form.get("text", "").strip()
    if not text:
        return render_template("summarize.html", error="Please enter some text")
    try:
        summary = summarize_text(text, max_words=100)
        return render_template("summarize.html", summary=summary, original=text)
    except (ValueError, RuntimeError) as e:
        return render_template("summarize.html", error=str(e))
Other Useful Helper Modules
The same pattern applies to any reusable logic:
# utils/email.py
def send_welcome_email(to_email: str, username: str) -> bool:
    """Send welcome email. Returns True on success."""
    pass

# utils/storage.py
def upload_file(file, folder: str) -> str:
    """Save file to disk. Returns the file path."""
    pass

# utils/formatting.py
def truncate(text: str, length: int = 100) -> str:
    """Truncate text with ellipsis."""
    return text[:length] + "..." if len(text) > length else text
Installation
pip install openai
Add to requirements.txt:
openai
Add to .env:
OPENAI_API_KEY=sk-proj-your-key-here
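Note that a .env file is not read automatically. One common way to load it (an assumption here, not part of this tutorial's dependencies) is python-dotenv, installed with pip install python-dotenv and called once at startup:

```python
# app.py -- a sketch; assumes python-dotenv is installed
import os

from dotenv import load_dotenv

load_dotenv()  # copies values from .env into os.environ

# Fail fast at startup rather than on the first API call
assert os.environ.get("OPENAI_API_KEY"), "Set OPENAI_API_KEY in .env"
```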
Multiple AI Helpers in the Same Module
# utils/ai.py
def summarize_text(text: str, max_words: int = 100) -> str:
    """Summarize text."""
    # ... implementation above

def classify_sentiment(text: str) -> str:
    """Returns 'positive', 'negative', or 'neutral'."""
    client = get_openai_client()
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Classify the sentiment as: positive, negative, or neutral. Reply with one word only."},
            {"role": "user", "content": text}
        ],
        max_tokens=10
    )
    return response.choices[0].message.content.strip().lower()

def extract_keywords(text: str, count: int = 5) -> list[str]:
    """Extract the top N keywords from text."""
    client = get_openai_client()
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": f"Extract exactly {count} keywords from this text. Return them as a comma-separated list, nothing else."},
            {"role": "user", "content": text}
        ],
        max_tokens=50
    )
    raw = response.choices[0].message.content.strip()
    return [kw.strip() for kw in raw.split(",")]
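Model replies are not always perfectly formatted, so a slightly more defensive parse helps. The sketch below uses a hypothetical parse_keywords helper (not part of the module above) that drops empty items, trims stray periods, and caps the list at the requested count:

```python
def parse_keywords(raw: str, count: int) -> list[str]:
    """Split a comma-separated model reply into at most `count` clean keywords."""
    items = [kw.strip().strip(".").lower() for kw in raw.split(",")]
    return [kw for kw in items if kw][:count]

print(parse_keywords("Flask,  Python , routing,", 5))
# → ['flask', 'python', 'routing']
```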
The Rule
One responsibility per file. If utils/ai.py grows beyond ~150 lines, split it into utils/ai/summarize.py, utils/ai/classify.py, etc.
Routes call helpers. Helpers do work. Never the other way around.
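If you do split utils/ai.py into a package, a re-exporting __init__.py keeps existing imports working so routes don't need to change (a sketch; the file layout is illustrative):

```python
# utils/ai/__init__.py
# Re-export so `from utils.ai import summarize_text` still works
# after splitting into summarize.py and classify.py.
from .summarize import summarize_text
from .classify import classify_sentiment, extract_keywords
```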