A FastAPI-based backend providing a RESTful API for task management. This is the core business-logic layer: it handles data persistence, validation, and task operations.
- FastAPI Framework: Modern, fast web framework with automatic API documentation
- SQLAlchemy ORM: Database abstraction with SQLite for development
- Task CRUD Operations: Complete create, read, update, delete functionality
- Bulk Operations: Update multiple tasks simultaneously
- Analytics API: Task metrics and performance insights
- Data Validation: Pydantic models for request/response validation
- Auto Documentation: OpenAPI/Swagger documentation at /docs
```
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   MCP Server    │───▶│   Backend API   │───▶│    Database     │
│  (Port stdio)   │    │   (Port 8001)   │    │    (SQLite)     │
└─────────────────┘    └─────────────────┘    └─────────────────┘
   MCP Protocol          HTTP REST API           File Storage
```
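From the MCP server's side, each tool call in the flow above becomes a plain HTTP request against the backend. The sketch below illustrates that translation using only the standard library; the helper name is hypothetical, and only the endpoint and payload shape come from the API reference in this document.

```python
import json
import urllib.request

BACKEND_URL = "http://localhost:8001"  # backend API from the diagram above

def build_create_task_request(payload: dict) -> urllib.request.Request:
    """Translate an MCP tool call into the HTTP request the backend expects.
    (Illustrative helper -- not part of the actual MCP server code.)"""
    return urllib.request.Request(
        url=f"{BACKEND_URL}/api/tasks",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_create_task_request({"title": "Demo task", "priority": "high"})
```

Building the `Request` object without sending it keeps the MCP layer thin: all validation and persistence stay in the backend.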
- Create a virtual environment:

  ```bash
  python -m venv venv
  source venv/bin/activate   # Linux/Mac
  venv\Scripts\activate      # Windows
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Run the server:

  ```bash
  python main.py
  ```

The server will start on http://localhost:8001.
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/tasks | List all tasks with filtering |
| POST | /api/tasks | Create a new task |
| GET | /api/tasks/{id} | Get specific task by ID |
| PUT | /api/tasks/{id} | Update existing task |
| DELETE | /api/tasks/{id} | Delete task |
| POST | /api/tasks/bulk-update | Update multiple tasks |
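The bulk-update endpoint applies one set of field changes to several tasks at once. A rough sketch of that semantics in plain Python — the function and data shapes here are illustrative assumptions, not the actual server code:

```python
def bulk_update(tasks: dict, task_ids: list, update: dict) -> list:
    """Apply the same field updates to every matching task.
    Returns the ids that were actually updated; unknown ids are skipped."""
    updated = []
    for task_id in task_ids:
        task = tasks.get(task_id)
        if task is None:
            continue  # skip missing ids rather than failing the whole batch
        task.update(update)
        updated.append(task_id)
    return updated

tasks = {
    1: {"title": "Write docs", "status": "pending"},
    2: {"title": "Fix bug", "status": "in_progress"},
}
done = bulk_update(tasks, [1, 2, 99], {"status": "completed"})
```

Whether the real endpoint skips unknown ids or rejects the whole batch is an implementation choice; the sketch assumes skip-and-report.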
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/analytics/metrics | Get task analytics and metrics |
Task Listing (`GET /api/tasks`):

- `status`: Filter by status (`pending`, `in_progress`, `completed`, `cancelled`)
- `priority`: Filter by priority (`low`, `medium`, `high`, `critical`)
- `assignee_id`: Filter by assignee ID
- `limit`: Maximum results (default: 50, max: 100)
- `offset`: Pagination offset (default: 0)
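Server-side, these parameters translate into a filter-then-slice over the task set. A minimal sketch of that logic, assuming simple dict records (the real implementation builds SQLAlchemy queries instead):

```python
def list_tasks(tasks, status=None, priority=None, assignee_id=None, limit=50, offset=0):
    """Filter then paginate, mirroring the documented defaults (limit 50, cap 100)."""
    limit = max(1, min(limit, 100))  # enforce the documented maximum
    offset = max(0, offset)
    results = [
        t for t in tasks
        if (status is None or t["status"] == status)
        and (priority is None or t["priority"] == priority)
        and (assignee_id is None or t.get("assignee_id") == assignee_id)
    ]
    return results[offset:offset + limit]

tasks = [
    {"id": 1, "status": "pending", "priority": "high"},
    {"id": 2, "status": "completed", "priority": "high"},
    {"id": 3, "status": "pending", "priority": "low"},
]
page = list_tasks(tasks, status="pending", limit=10)
```

Clamping `limit` on the server keeps a single oversized request from scanning the whole table.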
Analytics (`GET /api/analytics/metrics`):

- `timeframe`: Time period (`day`, `week`, `month`, `year`)
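One plausible reading of `timeframe` is a cutoff on `created_at` before aggregating. The sketch below computes status counts and a completion rate that way; the field names are taken from the schema in this document, but the day counts per timeframe and the response shape are assumptions, not the real endpoint's contract:

```python
from collections import Counter
from datetime import datetime, timedelta

TIMEFRAMES = {"day": 1, "week": 7, "month": 30, "year": 365}  # assumed day counts

def task_metrics(tasks, timeframe="week", now=None):
    """Count tasks created within the timeframe and derive a completion rate.
    Illustrative only; the real endpoint may aggregate differently."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=TIMEFRAMES[timeframe])
    recent = [t for t in tasks if t["created_at"] >= cutoff]
    by_status = Counter(t["status"] for t in recent)
    total = len(recent)
    return {
        "total": total,
        "by_status": dict(by_status),
        "completion_rate": (by_status["completed"] / total) if total else 0.0,
    }

now = datetime(2024, 6, 15)
tasks = [
    {"status": "completed", "created_at": datetime(2024, 6, 14)},
    {"status": "pending", "created_at": datetime(2024, 6, 12)},
    {"status": "completed", "created_at": datetime(2024, 1, 1)},  # outside "week"
]
metrics = task_metrics(tasks, "week", now=now)
```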
```python
class TaskStatus(str, Enum):
    pending = "pending"
    in_progress = "in_progress"
    completed = "completed"
    cancelled = "cancelled"

class TaskPriority(str, Enum):
    low = "low"
    medium = "medium"
    high = "high"
    critical = "critical"

class TaskCreate(BaseModel):
    title: str = Field(..., min_length=1, max_length=200)
    description: Optional[str] = Field(None, max_length=1000)
    status: TaskStatus = TaskStatus.pending
    assignee_id: Optional[int] = None
    priority: TaskPriority = TaskPriority.medium
    due_date: Optional[datetime] = None

class TaskResponse(TaskCreate):
    id: int
    created_at: datetime
    updated_at: datetime
```

Create a task:

```bash
curl -X POST "http://localhost:8001/api/tasks" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "Implement user authentication",
    "description": "Add OAuth2 authentication to the API",
    "priority": "high",
    "assignee_id": 1
  }'
```

Update a task:

```bash
curl -X PUT "http://localhost:8001/api/tasks/1" \
  -H "Content-Type: application/json" \
  -d '{
    "status": "completed",
    "priority": "high"
  }'
```

List tasks with filters:

```bash
curl "http://localhost:8001/api/tasks?status=pending&priority=high&limit=10"
```

Bulk-update tasks:

```bash
curl -X POST "http://localhost:8001/api/tasks/bulk-update" \
  -H "Content-Type: application/json" \
  -d '{
    "task_ids": [1, 2, 3],
    "update": {
      "status": "completed"
    }
  }'
```

Get analytics:

```bash
curl "http://localhost:8001/api/analytics/metrics?timeframe=week"
```

Database schema:

```sql
CREATE TABLE tasks (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    title VARCHAR(200) NOT NULL,
    description VARCHAR(1000),
    status VARCHAR(20) NOT NULL,
    assignee_id INTEGER,
    priority VARCHAR(20) NOT NULL,
    due_date DATETIME,
    created_at DATETIME NOT NULL,
    updated_at DATETIME NOT NULL
);
```

Environment variables:

- `DATABASE_URL`: Database connection string (default: `sqlite:///tasks.db`)
- `DEBUG`: Enable debug mode (default: `False`)
- `HOST`: Server host (default: `0.0.0.0`)
- `PORT`: Server port (default: `8001`)
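These settings can be loaded with the stdlib's `os.getenv`, falling back to the documented defaults. A sketch — the real `main.py` may organize its configuration differently:

```python
import os

def load_config(env=None) -> dict:
    """Read settings from the environment, using the documented defaults."""
    env = os.environ if env is None else env
    return {
        "database_url": env.get("DATABASE_URL", "sqlite:///tasks.db"),
        "debug": env.get("DEBUG", "False").lower() in ("1", "true", "yes"),
        "host": env.get("HOST", "0.0.0.0"),
        "port": int(env.get("PORT", "8001")),
    }

# Passing a plain dict makes the loader easy to unit-test
config = load_config({"DEBUG": "true", "PORT": "8002"})
```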
Example configuration:

```bash
DATABASE_URL=sqlite:///tasks.db
DEBUG=true
HOST=0.0.0.0
PORT=8001
```

Run the test suite:

```bash
pytest tests/
```

Code quality checks:

```bash
# Linting
flake8 .

# Type checking
mypy .

# Formatting
black .
```

The application uses SQLAlchemy with automatic table creation. For production, consider using Alembic for migrations:
```bash
pip install alembic
alembic init migrations
alembic revision --autogenerate -m "Initial migration"
alembic upgrade head
```

Example Dockerfile:

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8001
CMD ["python", "main.py"]
```

Build and run:

```bash
docker build -t task-backend .
docker run -p 8001:8001 -e DATABASE_URL=sqlite:///data/tasks.db task-backend
```

The API includes a health check endpoint:
```bash
curl http://localhost:8001/
# Response: {"status": "healthy", "service": "Task Management Backend API", "version": "1.0.0"}
```

The API returns structured error responses:

```json
{
  "detail": "Task not found"
}
```

Status codes:

- `200`: Success
- `201`: Created
- `400`: Bad Request
- `404`: Not Found
- `422`: Validation Error
- `500`: Internal Server Error
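The `{"detail": ...}` body is the shape FastAPI's `HTTPException` produces. A toy sketch of how a lookup failure maps to that pairing of status code and body (illustrative only; in the real app FastAPI builds the response):

```python
def get_task_or_error(tasks: dict, task_id: int):
    """Return (status_code, body) the way the API reports a missing task."""
    task = tasks.get(task_id)
    if task is None:
        return 404, {"detail": "Task not found"}
    return 200, task

status, body = get_task_or_error({1: {"id": 1, "title": "Demo"}}, 2)
```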
The backend logs all requests and errors. Configure the logging level via an environment variable:

```bash
LOG_LEVEL=INFO python main.py
```

- Database Connection Pooling: SQLAlchemy manages connections
- Async Support: FastAPI with async/await for high concurrency
- Pagination: Built-in limit/offset pagination
- Query Optimization: Indexed database queries
- Input Validation: Pydantic models validate all inputs
- SQL Injection Prevention: SQLAlchemy ORM prevents SQL injection
- CORS: Configurable cross-origin resource sharing
- Rate Limiting: Consider adding rate limiting for production
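The SQL-injection point can be demonstrated with the stdlib `sqlite3` driver: bound parameters keep hostile input inert, which is the same mechanism SQLAlchemy uses under the hood. The table and data here are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, title TEXT NOT NULL)")
conn.execute("INSERT INTO tasks (title) VALUES (?)", ("Write docs",))

# Hostile input: with naive string interpolation this could match every row.
hostile = "' OR '1'='1"
rows = conn.execute(
    "SELECT id, title FROM tasks WHERE title = ?",  # bound parameter, never interpolated
    (hostile,),
).fetchall()
```

Because the driver treats the whole string as a literal value, the injection attempt matches nothing and the table is untouched.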
This backend is designed to work with:
- MCP Server: Provides MCP protocol interface
- MCPO Proxy: Exposes via OpenAPI for tools like OpenWebUI
- Frontend: React web interface
- Third-party Apps: Any HTTP client via REST API
```bash
pip install gunicorn
gunicorn main:app -w 4 -k uvicorn.workers.UvicornWorker -b 0.0.0.0:8001
```

- Use PostgreSQL or MySQL for the production database
- Configure proper logging aggregation
- Set up monitoring and health checks
- Use reverse proxy (nginx) for SSL and load balancing
- Database Connection Errors:
  - Check the `DATABASE_URL` environment variable
  - Ensure SQLite file permissions for file-based databases
  - Verify the database server is running for remote databases
- Port Already in Use:
  - Change the port: `PORT=8002 python main.py`
  - Find the process using the port: `lsof -i :8001`
- Import Errors:
  - Activate the virtual environment
  - Install dependencies: `pip install -r requirements.txt`
Enable detailed error messages:

```bash
DEBUG=true python main.py
```

This will provide stack traces and detailed error information.
MIT