A Celery-based RTSP camera recording system that captures 5-minute video chunks from multiple cameras, uploads them to S3, and automatically cleans up old recordings.
- Multi-camera recording — supports multiple RTSP cameras with configurable recording intervals
- Automatic S3 upload — uploads recordings to AWS S3 with retry logic and proper metadata
- Retention management — daily cleanup task removes old recordings from both local storage and S3
- Secure credentials — camera and AWS credentials managed via environment variables, never stored in code
- Remote deployment — helper script for managing remote recording servers
```
┌─────────────┐      ┌──────────────┐      ┌─────────────┐
│ RTSP Cam 1  │─────▶│    Celery    │─────▶│   AWS S3    │
│ RTSP Cam N  │─────▶│    Worker    │─────▶│   Bucket    │
└─────────────┘      └──────────────┘      └─────────────┘
                            │
                      ┌────▼─────┐
                      │  Celery  │
                      │   Beat   │
                      └──────────┘
```
```
Local: recordings/YYYY-MM-DD/camera_name/HH-MM-SS.mp4
S3:    s3://bucket/YYYY-MM-DD/camera_name/HH-MM-SS.mp4
```
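The layout above can be expressed as a small path-building helper. This is only a sketch; `build_paths` is a hypothetical name, not necessarily what the repo's code uses, but the date and time formats follow the layout shown:

```python
from datetime import datetime
from pathlib import Path

def build_paths(camera_name: str, start: datetime, bucket: str, prefix: str = ""):
    """Build the local file path and S3 URI for one recorded chunk.

    Mirrors the layout above: YYYY-MM-DD/camera_name/HH-MM-SS.mp4.
    """
    day = start.strftime("%Y-%m-%d")
    clock = start.strftime("%H-%M-%S")
    local = Path("recordings") / day / camera_name / f"{clock}.mp4"
    key = f"{prefix}{day}/{camera_name}/{clock}.mp4"
    return local, f"s3://{bucket}/{key}"

# build_paths("front_door", datetime(2025, 1, 15, 9, 30, 0), "my-bucket")
# → (Path("recordings/2025-01-15/front_door/09-30-00.mp4"),
#    "s3://my-bucket/2025-01-15/front_door/09-30-00.mp4")
```

Keeping the local path and S3 key derived from the same date means the daily cleanup can treat both stores identically.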
- Python 3.13+
- FFmpeg
- Redis (Celery broker)
- AWS account with S3 bucket
```bash
# Clone the repository
git clone [email protected]:ronoc/lahardan.git
cd lahardan

# Create virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install dependencies
pip install -r requirements.txt
```
- Copy the environment template:

  ```bash
  cp .env.example .env
  ```

- Set camera credentials in `.env`:

  ```
  CAMERA_USER=admin
  CAMERA_PASS=your_camera_password
  AWS_PROFILE=chickencoop
  ```

- Configure AWS credentials:

  ```bash
  aws configure --profile chickencoop
  ```

  Or add to `~/.aws/credentials`:

  ```ini
  [chickencoop]
  aws_access_key_id = YOUR_ACCESS_KEY
  aws_secret_access_key = YOUR_SECRET_KEY
  ```

- Edit `cameras.yaml`:
  - Set your S3 bucket name
  - Add your camera host IPs
  - Configure retention days
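Once `CAMERA_USER` and `CAMERA_PASS` are in the environment, the recorder can assemble each camera's RTSP URL from the hosts in `cameras.yaml`. A minimal sketch, assuming the common default port 554 and a placeholder stream path (real stream paths vary by camera model, e.g. `/Streaming/Channels/101` on some vendors):

```python
import os
from urllib.parse import quote

def rtsp_url(host: str, port: int = 554, path: str = "stream1") -> str:
    """Build an RTSP URL from environment credentials and a camera host.

    The port and stream path here are placeholder assumptions; check
    your camera's documentation for the actual stream endpoint.
    """
    user = os.environ.get("CAMERA_USER", "admin")
    password = os.environ["CAMERA_PASS"]  # required, no default
    # Percent-encode so special characters in the password don't break the URL
    return f"rtsp://{quote(user)}:{quote(password)}@{host}:{port}/{path}"
```

Percent-encoding the credentials matters in practice: a password containing `@` or `/` would otherwise corrupt the URL that gets handed to FFmpeg.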
The AWS user needs permission to write to the S3 bucket:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:AbortMultipartUpload",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ]
    }
  ]
}
```

```bash
# Source environment variables
set -a; source .env; set +a

# Start worker (handles recording and upload tasks)
celery -A tasks worker --loglevel=info --concurrency=10

# Start beat (schedules periodic tasks)
celery -A tasks beat --loglevel=info
```

For remote server management:
```bash
./deploy.sh sync     # Copy files to remote server
./deploy.sh restart  # Start worker + beat in background
./deploy.sh status   # Check running processes
./deploy.sh stop     # Stop all Celery processes
```

| Task | Schedule | Description |
|---|---|---|
| `record_chunk` | Every 5 min per camera | Records 5-minute RTSP chunks via FFmpeg |
| `upload_chunk` | Triggered by record | Uploads successful recordings to S3 |
| `cleanup_old_recordings` | Every 24 hours | Deletes local and S3 recordings older than the retention period |
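`record_chunk` ultimately shells out to FFmpeg. A sketch of the command it might build; the flags shown are typical for fixed-length RTSP capture, not necessarily the exact ones this repo uses:

```python
import subprocess

def ffmpeg_record_cmd(rtsp_url: str, out_path: str, seconds: int = 300) -> list[str]:
    """Build an FFmpeg command that copies a fixed-length RTSP chunk to disk.

    -rtsp_transport tcp avoids UDP packet loss on flaky networks; -c copy
    remuxes without re-encoding, so a 5-minute chunk costs almost no CPU.
    """
    return [
        "ffmpeg",
        "-rtsp_transport", "tcp",  # demuxer option; must come before -i
        "-i", rtsp_url,
        "-t", str(seconds),        # chunk length: 300 s = 5 minutes
        "-c", "copy",              # no re-encode, just remux into MP4
        "-y",                      # overwrite if the file already exists
        out_path,
    ]

# Inside the task, something like:
# subprocess.run(ffmpeg_record_cmd(url, local_path), check=True, timeout=360)
```

Passing a timeout slightly longer than the chunk length lets the task fail cleanly (and be retried by Celery) if a camera stalls mid-stream.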
```yaml
output_dir: "./recordings"
retention_days: 30        # Delete recordings older than 30 days

s3:
  bucket: "your-bucket-name"
  region: "eu-west-1"
  profile: "chickencoop"
  prefix: ""
  delete_after_upload: false

cameras:
  - name: "front_door"
    host: "192.168.1.100"
  - name: "backyard"
    host: "192.168.1.101"
```

| Variable | Description | Default |
|---|---|---|
| `CAMERA_USER` | RTSP camera username | `admin` |
| `CAMERA_PASS` | RTSP camera password | (required) |
| `AWS_PROFILE` | AWS CLI profile name | `chickencoop` |
```bash
# Check Celery status
celery -A tasks inspect active

# View recent tasks
celery -A tasks inspect reserved

# Check disk usage
du -sh recordings/
```

MIT License - Copyright (c) 2026 Conor Curran