The application writes logs to files (Flask and Celery) and to Docker stdout; a daily backup task collects both into a shared backup directory, and a weekly task uploads those backups to S3. This page describes where logs are, how to view and save them, how backup and S3 work, and which services/logs are included.

Where logs live

Application log files (inside containers)

  • logs/app.log — Main Flask app (app.py). Configured in the app with a TimedRotatingFileHandler: rotates at midnight, keeps 7 days of backups (suffix app.log.YYYY-MM-DD). Writes under the app’s LOG_DIR (e.g. /app/logs in Docker). Also logs to console (StreamHandler).
  • logs/celery.log — Celery worker and Beat (tasks.py). Same pattern: TimedRotatingFileHandler at midnight, 7 days retention, plus console. Written in the celery and beat containers (each has its own /app/logs/celery.log).
So at runtime, logs are under logs/ in the project (or /app/logs inside the web/celery/beat containers). Rotated files are named like app.log.2025-02-23, celery.log.2025-02-23.
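The handler setup described above can be sketched as follows. This is a minimal illustration, not the app's actual code: the function name setup_logger, the log level, and the format string are assumptions, while the midnight rotation, 7-day retention, and extra StreamHandler match the description.

```python
import logging
import os
from logging.handlers import TimedRotatingFileHandler


def setup_logger(name: str, log_dir: str = "logs", filename: str = "app.log") -> logging.Logger:
    """Configure a logger that rotates at midnight and keeps 7 days of backups."""
    os.makedirs(log_dir, exist_ok=True)
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)

    formatter = logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")

    # Rotates at midnight; rotated files get a .YYYY-MM-DD suffix, 7 are kept.
    file_handler = TimedRotatingFileHandler(
        os.path.join(log_dir, filename), when="midnight", backupCount=7
    )
    file_handler.setFormatter(formatter)
    logger.addHandler(file_handler)

    # Also log to the console, so Docker captures the same output on stdout/stderr.
    console = logging.StreamHandler()
    console.setFormatter(formatter)
    logger.addHandler(console)
    return logger
```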

Docker logs

Each service (web, celery, beat, admin-portal, redis, caddy, db, etc.) also has Docker logs — whatever the process writes to stdout/stderr. Docker stores these according to the daemon’s logging driver (e.g. json-file with max-size and max-file as in docker-compose.yml). You can view them with docker compose logs.
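A compose-level logging configuration along these lines produces that behavior; the max-size and max-file values below are placeholders, not the project's actual settings (check docker-compose.yml):

```yaml
services:
  web:
    logging:
      driver: json-file     # Docker's default driver; stores logs as JSON on the host
      options:
        max-size: "10m"     # placeholder: rotate each log file at 10 MB
        max-file: "3"       # placeholder: keep at most 3 rotated files
```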

Test message output

Generated WhatsApp test messages (from the test routes) are saved under logs/generated_whatsapp_messages/ as JSON files. That directory is created by app.py; it is not part of the rotating app.log or the backup’s “application logs” list, but the backup process can include it if that path is under the same logs/ tree that gets copied.
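For illustration, saving a generated test message might look like the sketch below. The function name, the timestamped filename pattern, and the payload shape are hypothetical; only the logs/generated_whatsapp_messages/ path comes from the app.

```python
import json
import os
from datetime import datetime, timezone


def save_test_message(payload: dict, base_dir: str = "logs") -> str:
    """Save a generated test message as a timestamped JSON file (hypothetical sketch)."""
    out_dir = os.path.join(base_dir, "generated_whatsapp_messages")
    os.makedirs(out_dir, exist_ok=True)  # created on demand, like app.py does

    # Microseconds in the name keep rapid successive messages from colliding.
    name = datetime.now(timezone.utc).strftime("message_%Y%m%d_%H%M%S_%f.json")
    path = os.path.join(out_dir, name)
    with open(path, "w") as f:
        json.dump(payload, f, indent=2)
    return path
```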

Viewing and saving logs

View Docker logs (live or one-off)

From the project root (where docker-compose.yml is):
# All services, follow
docker compose logs -f

# Specific services
docker compose logs -f web
docker compose logs -f celery
docker compose logs -f beat
docker compose logs -f admin-portal

# Last N lines
docker compose logs --tail=100 web
To save to a file: e.g. docker compose logs web > web_logs.txt or docker compose logs --tail=1000 web > web_recent.txt.

View application log files (inside containers)

# Flask app log
docker compose exec web tail -f /app/logs/app.log

# Celery log (from worker or beat container)
docker compose exec celery tail -f /app/logs/celery.log
docker compose exec beat tail -f /app/logs/celery.log
To copy out: docker cp <container>:/app/logs/app.log ./saved_app.log.

Admin portal Log viewer

The Admin portal exposes /api/logs/dates, /api/logs/files, /api/logs/content, /api/logs/errors, and /api/logs/download. These read from the backup directory (/backups/logs), not directly from the running containers, so you see backup content (by date and file) and can download individual files or archives. This is useful once the daily backup has run, provided the admin-portal container has access to the same volume.
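As a rough sketch of what the dates endpoint does under the hood, assuming it simply lists the date-named entries (directories or already-compressed .tar.gz archives) under the backup root; the function name and signature are assumptions, not the portal's actual code:

```python
import os


def list_backup_dates(backup_root: str = "/backups/logs") -> list[str]:
    """Return sorted backup dates, treating <date>/ dirs and <date>.tar.gz alike."""
    if not os.path.isdir(backup_root):
        return []
    # A set deduplicates the case where both the directory and its archive exist.
    return sorted({entry.removesuffix(".tar.gz") for entry in os.listdir(backup_root)})
```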

How backup works (daily task)

  • Who runs it: Celery Beat runs the task tasks.backup_logs_task once per day at 00:00 UTC (see Scheduling for the full Beat schedule).
  • What it does: The task calls backup_all_logs() in helpers/log_collector.py. By default it backs up yesterday’s logs (target date = today − 1 day). It can also be run manually with backup_current=True to capture the current state of containers and app logs (target date = today).
  • Where it writes: Under /backups/logs/ (inside the container; in Docker Compose this is typically a host volume such as ./backups). Structure:
    • /backups/logs/<YYYY-MM-DD>/docker/ — One file per service: web.log, celery.log, beat.log, admin-portal.log, redis.log, caddy.log, db.log. Content comes from docker compose logs (for the last 24 hours in scheduled mode, or all current logs in “current” mode).
    • /backups/logs/<YYYY-MM-DD>/application/ — Copies of app.log and celery.log (and any rotated variants like app.log.YYYY-MM-DD) from the application log directory (e.g. /app/logs). The collector looks in /app/logs or the project’s logs/ directory.
  • Which services are collected: Defined in log_collector.DOCKER_SERVICES: web, celery, beat, admin-portal, redis, caddy, db. Application log files are listed in APPLICATION_LOG_FILES: app.log, celery.log.
  • After backup: For scheduled runs, the collector also runs compress_old_logs(days=7): backup directories older than 7 days are tar.gzed (and the original directory removed) to save space.
So: once per day at midnight UTC, the previous day’s Docker logs and app/celery logs are written under /backups/logs/<date>/. Older backup directories are then compressed.
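The post-backup compression step can be sketched like this. The signature and return value of compress_old_logs are assumptions, but the behavior (tar.gz date-named directories older than 7 days, then delete the originals) follows the description above.

```python
import os
import shutil
import tarfile
from datetime import date, datetime, timedelta


def compress_old_logs(backup_root: str = "/backups/logs", days: int = 7) -> list[str]:
    """Tar.gz backup date-directories older than `days` and remove the originals."""
    cutoff = date.today() - timedelta(days=days)
    compressed = []
    if not os.path.isdir(backup_root):
        return compressed
    for entry in sorted(os.listdir(backup_root)):
        path = os.path.join(backup_root, entry)
        if not os.path.isdir(path):
            continue  # already an archive, or a stray file
        try:
            entry_date = datetime.strptime(entry, "%Y-%m-%d").date()
        except ValueError:
            continue  # skip non-date directories
        if entry_date < cutoff:
            archive = path + ".tar.gz"
            with tarfile.open(archive, "w:gz") as tar:
                tar.add(path, arcname=entry)  # keep <date>/ as the top-level entry
            shutil.rmtree(path)
            compressed.append(archive)
    return compressed
```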

How logs are backed up to S3 (frequency)

  • Who runs it: Celery Beat runs tasks.upload_logs_to_s3_task once per week on Sunday at 02:00 UTC.
  • What it does: The task calls upload_logs_to_s3() in helpers/s3_uploader.py. By default it uploads the previous week (Monday–Sunday). For each day in that range it looks under /backups/logs/<date> (or <date>.tar.gz if already compressed), builds a single tar.gz if needed, and uploads it to S3. Optional: delete local backup after a successful upload (delete_local=True).
  • S3 layout: Objects are stored under a key like logs/<year>/<month>/week-<week_number>/<YYYY-MM-DD>.tar.gz (week number is a simple 7-day bucket from Jan 1). Metadata (date, year, month, week, etc.) is set on the object.
  • Configuration: S3_BUCKET_NAME must be set (e.g. in env or docker-compose) for the upload to run. The celery/beat services need AWS credentials (e.g. IAM role or env vars) so that boto3 can write to the bucket.
So: weekly (Sunday 02:00 UTC), the previous week’s daily backups are uploaded to S3; local backups can be removed after upload to save disk.
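The key layout can be illustrated with a small helper. The function name and the zero-padded month are assumptions; the week number follows the simple 7-day-bucket-from-Jan-1 rule described above (not ISO week numbering).

```python
from datetime import date


def s3_key_for(day: date) -> str:
    """Build an object key like logs/<year>/<month>/week-<n>/<YYYY-MM-DD>.tar.gz.

    Week n is a plain 7-day bucket counted from Jan 1:
    days 1-7 -> week 1, days 8-14 -> week 2, and so on.
    """
    day_of_year = day.timetuple().tm_yday
    week = (day_of_year - 1) // 7 + 1
    return f"logs/{day.year}/{day.month:02d}/week-{week}/{day.isoformat()}.tar.gz"
```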

Summary

  • Flask app log: logs/app.log (and rotated app.log.YYYY-MM-DD); TimedRotatingFileHandler at midnight, 7 days retention, plus console output.
  • Celery log: logs/celery.log (and rotated); same handler setup in the celery and beat containers.
  • Docker logs: stored by the daemon (e.g. json-file driver), per service; view with docker compose logs.
  • Daily backup: /backups/logs/<date>/docker/*.log and .../application/*.log; Celery task at 00:00 UTC collects Docker plus app/celery logs and compresses directories older than 7 days.
  • S3 backup: s3://<bucket>/logs/<year>/<month>/week-<n>/<date>.tar.gz; Celery task on Sunday 02:00 UTC uploads the previous week’s backups; needs S3_BUCKET_NAME and AWS credentials.
  • View in UI: Admin portal Logs pages read from /backups/logs (dates, files, content, download).
All backup and upload logic lives in helpers/log_collector.py and helpers/s3_uploader.py; the Celery tasks that run them are in tasks.py (see Scheduling for the Beat schedule).