bograh/db-backup-cron
PostgreSQL Backup Scheduler (Go + Cron + Docker)

Automated PostgreSQL backups using pg_dump, scheduled via cron in a tiny Go service. Ships as a Docker image for easy, reliable daily backups to a host-mounted folder.

What it does

  • Runs a daily backup at 03:00 (container timezone) using pg_dump.
  • Writes .sql dump files into ./backups on your host.
  • Appends a backup.log with success/failure entries.
  • Works with any reachable Postgres (locally, Docker, or cloud providers).
  • Optional: uploads backups to UploadThing and (optionally) deletes the local file after a successful upload.

Backup filename format:

{APP_NAME}_backup_{DB_NAME}_{YYYYMMDD_HHMMSS}.sql

Example: KOVA_backup_railway_20250829_110913.sql

Requirements

  • Docker Desktop (with Docker Compose v2: docker compose)
  • A reachable PostgreSQL instance

Optional (for local, non-Docker runs):

  • Go 1.21+ and pg_dump in PATH

Quick start (Docker Compose on Windows)

  1. Create a .env file next to docker-compose.yml:

     APP_NAME=backup
     DB_HOST=your-postgres-host
     DB_PORT=5432
     DB_NAME=your_database
     DB_USER=your_user
     DB_PASSWORD=your_password
     # Inside the container we write to /app/backups; mapped to ./backups on the host
     BACKUP_DIR=/app/backups
     # Optional: set container timezone for the cron schedule
     TZ=UTC

     # Optional: UploadThing (cloud upload)
     UPLOADTHING_ENABLED=false
     UPLOADTHING_API_KEY=your_uploadthing_api_key
     # If true, delete the local .sql file after a successful upload
     UPLOADTHING_DELETE_LOCAL=false

  2. Build and start:

     docker compose build
     docker compose up -d

  3. Verify the service and follow the container logs:

     docker compose logs -f postgres-backup-scheduler

  4. Check created files on your host:
     • Backups: ./backups/*.sql
     • Log file: ./backups/backup.log

The first backup runs immediately on start, then daily at 03:00.

Configuration

The app reads environment variables (no implicit .env loading when running the binary directly):

  • DB_HOST (required)
  • DB_PORT (default: 5432)
  • DB_NAME (required)
  • DB_USER (required)
  • DB_PASSWORD (required)
  • BACKUP_DIR (default: ./backups; in Docker use /app/backups)
  • APP_NAME (used in filename; e.g., backup)
  • TZ (optional; container timezone for the cron schedule)
  • UploadThing (optional cloud upload):
    • UPLOADTHING_ENABLED (true/false, default disabled)
    • UPLOADTHING_API_KEY (required if enabled)
    • UPLOADTHING_DELETE_LOCAL (true/false): delete local file after a successful upload

Compose mounts:

  • ./backups -> /app/backups (read/write)
  • ./logs -> /app/logs (optional; current app writes backup.log to /app/backups)

Schedule

  • Cron expression in code: 0 0 3 * * * (with seconds enabled) = every day at 03:00.
  • Timezone comes from the container (TZ). Set TZ in .env if you want local time (e.g., TZ=Europe/London).

To change the schedule, update main.go and rebuild:

_, err = c.AddFunc("0 0 3 * * *", func() {
    runBackupJob(config)
})

Restore hints

Use psql to restore a .sql dump (adjust host/port/db/user as needed):

psql -h <host> -p 5432 -U <user> -d <db> -f backups\<file.sql>

Run locally without Docker (Windows)

Make sure pg_dump is available in your PATH (from PostgreSQL client tools).

  1. Set env vars in the current shell:

     set APP_NAME=backup
     set DB_HOST=localhost
     set DB_PORT=5432
     set DB_NAME=your_database
     set DB_USER=your_user
     set DB_PASSWORD=your_password
     set BACKUP_DIR=backups

  2. Build and run:

     go build -o backup-scheduler.exe
     backup-scheduler.exe

  • Backups and backup.log will appear under %CD%\backups (or whichever BACKUP_DIR you set).
  • The first backup runs immediately, then daily at 03:00.

How it works

  • main.go reads config from environment variables.
  • performBackup calls pg_dump with PGPASSWORD injected via env.
  • Dumps are written to BACKUP_DIR; backup.log logs successes/errors with timestamps and file sizes.
  • Cron from github.com/robfig/cron/v3 schedules the job with second precision.
  • Dockerfile builds a static Go binary, then runs it in a postgres:17-alpine image (which includes pg_dump).
  • If UploadThing is enabled, performBackup loads config (LoadUploadConfig) and calls UploadWithRetry from upload.go:
    • Up to 3 retry attempts with a 30s delay are performed on failures.
    • On success, an upload entry is added to backup.log, and the local file is optionally deleted based on UPLOADTHING_DELETE_LOCAL.
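The pg_dump invocation with PGPASSWORD injected via the environment can be sketched as below. This is an assumption about the shape of the call (the function name and flag set are ours; the real performBackup may differ):

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
)

// buildDumpCmd assembles a pg_dump command, passing the password through
// the PGPASSWORD environment variable instead of the command line.
func buildDumpCmd(host, port, user, db, password, outFile string) *exec.Cmd {
	cmd := exec.Command("pg_dump",
		"-h", host, "-p", port, "-U", user,
		"-d", db, "-f", outFile)
	// Inherit the current environment and add PGPASSWORD for pg_dump.
	cmd.Env = append(os.Environ(), "PGPASSWORD="+password)
	return cmd
}

func main() {
	cmd := buildDumpCmd("localhost", "5432", "alice", "app", "secret", "/app/backups/out.sql")
	fmt.Println(cmd.Args)
}
```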

Cloud upload (UploadThing)

When UPLOADTHING_ENABLED=true, the app will upload each completed backup to UploadThing:

  • Required: UPLOADTHING_API_KEY (get from your UploadThing dashboard).
  • Upload timeout: 10 minutes (suitable for large dumps; adjust in code if needed).
  • Logging: success/failure is appended to backup.log in the same directory as local backups.
  • Local cleanup: enable UPLOADTHING_DELETE_LOCAL=true to remove the local .sql after a successful upload.
  • Retries: 3 attempts with 30 seconds between attempts.
  • API: Uses UploadThing's v6 presigned URL flow for reliable uploads.

Troubleshooting

  • No files in ./backups:
    • Ensure the service is running: docker compose ps and check logs.
    • Confirm DB_HOST, credentials, and network reachability.
    • On Windows, verify the ./backups folder exists and is shared with Docker Desktop.
  • Permission denied writing backups:
    • The container runs as a non-root user. The bind-mounted ./backups must allow writes.
  • Timezone confusion:
    • Set TZ to your desired zone so 03:00 matches your expectation.
  • pg_dump not found (local run only):
    • Install PostgreSQL client tools and add them to PATH.
  • UploadThing issues:
    • "Not found" error: Verify your UPLOADTHING_API_KEY is correct and active in your UploadThing dashboard.
    • Missing credentials: Ensure UPLOADTHING_ENABLED=true only when UPLOADTHING_API_KEY is set.
    • Network issues: Check outbound HTTPS access from the container/host to api.uploadthing.com.
    • Large files: The client uses a 10-minute timeout per request; very large dumps may need adjustment.
    • Debugging: Review backup.log for detailed upload error messages and retry status.
    • API limits: Check your UploadThing plan limits for file size and monthly usage.

Notes

  • There is no built-in retention policy; old backups accumulate. Rotate or prune files as needed (e.g., host cron or a simple script).
  • ./logs mount is optional—the app currently logs to backup.log inside BACKUP_DIR.

If you need an env-configurable schedule, retention, compression, or S3 uploads, those can be added; contributions and enhancement requests are welcome.
