diff --git a/docs/superpowers/plans/2026-03-14-setup-script.md b/docs/superpowers/plans/2026-03-14-setup-script.md
new file mode 100644
index 0000000..4b61595
--- /dev/null
+++ b/docs/superpowers/plans/2026-03-14-setup-script.md
@@ -0,0 +1,1196 @@
+# Setup Script Implementation Plan
+
+> **For agentic workers:** REQUIRED: Use superpowers:subagent-driven-development (if subagents available) or superpowers:executing-plans to implement this plan. Steps use checkbox (`- [ ]`) syntax for tracking.
+
+**Goal:** Create `setup.py` — an interactive production setup wizard that configures `.env.prod`, bootstraps OpenBao, builds Docker images, starts the stack, and verifies health.
+
+**Architecture:** Single-file Python script using only stdlib. Linear wizard flow with grouped sections. Auto-generates all secrets and numeric config; only prompts for ~6 human decisions. Integrates OpenBao bootstrap by starting its container and parsing credentials from logs.
+
+**Tech Stack:** Python 3.10+ stdlib (`secrets`, `subprocess`, `getpass`, `socket`, `pathlib`, `re`, `datetime`, `shutil`, `sys`, `signal`, `textwrap`)
+
+**Spec:** `docs/superpowers/specs/2026-03-14-setup-script-design.md`
+
+---
+
+## File Map
+
+| Action | File | Responsibility |
+|--------|------|----------------|
+| Create | `setup.py` | Interactive setup wizard (root of project) |
+| Modify | `docker-compose.yml:21,30` | Change `mikrotik` → `${POSTGRES_DB:-tod}` in default + healthcheck |
+| Modify | `docker-compose.prod.yml:68` | Change hardcoded poller DATABASE_URL to `${POLLER_DATABASE_URL}` |
+| Modify | `scripts/init-postgres.sql:6,26` | Change `mikrotik` → `tod` in GRANT statements |
+| Modify | `.env.example:17,20-22,25,51` | Change `mikrotik` → `tod` in all references + CORS comment |
+| Modify | `.env.staging.example:9,13-15,18` | Change `mikrotik` → `tod` in all references |
+| Modify | `frontend/src/routes/login.tsx:235-241` | Wrap dev hint in `import.meta.env.DEV` guard |
+
+---
+
+## Chunk 1: Database Rename & Login Fix
+
+### Task 1: Rename database from `mikrotik` to `tod`
+
+**Files:**
+- Modify: `docker-compose.yml:21,30`
+- Modify: `docker-compose.prod.yml:68`
+- Modify: `scripts/init-postgres.sql:6,26`
+- Modify: `.env.example:17,20-22,25,51`
+- Modify: `.env.staging.example:9,13-15,18`
+
+- [ ] **Step 1: Update docker-compose.yml default and healthcheck**
+
+In `docker-compose.yml`, change line 21:
+```yaml
+ POSTGRES_DB: ${POSTGRES_DB:-tod}
+```
+
+Change line 30:
+```yaml
+ test: ["CMD-SHELL", "pg_isready -U postgres -d ${POSTGRES_DB:-tod}"]
+```
+
+- [ ] **Step 2: Update docker-compose.prod.yml poller DATABASE_URL**
+
+In `docker-compose.prod.yml`, change line 68 from:
+```yaml
+ DATABASE_URL: postgres://poller_user:poller_password@postgres:5432/mikrotik
+```
+to:
+```yaml
+ DATABASE_URL: ${POLLER_DATABASE_URL:-postgres://poller_user:poller_password@postgres:5432/tod}
+```
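The `${POLLER_DATABASE_URL:-…}` form uses Compose's interpolation syntax, which follows POSIX parameter-expansion defaults. A quick shell illustration of the rule (the URL values here are just the plan's defaults):

```shell
# ${VAR:-default}: use VAR if it is set and non-empty, otherwise the default.
# Docker Compose applies the same rule when interpolating compose files.
unset POLLER_DATABASE_URL
echo "${POLLER_DATABASE_URL:-postgres://poller_user:poller_password@postgres:5432/tod}"
# prints the default URL

export POLLER_DATABASE_URL="postgres://custom"
echo "${POLLER_DATABASE_URL:-postgres://poller_user:poller_password@postgres:5432/tod}"
# prints postgres://custom
```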
+
+- [ ] **Step 3: Update init-postgres.sql**
+
+In `scripts/init-postgres.sql`, change line 6:
+```sql
+GRANT CONNECT ON DATABASE tod TO app_user;
+```
+
+Change line 26:
+```sql
+GRANT CONNECT ON DATABASE tod TO poller_user;
+```
+
+- [ ] **Step 4: Update .env.example**
+
+Replace all `mikrotik` references with `tod`, including the CORS comment on line 51:
+```
+POSTGRES_DB=tod
+DATABASE_URL=postgresql+asyncpg://postgres:postgres@postgres:5432/tod
+SYNC_DATABASE_URL=postgresql+psycopg2://postgres:postgres@postgres:5432/tod
+APP_USER_DATABASE_URL=postgresql+asyncpg://app_user:app_password@postgres:5432/tod
+POLLER_DATABASE_URL=postgres://poller_user:poller_password@postgres:5432/tod
+```
+
+And update the CORS comment on line 51:
+```
+# Prod: set to your actual domain, e.g., https://tod.yourdomain.com
+```
+
+- [ ] **Step 4b: Update .env.staging.example**
+
+Replace all `mikrotik` references with `tod` in `.env.staging.example`:
+```
+POSTGRES_DB=tod
+DATABASE_URL=postgresql+asyncpg://postgres:CHANGE_ME_STAGING@postgres:5432/tod
+SYNC_DATABASE_URL=postgresql+psycopg2://postgres:CHANGE_ME_STAGING@postgres:5432/tod
+APP_USER_DATABASE_URL=postgresql+asyncpg://app_user:CHANGE_ME_STAGING@postgres:5432/tod
+POLLER_DATABASE_URL=postgres://poller_user:poller_password@postgres:5432/tod
+```
+
+- [ ] **Step 5: Commit**
+
+```bash
+git add docker-compose.yml docker-compose.prod.yml scripts/init-postgres.sql .env.example .env.staging.example
+git commit -m "refactor: rename database from mikrotik to tod"
+```
+
+### Task 2: Hide login page dev hint in production
+
+**Files:**
+- Modify: `frontend/src/routes/login.tsx:235-241`
+
+- [ ] **Step 1: Wrap dev hint in DEV guard**
+
+In `frontend/src/routes/login.tsx`, wrap the first-run hint block (lines 235-241) in a DEV guard. Vite replaces `import.meta.env.DEV` with a compile-time constant, so the guarded markup is dropped from production bundles entirely. Keep the existing hint markup unchanged inside the guard (the element tags are elided in this plan):
+```tsx
+          {/* First-run hint (dev only) */}
+          {import.meta.env.DEV && (
+            <>
+              {/* existing hint markup, unchanged; it renders:
+                  "First time? Use the credentials from your .env file
+                  (FIRST_ADMIN_EMAIL / FIRST_ADMIN_PASSWORD)." */}
+            </>
+          )}
+```
+
+- [ ] **Step 2: Commit**
+
+```bash
+git add frontend/src/routes/login.tsx
+git commit -m "fix: hide first-run credential hint in production builds"
+```
+
+---
+
+## Chunk 2: Setup Script — Helpers & Pre-flight
+
+### Task 3: Create setup.py with helpers and pre-flight checks
+
+**Files:**
+- Create: `setup.py`
+
+- [ ] **Step 1: Write the script header, color helpers, and pre-flight checks**
+
+Create `setup.py` with:
+
+```python
+#!/usr/bin/env python3
+"""TOD Production Setup Wizard.
+
+Interactive setup script that configures .env.prod, bootstraps OpenBao,
+builds Docker images, starts the stack, and verifies service health.
+
+Usage:
+ python3 setup.py
+"""
+
+import base64
+import datetime
+import getpass
+import os
+import pathlib
+import re
+import secrets
+import shutil
+import signal
+import socket
+import subprocess
+import sys
+import textwrap
+import time
+
+# ── Constants ────────────────────────────────────────────────────────────────
+
+PROJECT_ROOT = pathlib.Path(__file__).resolve().parent
+ENV_PROD = PROJECT_ROOT / ".env.prod"
+INIT_SQL_TEMPLATE = PROJECT_ROOT / "scripts" / "init-postgres.sql"
+INIT_SQL_PROD = PROJECT_ROOT / "scripts" / "init-postgres-prod.sql"
+COMPOSE_BASE = "docker-compose.yml"
+COMPOSE_PROD = "docker-compose.prod.yml"
+COMPOSE_CMD = [
+ "docker", "compose",
+ "-f", COMPOSE_BASE,
+ "-f", COMPOSE_PROD,
+]
+
+REQUIRED_PORTS = {
+ 5432: "PostgreSQL",
+ 6379: "Redis",
+ 4222: "NATS",
+ 8001: "API",
+ 3000: "Frontend",
+ 51820: "WireGuard (UDP)",
+}
+
+
+# ── Color helpers ────────────────────────────────────────────────────────────
+
+def _supports_color() -> bool:
+ return hasattr(sys.stdout, "isatty") and sys.stdout.isatty()
+
+_COLOR = _supports_color()
+
+def _c(code: str, text: str) -> str:
+ return f"\033[{code}m{text}\033[0m" if _COLOR else text
+
+def green(t: str) -> str: return _c("32", t)
+def yellow(t: str) -> str: return _c("33", t)
+def red(t: str) -> str: return _c("31", t)
+def cyan(t: str) -> str: return _c("36", t)
+def bold(t: str) -> str: return _c("1", t)
+def dim(t: str) -> str: return _c("2", t)
+
+
+def banner(text: str) -> None:
+ width = 62
+ print()
+ print(cyan("=" * width))
+ print(cyan(f" {text}"))
+ print(cyan("=" * width))
+ print()
+
+
+def section(text: str) -> None:
+ print()
+ print(bold(f"--- {text} ---"))
+ print()
+
+
+def ok(text: str) -> None:
+ print(f" {green('✓')} {text}")
+
+
+def warn(text: str) -> None:
+ print(f" {yellow('!')} {text}")
+
+
+def fail(text: str) -> None:
+ print(f" {red('✗')} {text}")
+
+
+def info(text: str) -> None:
+ print(f" {dim('·')} {text}")
+
+
+# ── Input helpers ────────────────────────────────────────────────────────────
+
+def ask(prompt: str, default: str = "", required: bool = False,
+ secret: bool = False, validate=None) -> str:
+ """Prompt the user for input with optional default, validation, and secret mode."""
+ suffix = f" [{default}]" if default else ""
+ full_prompt = f" {prompt}{suffix}: "
+
+ while True:
+ if secret:
+ value = getpass.getpass(full_prompt)
+ else:
+ value = input(full_prompt)
+
+ value = value.strip()
+ if not value and default:
+ value = default
+
+ if required and not value:
+ warn("This field is required.")
+ continue
+
+ if validate:
+ error = validate(value)
+ if error:
+ warn(error)
+ continue
+
+ return value
+
+
+def ask_yes_no(prompt: str, default: bool = False) -> bool:
+ """Ask a yes/no question."""
+ hint = "Y/n" if default else "y/N"
+ while True:
+ answer = input(f" {prompt} [{hint}]: ").strip().lower()
+ if not answer:
+ return default
+ if answer in ("y", "yes"):
+ return True
+ if answer in ("n", "no"):
+ return False
+ warn("Please enter y or n.")
+
+
+def mask_secret(value: str) -> str:
+ """Show first 8 chars of a secret, mask the rest."""
+ if len(value) <= 8:
+ return value
+ return value[:8] + "..."
+
+
+# ── Validators ───────────────────────────────────────────────────────────────
+
+def validate_password_strength(value: str) -> str | None:
+ if len(value) < 12:
+ return "Password must be at least 12 characters."
+ return None
+
+
+def validate_email(value: str) -> str | None:
+ if not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", value):
+ return "Please enter a valid email address."
+ return None
+
+
+def validate_domain(value: str) -> str | None:
+ # Strip protocol if provided
+ cleaned = re.sub(r"^https?://", "", value).rstrip("/")
+ if not re.match(r"^[a-zA-Z0-9]([a-zA-Z0-9\-]*\.)+[a-zA-Z]{2,}$", cleaned):
+ return "Please enter a valid domain (e.g. tod.example.com)."
+ return None
+
+
+# ── System checks ────────────────────────────────────────────────────────────
+
+def check_python_version() -> bool:
+ if sys.version_info < (3, 10):
+ fail(f"Python 3.10+ required, found {sys.version}")
+ return False
+ ok(f"Python {sys.version_info.major}.{sys.version_info.minor}")
+ return True
+
+
+def check_docker() -> bool:
+ try:
+ result = subprocess.run(
+ ["docker", "info"],
+ capture_output=True, text=True, timeout=10,
+ )
+ if result.returncode != 0:
+ fail("Docker is not running. Start Docker and try again.")
+ return False
+ ok("Docker Engine")
+ except FileNotFoundError:
+ fail("Docker is not installed.")
+ return False
+ except subprocess.TimeoutExpired:
+ fail("Docker is not responding.")
+ return False
+
+ try:
+ result = subprocess.run(
+ ["docker", "compose", "version"],
+ capture_output=True, text=True, timeout=10,
+ )
+ if result.returncode != 0:
+ fail("Docker Compose v2 is not available.")
+ return False
+ version_match = re.search(r"v?(\d+\.\d+)", result.stdout)
+ version_str = version_match.group(1) if version_match else "unknown"
+ ok(f"Docker Compose v{version_str}")
+ except FileNotFoundError:
+ fail("Docker Compose is not installed.")
+ return False
+
+ return True
+
+
+def check_ram() -> None:
+ try:
+ if sys.platform == "darwin":
+ result = subprocess.run(
+ ["sysctl", "-n", "hw.memsize"],
+ capture_output=True, text=True, timeout=5,
+ )
+ ram_bytes = int(result.stdout.strip())
+ else:
+ with open("/proc/meminfo") as f:
+ for line in f:
+ if line.startswith("MemTotal:"):
+ ram_bytes = int(line.split()[1]) * 1024
+ break
+ else:
+ return
+
+ ram_gb = ram_bytes / (1024 ** 3)
+ if ram_gb < 4:
+ warn(f"Only {ram_gb:.1f} GB RAM detected. 4 GB+ recommended for builds.")
+ else:
+ ok(f"{ram_gb:.1f} GB RAM")
+ except Exception:
+ info("Could not detect RAM — skipping check")
+
+
+def check_ports() -> None:
+ for port, service in REQUIRED_PORTS.items():
+ try:
+ with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
+ s.settimeout(1)
+ result = s.connect_ex(("127.0.0.1", port))
+ if result == 0:
+ warn(f"Port {port} ({service}) is already in use")
+ else:
+ ok(f"Port {port} ({service}) is free")
+ except Exception:
+ info(f"Could not check port {port} ({service})")
+
+
+def check_existing_env() -> str:
+ """Check for existing .env.prod. Returns 'overwrite', 'backup', or 'abort'."""
+ if not ENV_PROD.exists():
+ return "overwrite"
+
+ print()
+ warn(f"Existing .env.prod found at {ENV_PROD}")
+ print()
+ print(" What would you like to do?")
+ print(f" {bold('1)')} Overwrite it")
+ print(f" {bold('2)')} Back it up and create a new one")
+ print(f" {bold('3)')} Abort")
+ print()
+
+ while True:
+ choice = input(" Choice [1/2/3]: ").strip()
+ if choice == "1":
+ return "overwrite"
+ elif choice == "2":
+ ts = datetime.datetime.now().strftime("%Y%m%dT%H%M%S")
+ backup = ENV_PROD.with_name(f".env.prod.backup.{ts}")
+ shutil.copy2(ENV_PROD, backup)
+ ok(f"Backed up to {backup.name}")
+ return "overwrite"
+ elif choice == "3":
+ return "abort"
+ else:
+ warn("Please enter 1, 2, or 3.")
+
+
+def preflight() -> bool:
+ """Run all pre-flight checks. Returns True if OK to proceed."""
+ banner("TOD Production Setup")
+ print(" This wizard will configure your production environment,")
+ print(" generate secrets, bootstrap OpenBao, build images, and")
+ print(" start the stack.")
+ print()
+
+ section("Pre-flight Checks")
+
+ if not check_python_version():
+ return False
+ if not check_docker():
+ return False
+ check_ram()
+ check_ports()
+
+ action = check_existing_env()
+ if action == "abort":
+ print()
+ info("Setup aborted.")
+ return False
+
+ return True
+```
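As a quick sanity check, the masking and validation helpers behave as follows. This sketch copies the helper bodies from above so it runs standalone, without importing `setup.py`:

```python
import re

def mask_secret(value: str) -> str:
    # Same logic as setup.py: show first 8 chars, mask the rest.
    if len(value) <= 8:
        return value
    return value[:8] + "..."

def validate_email(value: str):
    if not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", value):
        return "Please enter a valid email address."
    return None

def validate_password_strength(value: str):
    if len(value) < 12:
        return "Password must be at least 12 characters."
    return None

assert mask_secret("short") == "short"                  # short values pass through
assert mask_secret("abcdefghij") == "abcdefgh..."       # long values are truncated
assert validate_email("admin@example.com") is None      # None means valid
assert validate_email("not-an-email") is not None
assert validate_password_strength("x" * 12) is None
assert validate_password_strength("tooshort") is not None
```

Note the validator convention: validators return an error string on failure and `None` on success, which is what `ask()` expects.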
+
+- [ ] **Step 2: Make executable**
+
+```bash
+chmod +x setup.py
+```
+
+- [ ] **Step 3: Verify script loads without errors**
+
+Run from the project root: `python3 -c "import setup; print('OK')"`
+Expected: `OK`
+
+- [ ] **Step 4: Commit**
+
+```bash
+git add setup.py
+git commit -m "feat(setup): add helpers, validators, and pre-flight checks"
+```
+
+---
+
+## Chunk 3: Setup Script — Wizard Sections & Env Generation
+
+### Task 4: Add wizard sections and .env.prod generation
+
+**Files:**
+- Modify: `setup.py`
+
+- [ ] **Step 1: Add the wizard configuration functions**
+
+Append to `setup.py` before the end:
+
+```python
+# ── Secret generation ────────────────────────────────────────────────────────
+
+def generate_jwt_secret() -> str:
+ return secrets.token_urlsafe(64)
+
+
+def generate_encryption_key() -> str:
+ return base64.b64encode(secrets.token_bytes(32)).decode()
+
+
+def generate_db_password() -> str:
+ return secrets.token_urlsafe(24)
+
+
+def generate_admin_password() -> str:
+ return secrets.token_urlsafe(18)
+
+
+# ── Wizard sections ─────────────────────────────────────────────────────────
+
+def wizard_database(config: dict) -> None:
+ section("Database")
+ info("PostgreSQL superuser password — used for migrations and admin operations.")
+ info("The app and poller service passwords will be auto-generated.")
+ print()
+
+ config["postgres_password"] = ask(
+ "PostgreSQL superuser password",
+ required=True,
+ secret=True,
+ validate=validate_password_strength,
+ )
+
+ config["app_user_password"] = generate_db_password()
+ config["poller_user_password"] = generate_db_password()
+ config["postgres_db"] = "tod"
+
+ ok("Database passwords configured")
+ info(f"app_user password: {mask_secret(config['app_user_password'])}")
+ info(f"poller_user password: {mask_secret(config['poller_user_password'])}")
+
+
+def wizard_security(config: dict) -> None:
+ section("Security")
+ info("Auto-generating cryptographic keys...")
+ print()
+
+ config["jwt_secret"] = generate_jwt_secret()
+ config["encryption_key"] = generate_encryption_key()
+
+ ok("JWT signing key generated")
+ ok("Credential encryption key generated")
+ print()
+    warn("Both keys are written to .env.prod; back that file up securely.")
+    warn("Encrypted credentials cannot be recovered if the keys are lost.")
+    info(f"JWT_SECRET_KEY={mask_secret(config['jwt_secret'])}")
+    info(f"CREDENTIAL_ENCRYPTION_KEY={mask_secret(config['encryption_key'])}")
+
+
+def wizard_admin(config: dict) -> None:
+ section("Admin Account")
+ info("The first admin account is created on initial startup.")
+ print()
+
+ config["admin_email"] = ask(
+ "Admin email",
+ default="admin@the-other-dude.dev",
+ required=True,
+ validate=validate_email,
+ )
+
+ print()
+ info("Enter a password or press Enter to auto-generate one.")
+ password = ask("Admin password", secret=True)
+
+ if password:
+ error = validate_password_strength(password)
+        if error:
+            warn(error)
+            # ask() re-prompts until the validator passes
+            password = ask("Admin password", secret=True, required=True,
+                           validate=validate_password_strength)
+ config["admin_password"] = password
+ config["admin_password_generated"] = False
+ else:
+ config["admin_password"] = generate_admin_password()
+ config["admin_password_generated"] = True
+ ok(f"Generated password: {bold(config['admin_password'])}")
+ warn("Save this now — it will not be shown again after setup.")
+
+
+def wizard_email(config: dict) -> None:
+ section("Email (SMTP)")
+ info("Email is used for password reset links.")
+ print()
+
+ if not ask_yes_no("Configure SMTP now?", default=False):
+ config["smtp_configured"] = False
+ info("Skipped — you can re-run setup.py later to configure email.")
+ return
+
+ config["smtp_configured"] = True
+ config["smtp_host"] = ask("SMTP host", required=True)
+ config["smtp_port"] = ask("SMTP port", default="587")
+ config["smtp_user"] = ask("SMTP username (optional)")
+ config["smtp_password"] = ask("SMTP password (optional)", secret=True) if config["smtp_user"] else ""
+ config["smtp_from"] = ask("From address", required=True, validate=validate_email)
+ config["smtp_tls"] = ask_yes_no("Use TLS?", default=True)
+
+
+def wizard_domain(config: dict) -> None:
+ section("Web / Domain")
+ info("Your production domain, used for CORS and email links.")
+ print()
+
+ raw = ask("Production domain (e.g. tod.example.com)", required=True, validate=validate_domain)
+ domain = re.sub(r"^https?://", "", raw).rstrip("/")
+ config["domain"] = domain
+ config["app_base_url"] = f"https://{domain}"
+ config["cors_origins"] = f"https://{domain}"
+
+ ok(f"APP_BASE_URL=https://{domain}")
+ ok(f"CORS_ORIGINS=https://{domain}")
+
+
+# ── Summary ──────────────────────────────────────────────────────────────────
+
+def show_summary(config: dict) -> bool:
+ banner("Configuration Summary")
+
+ print(f" {bold('Database')}")
+ print(f" POSTGRES_DB = {config['postgres_db']}")
+ print(f" POSTGRES_PASSWORD = {mask_secret(config['postgres_password'])}")
+ print(f" app_user password = {mask_secret(config['app_user_password'])}")
+ print(f" poller_user password = {mask_secret(config['poller_user_password'])}")
+ print()
+
+ print(f" {bold('Security')}")
+ print(f" JWT_SECRET_KEY = {mask_secret(config['jwt_secret'])}")
+ print(f" ENCRYPTION_KEY = {mask_secret(config['encryption_key'])}")
+ print()
+
+ print(f" {bold('Admin Account')}")
+ print(f" Email = {config['admin_email']}")
+ print(f" Password = {'(auto-generated)' if config.get('admin_password_generated') else mask_secret(config['admin_password'])}")
+ print()
+
+ print(f" {bold('Email')}")
+ if config.get("smtp_configured"):
+ print(f" SMTP_HOST = {config['smtp_host']}")
+ print(f" SMTP_PORT = {config['smtp_port']}")
+ print(f" SMTP_FROM = {config['smtp_from']}")
+ print(f" SMTP_TLS = {config['smtp_tls']}")
+ else:
+ print(f" {dim('(not configured)')}")
+ print()
+
+ print(f" {bold('Web')}")
+ print(f" Domain = {config['domain']}")
+ print(f" APP_BASE_URL = {config['app_base_url']}")
+ print()
+
+ print(f" {bold('OpenBao')}")
+ print(f" {dim('(will be captured automatically during bootstrap)')}")
+ print()
+
+ return ask_yes_no("Write .env.prod with these settings?", default=True)
+```
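The generators above have predictable output shapes, which helps when eyeballing a generated `.env.prod`. This sketch restates the generation calls and checks their lengths:

```python
import base64
import secrets
import string

jwt_secret = secrets.token_urlsafe(64)  # 64 random bytes, URL-safe base64, padding stripped
encryption_key = base64.b64encode(secrets.token_bytes(32)).decode()  # 32 bytes of key material

assert len(jwt_secret) == 86        # ceil(64 * 8 / 6) chars once '=' padding is removed
assert len(encryption_key) == 44    # 4 * ceil(32 / 3) chars, with one '=' of padding
assert encryption_key.endswith("=")

# token_urlsafe draws only from A-Za-z0-9_- so values paste into env files unquoted.
allowed = set(string.ascii_letters + string.digits + "-_")
assert set(jwt_secret) <= allowed
```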
+
+- [ ] **Step 2: Add the .env.prod writer and init SQL generator**
+
+Append to `setup.py`:
+
+```python
+# ── File writers ─────────────────────────────────────────────────────────────
+
+def write_env_prod(config: dict) -> None:
+ """Write the .env.prod file."""
+ db = config["postgres_db"]
+ pg_pw = config["postgres_password"]
+ app_pw = config["app_user_password"]
+ poll_pw = config["poller_user_password"]
+ ts = datetime.datetime.now().isoformat(timespec="seconds")
+
+ smtp_block = ""
+ if config.get("smtp_configured"):
+ smtp_block = f"""\
+SMTP_HOST={config['smtp_host']}
+SMTP_PORT={config['smtp_port']}
+SMTP_USER={config.get('smtp_user', '')}
+SMTP_PASSWORD={config.get('smtp_password', '')}
+SMTP_USE_TLS={'true' if config.get('smtp_tls') else 'false'}
+SMTP_FROM_ADDRESS={config['smtp_from']}"""
+ else:
+ smtp_block = """\
+# Email not configured — re-run setup.py to add SMTP
+SMTP_HOST=
+SMTP_PORT=587
+SMTP_USER=
+SMTP_PASSWORD=
+SMTP_USE_TLS=true
+SMTP_FROM_ADDRESS=noreply@example.com"""
+
+ content = f"""\
+# ============================================================
+# TOD Production Environment — generated by setup.py
+# Generated: {ts}
+# ============================================================
+
+# --- Database ---
+POSTGRES_DB={db}
+POSTGRES_USER=postgres
+POSTGRES_PASSWORD={pg_pw}
+DATABASE_URL=postgresql+asyncpg://postgres:{pg_pw}@postgres:5432/{db}
+SYNC_DATABASE_URL=postgresql+psycopg2://postgres:{pg_pw}@postgres:5432/{db}
+APP_USER_DATABASE_URL=postgresql+asyncpg://app_user:{app_pw}@postgres:5432/{db}
+POLLER_DATABASE_URL=postgres://poller_user:{poll_pw}@postgres:5432/{db}
+
+# --- Security ---
+JWT_SECRET_KEY={config['jwt_secret']}
+CREDENTIAL_ENCRYPTION_KEY={config['encryption_key']}
+
+# --- OpenBao (KMS) ---
+OPENBAO_ADDR=http://openbao:8200
+OPENBAO_TOKEN=PLACEHOLDER_RUN_SETUP
+BAO_UNSEAL_KEY=PLACEHOLDER_RUN_SETUP
+
+# --- Admin Bootstrap ---
+FIRST_ADMIN_EMAIL={config['admin_email']}
+FIRST_ADMIN_PASSWORD={config['admin_password']}
+
+# --- Email ---
+{smtp_block}
+
+# --- Web ---
+APP_BASE_URL={config['app_base_url']}
+CORS_ORIGINS={config['cors_origins']}
+
+# --- Application ---
+ENVIRONMENT=production
+LOG_LEVEL=info
+DEBUG=false
+APP_NAME=TOD - The Other Dude
+
+# --- Storage ---
+GIT_STORE_PATH=/data/git-store
+FIRMWARE_CACHE_DIR=/data/firmware-cache
+WIREGUARD_CONFIG_PATH=/data/wireguard
+WIREGUARD_GATEWAY=wireguard
+CONFIG_RETENTION_DAYS=90
+
+# --- Redis & NATS ---
+REDIS_URL=redis://redis:6379/0
+NATS_URL=nats://nats:4222
+
+# --- Poller ---
+POLL_INTERVAL_SECONDS=60
+CONNECTION_TIMEOUT_SECONDS=10
+COMMAND_TIMEOUT_SECONDS=30
+
+# --- Remote Access ---
+TUNNEL_PORT_MIN=49000
+TUNNEL_PORT_MAX=49100
+TUNNEL_IDLE_TIMEOUT=300
+SSH_RELAY_PORT=8080
+SSH_IDLE_TIMEOUT=900
+
+# --- Config Backup ---
+CONFIG_BACKUP_INTERVAL=21600
+CONFIG_BACKUP_MAX_CONCURRENT=10
+"""
+
+ ENV_PROD.write_text(content)
+ ok(f"Wrote {ENV_PROD.name}")
+
+
+def write_init_sql_prod(config: dict) -> None:
+ """Generate init-postgres-prod.sql with production passwords."""
+ app_pw = config["app_user_password"]
+ poll_pw = config["poller_user_password"]
+ db = config["postgres_db"]
+
+ content = f"""\
+-- Production database init — generated by setup.py
+-- Passwords match those in .env.prod
+
+DO $$
+BEGIN
+ IF NOT EXISTS (SELECT FROM pg_catalog.pg_roles WHERE rolname = 'app_user') THEN
+ CREATE ROLE app_user WITH LOGIN PASSWORD '{app_pw}' NOSUPERUSER NOCREATEDB NOCREATEROLE;
+ END IF;
+END
+$$;
+
+GRANT CONNECT ON DATABASE {db} TO app_user;
+GRANT USAGE ON SCHEMA public TO app_user;
+
+DO $$
+BEGIN
+ IF NOT EXISTS (SELECT FROM pg_catalog.pg_roles WHERE rolname = 'poller_user') THEN
+ CREATE ROLE poller_user WITH LOGIN PASSWORD '{poll_pw}' NOSUPERUSER NOCREATEDB NOCREATEROLE BYPASSRLS;
+ END IF;
+END
+$$;
+
+GRANT CONNECT ON DATABASE {db} TO poller_user;
+GRANT USAGE ON SCHEMA public TO poller_user;
+"""
+
+ INIT_SQL_PROD.write_text(content)
+ ok(f"Wrote {INIT_SQL_PROD.name}")
+```
+
+- [ ] **Step 3: Verify script still loads**
+
+Run from the project root: `python3 -c "import setup; print('OK')"`
+Expected: `OK`
+
+- [ ] **Step 4: Commit**
+
+```bash
+git add setup.py
+git commit -m "feat(setup): add wizard sections and env file generation"
+```
+
+---
+
+## Chunk 4: Setup Script — OpenBao Bootstrap, Build, Start, Health Check
+
+### Task 5: Add OpenBao bootstrap, image builds, stack start, and health checks
+
+**Files:**
+- Modify: `setup.py`
+
+- [ ] **Step 1: Add OpenBao bootstrap function**
+
+Append to `setup.py`:
+
+```python
+# ── Docker operations ────────────────────────────────────────────────────────
+
+def run_compose(*args, check: bool = True, capture: bool = False,
+ timeout: int = 600) -> subprocess.CompletedProcess:
+ """Run a docker compose command with the prod overlay."""
+ cmd = COMPOSE_CMD + ["--env-file", str(ENV_PROD)] + list(args)
+ return subprocess.run(
+ cmd,
+ capture_output=capture,
+ text=True,
+ timeout=timeout,
+ check=check,
+ cwd=PROJECT_ROOT,
+ )
+
+
+def bootstrap_openbao(config: dict) -> bool:
+ """Start OpenBao, capture credentials, update .env.prod."""
+ section("OpenBao Bootstrap")
+ info("Starting PostgreSQL and OpenBao containers...")
+
+ try:
+ run_compose("up", "-d", "postgres", "openbao")
+ except subprocess.CalledProcessError as e:
+ fail("Failed to start OpenBao containers.")
+ info(str(e))
+ return False
+
+ info("Waiting for OpenBao to initialize (up to 60s)...")
+
+ # Wait for the container to be healthy
+ deadline = time.time() + 60
+ healthy = False
+ while time.time() < deadline:
+ result = subprocess.run(
+ ["docker", "inspect", "--format", "{{.State.Health.Status}}", "tod_openbao"],
+ capture_output=True, text=True, timeout=10,
+ )
+ status = result.stdout.strip()
+ if status == "healthy":
+ healthy = True
+ break
+ time.sleep(2)
+
+ if not healthy:
+ fail("OpenBao did not become healthy within 60 seconds.")
+ warn("Your .env.prod has placeholder tokens. To fix manually:")
+ info(" docker compose logs openbao")
+ info(" Look for BAO_UNSEAL_KEY and OPENBAO_TOKEN lines")
+ info(" Update .env.prod with those values")
+ return False
+
+ ok("OpenBao is healthy")
+
+ # Parse credentials from container logs
+ info("Capturing OpenBao credentials from logs...")
+ result = subprocess.run(
+ ["docker", "compose", "-f", COMPOSE_BASE, "-f", COMPOSE_PROD, "logs", "openbao"],
+ capture_output=True, text=True, timeout=30, cwd=PROJECT_ROOT,
+ )
+
+ logs = result.stdout + result.stderr
+ unseal_match = re.search(r"BAO_UNSEAL_KEY=(\S+)", logs)
+ token_match = re.search(r"OPENBAO_TOKEN=(\S+)", logs)
+
+ if unseal_match and token_match:
+ unseal_key = unseal_match.group(1)
+ root_token = token_match.group(1)
+
+ # Update .env.prod
+ env_content = ENV_PROD.read_text()
+ env_content = env_content.replace("OPENBAO_TOKEN=PLACEHOLDER_RUN_SETUP",
+ f"OPENBAO_TOKEN={root_token}")
+ env_content = env_content.replace("BAO_UNSEAL_KEY=PLACEHOLDER_RUN_SETUP",
+ f"BAO_UNSEAL_KEY={unseal_key}")
+ ENV_PROD.write_text(env_content)
+
+ ok("OpenBao credentials captured and saved to .env.prod")
+ info(f"OPENBAO_TOKEN={mask_secret(root_token)}")
+ info(f"BAO_UNSEAL_KEY={mask_secret(unseal_key)}")
+ return True
+ else:
+ # OpenBao was already initialized — check if .env.prod has real values
+ env_content = ENV_PROD.read_text()
+ if "PLACEHOLDER_RUN_SETUP" in env_content:
+ warn("Could not find credentials in logs (OpenBao may already be initialized).")
+ warn("Check 'docker compose logs openbao' and update .env.prod manually.")
+ return False
+ else:
+ ok("OpenBao already initialized — existing credentials in .env.prod")
+ return True
+```
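The log-parsing step can be exercised against a synthetic log excerpt. The exact line format that init.sh prints is an assumption here; only the `KEY=value` shape matters to the regexes:

```python
import re

# Hypothetical log excerpt in the KEY=value shape the regexes expect.
logs = """\
openbao  | Initializing OpenBao...
openbao  | BAO_UNSEAL_KEY=abc123UnsealKeyExample
openbao  | OPENBAO_TOKEN=s.exampleRootToken456
openbao  | Unseal complete.
"""

unseal_match = re.search(r"BAO_UNSEAL_KEY=(\S+)", logs)
token_match = re.search(r"OPENBAO_TOKEN=(\S+)", logs)

# \S+ stops at whitespace, so trailing log text on the same line is excluded.
assert unseal_match and unseal_match.group(1) == "abc123UnsealKeyExample"
assert token_match and token_match.group(1) == "s.exampleRootToken456"
```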
+
+- [ ] **Step 2: Add image build function**
+
+Append to `setup.py`:
+
+```python
+def build_images() -> bool:
+ """Build Docker images one at a time to avoid OOM."""
+ section("Building Images")
+ info("Building images sequentially to avoid memory issues...")
+ print()
+
+ services = ["api", "poller", "frontend", "winbox-worker"]
+
+ for i, service in enumerate(services, 1):
+ info(f"[{i}/{len(services)}] Building {service}...")
+ try:
+ run_compose("build", service, timeout=900)
+ ok(f"{service} built successfully")
+ except subprocess.CalledProcessError:
+ fail(f"Failed to build {service}")
+ print()
+ warn("To retry this build:")
+ info(f" docker compose -f {COMPOSE_BASE} -f {COMPOSE_PROD} build {service}")
+ return False
+ except subprocess.TimeoutExpired:
+ fail(f"Build of {service} timed out (15 min)")
+ return False
+
+ print()
+ ok("All images built successfully")
+ return True
+```
+
+- [ ] **Step 3: Add stack start and health check functions**
+
+Append to `setup.py`:
+
+```python
+def start_stack() -> bool:
+ """Start the full stack."""
+ section("Starting Stack")
+ info("Bringing up all services...")
+
+ try:
+ run_compose("up", "-d")
+ ok("Stack started")
+ return True
+ except subprocess.CalledProcessError as e:
+ fail("Failed to start stack")
+ info(str(e))
+ return False
+
+
+def health_check(config: dict) -> None:
+ """Poll service health for up to 60 seconds."""
+ section("Health Check")
+ info("Checking service health (up to 60s)...")
+ print()
+
+ services = [
+ ("tod_postgres", "PostgreSQL"),
+ ("tod_redis", "Redis"),
+ ("tod_nats", "NATS"),
+ ("tod_openbao", "OpenBao"),
+ ("tod_api", "API"),
+ ("tod_poller", "Poller"),
+ ("tod_frontend", "Frontend"),
+ ("tod_winbox_worker", "WinBox Worker"),
+ ]
+
+ deadline = time.time() + 60
+ pending = dict(services)
+
+ while pending and time.time() < deadline:
+ for container, label in list(pending.items()):
+ try:
+ result = subprocess.run(
+ ["docker", "inspect", "--format",
+ "{{if .State.Health}}{{.State.Health.Status}}{{else}}{{.State.Status}}{{end}}",
+ container],
+ capture_output=True, text=True, timeout=5,
+ )
+ status = result.stdout.strip()
+ if status in ("healthy", "running"):
+ ok(f"{label}: {status}")
+ del pending[container]
+ except Exception:
+ pass
+
+ if pending:
+ time.sleep(3)
+
+ for container, label in pending.items():
+ fail(f"{label}: not healthy")
+            info(f"  Check logs: docker compose logs {container.removeprefix('tod_').replace('_', '-')}")
+
+ # Final summary
+ print()
+ if not pending:
+ banner("Setup Complete!")
+ print(f" {bold('Access your instance:')}")
+ print(f" URL: {green(config['app_base_url'])}")
+ print(f" Email: {config['admin_email']}")
+ if config.get("admin_password_generated"):
+ print(f" Password: {bold(config['admin_password'])}")
+ else:
+ print(f" Password: (the password you entered)")
+ print()
+ info("Change the admin password after your first login.")
+ else:
+ warn("Some services are not healthy. Check the logs above.")
+ info(f" docker compose -f {COMPOSE_BASE} -f {COMPOSE_PROD} logs")
+```
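The pending-dict polling pattern in `health_check` can be tested in isolation by injecting a fake probe in place of `docker inspect` (the names and probe function here are illustrative):

```python
import time

def wait_until_up(containers, probe, deadline_s=1.0, poll_s=0.01):
    """Generic form of the health_check loop: drop containers from
    pending once probe() reports them healthy or running."""
    pending = set(containers)
    deadline = time.time() + deadline_s
    while pending and time.time() < deadline:
        for c in list(pending):
            if probe(c) in ("healthy", "running"):
                pending.discard(c)
        if pending:
            time.sleep(poll_s)
    return pending  # containers that never came up

attempts = {"api": 0}
def probe(name):
    # "api" becomes healthy on the third poll; everything else is running.
    if name == "api":
        attempts["api"] += 1
        return "healthy" if attempts["api"] >= 3 else "starting"
    return "running"

assert wait_until_up(["api", "frontend"], probe) == set()
assert wait_until_up(["dead"], lambda _: "exited", deadline_s=0.05) == {"dead"}
```

Returning the still-pending set (rather than printing inside the loop) is what makes the pattern testable; `health_check` inlines the same loop and reports as it goes.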
+
+- [ ] **Step 4: Add the main function and signal handler**
+
+Append to `setup.py`:
+
+```python
+# ── Main ─────────────────────────────────────────────────────────────────────
+
+def main() -> int:
+ # Graceful Ctrl+C
+ env_written = False
+
+ def handle_sigint(sig, frame):
+ nonlocal env_written
+ print()
+ if not env_written:
+ info("Aborted before writing .env.prod — no files changed.")
+ else:
+ warn(f".env.prod was already written to {ENV_PROD}")
+ info("OpenBao tokens may still be placeholders if bootstrap didn't complete.")
+ sys.exit(1)
+
+ signal.signal(signal.SIGINT, handle_sigint)
+
+ os.chdir(PROJECT_ROOT)
+
+ # Phase 1: Pre-flight
+ if not preflight():
+ return 1
+
+ # Phase 2: Wizard
+ config: dict = {}
+ wizard_database(config)
+ wizard_security(config)
+ wizard_admin(config)
+ wizard_email(config)
+ wizard_domain(config)
+
+ # Summary
+ if not show_summary(config):
+ info("Setup cancelled.")
+ return 1
+
+ # Phase 3: Write files
+ section("Writing Configuration")
+ write_env_prod(config)
+ write_init_sql_prod(config)
+ env_written = True
+
+    # Phase 4: OpenBao
+    if not bootstrap_openbao(config):
+        warn("OpenBao bootstrap incomplete; .env.prod still has placeholder credentials.")
+
+ # Phase 5: Build
+ if not build_images():
+ warn("Fix the build error and re-run setup.py to continue.")
+ return 1
+
+ # Phase 6: Start
+ if not start_stack():
+ return 1
+
+ # Phase 7: Health
+ health_check(config)
+
+ return 0
+
+
+if __name__ == "__main__":
+ sys.exit(main())
+```
+
+- [ ] **Step 5: Verify complete script loads and has main**
+
+Run from the project root: `python3 -c "from setup import main; print('OK')"`
+Expected: `OK`
+
+- [ ] **Step 6: Commit**
+
+```bash
+git add setup.py
+git commit -m "feat(setup): add OpenBao bootstrap, builds, start, and health checks"
+```
+
+---
+
+## Chunk 5: Docker Compose — Mount Production Init SQL
+
+### Task 6: Mount init-postgres-prod.sql in production compose
+
+**Files:**
+- Modify: `docker-compose.prod.yml`
+
+- [ ] **Step 1: Add postgres volume override for prod init SQL**
+
+In `docker-compose.prod.yml`, add a `postgres` service override to mount the production init SQL instead of the dev one. Placement within the `services:` block doesn't matter; before the `api` service is conventional:
+
+```yaml
+ postgres:
+ volumes:
+ - ./docker-data/postgres:/var/lib/postgresql/data
+ - ./scripts/init-postgres-prod.sql:/docker-entrypoint-initdb.d/init.sql:ro
+ healthcheck:
+ test: ["CMD-SHELL", "pg_isready -U postgres -d ${POSTGRES_DB:-tod}"]
+ interval: 5s
+ timeout: 5s
+ retries: 5
+```
+
+Compose v2 merges `volumes` entries by container target path: assuming the base `docker-compose.yml` mounts the dev init SQL at `/docker-entrypoint-initdb.d/init.sql`, this entry replaces it with the prod init SQL (generated passwords), while the data-volume mount carries over unchanged.
+
+- [ ] **Step 2: Commit**
+
+```bash
+git add docker-compose.prod.yml
+git commit -m "feat(setup): mount production init SQL and use env var for healthcheck"
+```
+
+---
+
+## Chunk 6: Final Integration
+
+### Task 7: End-to-end verification
+
+- [ ] **Step 1: Verify all modified files are consistent**
+
+Run these checks:
+```bash
+# Ensure no remaining 'mikrotik' references in key files
+grep -r "mikrotik" docker-compose.yml docker-compose.prod.yml scripts/init-postgres.sql .env.example
+# Expected: no output
+
+# Verify setup.py syntax
+python3 -m py_compile setup.py
+# Expected: no output (success)
+
+# Verify login.tsx has the DEV guard
+grep -A2 "import.meta.env.DEV" frontend/src/routes/login.tsx
+# Expected: shows the DEV-gated hint block
+```
+
+- [ ] **Step 2: Verify no remaining mikrotik references anywhere**
+
+```bash
+grep -r "mikrotik" docker-compose*.yml scripts/init-postgres.sql .env.example .env.staging.example 2>/dev/null || echo "All clear"
+```
+
+Expected: `All clear` (no output from grep)
diff --git a/docs/superpowers/specs/2026-03-14-setup-script-design.md b/docs/superpowers/specs/2026-03-14-setup-script-design.md
index 2d292fd..68bc1cf 100644
--- a/docs/superpowers/specs/2026-03-14-setup-script-design.md
+++ b/docs/superpowers/specs/2026-03-14-setup-script-design.md
@@ -181,7 +181,7 @@ CONFIG_BACKUP_MAX_CONCURRENT=10
### Phase 4: OpenBao Bootstrap
1. Start postgres and openbao containers only: `docker compose -f docker-compose.yml -f docker-compose.prod.yml --env-file .env.prod up -d postgres openbao`
-2. Wait for openbao container to be healthy (timeout 30s)
+2. Wait for openbao container to be healthy (timeout 60s)
3. Run `docker compose logs openbao 2>&1` and parse the `OPENBAO_TOKEN=` and `BAO_UNSEAL_KEY=` lines using regex (init.sh prints these to stdout during container startup, which is captured in Docker logs)
4. Update `.env.prod` by replacing the `PLACEHOLDER_RUN_SETUP` values with the captured credentials
5. On failure: `.env.prod` retains placeholders, print instructions for manual capture via `docker compose logs openbao`