This guide walks through running Draft’n run locally: start the Docker infrastructure services, bring up a local Supabase stack, then configure and run the backend.
1) Start infrastructure services
From the repository root:
```bash
cd services
docker compose up -d
```

This launches PostgreSQL, Redis, Qdrant, SeaweedFS, and (optionally) Prometheus/Tempo/Grafana.
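Before moving on, it can help to confirm the containers came up; these are standard Docker Compose commands, and the service name in the log example is a placeholder for whatever services/docker-compose.yml actually defines:

```bash
# Run from the services/ directory.
docker compose ps                 # every service should report "running" (or "healthy")
docker compose logs -f <service>  # tail one service's logs if something looks off
```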
2) Configure SeaweedFS S3
Create config/seaweedfs/s3_config.json from s3_config.json.example and set the access key and secret key. The same values will be reused in credentials.env.
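For reference, here is a minimal sketch using SeaweedFS's identity-based S3 config format; the identity name is arbitrary and the placeholder keys should match whatever you later put in credentials.env (defer to s3_config.json.example if it differs):

```bash
cat > config/seaweedfs/s3_config.json <<'EOF'
{
  "identities": [
    {
      "name": "local-dev",
      "credentials": [
        { "accessKey": "your_s3_access_key_id", "secretKey": "your_s3_secret_access_key" }
      ],
      "actions": ["Admin", "Read", "Write", "List"]
    }
  ]
}
EOF
```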
3) Supabase (local)
From the repo root:
```bash
supabase start
supabase status
```

Copy the anon and service role keys from the status output.
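If you want the keys non-interactively, recent Supabase CLI versions can print the status in env format (the variable names may differ by CLI version; if the flag is missing, copy the keys from the table that supabase status prints):

```bash
supabase status -o env | grep -E 'ANON_KEY|SERVICE_ROLE_KEY'
```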
Then run the Edge Functions runtime:

```bash
supabase functions serve
```

4) Backend environment
Create credentials.env at the repo root (copy credentials.env.example) and fill in the variables as described on the Configuration page. Minimal local values:
```env
ADA_DB_URL=postgresql://postgres:ada_password@localhost:5432/ada_backend
INGESTION_DB_URL=postgresql://postgres:ada_password@localhost:5432/ada_ingestion
TRACES_DB_URL=postgresql://postgres:ada_password@localhost:5432/ada_traces
QDRANT_CLUSTER_URL=http://localhost:6333
QDRANT_API_KEY=secret_api_key
S3_ENDPOINT_URL=http://localhost:8333
S3_ACCESS_KEY_ID=your_s3_access_key_id
S3_SECRET_ACCESS_KEY=your_s3_secret_access_key
S3_BUCKET_NAME=s3-backend
S3_REGION_NAME=us-east-1
SUPABASE_PROJECT_URL=http://localhost:54321
SUPABASE_PROJECT_KEY=your_anon_key
SUPABASE_SERVICE_ROLE_SECRET_KEY=your_service_role_key
SUPABASE_USERNAME=you@example.com
SUPABASE_PASSWORD=your_password
SUPABASE_BUCKET_NAME=ada-backend
ADA_URL=http://localhost:8000
```

Generate secrets:
```bash
uv venv && source .venv/bin/activate
uv sync
uv run python -c "import secrets; print(secrets.token_hex(32))"  # BACKEND_SECRET_KEY
uv run python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"  # FERNET_KEY
uv run python -c "from ada_backend.services.api_key_service import _generate_api_key, _hash_key; key=_generate_api_key(); print('INGESTION_API_KEY=',key); print('INGESTION_API_KEY_HASHED=',_hash_key(key))"
```
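Paste the printed values into credentials.env. If you prefer to append them in one pass, here is a small sketch; it assumes the variable names from the comments above and an already-activated virtualenv:

```bash
# Append the generated secrets to credentials.env (sketch; variable names follow the comments above).
{
  echo "BACKEND_SECRET_KEY=$(uv run python -c 'import secrets; print(secrets.token_hex(32))')"
  echo "FERNET_KEY=$(uv run python -c 'from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())')"
  # String concatenation ('+') avoids the extra space that print('NAME=', value) would emit.
  uv run python -c "from ada_backend.services.api_key_service import _generate_api_key, _hash_key; key=_generate_api_key(); print('INGESTION_API_KEY='+key); print('INGESTION_API_KEY_HASHED='+_hash_key(key))"
} >> credentials.env
```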
5) Initialize databases

```bash
make db-upgrade
make db-seed
make trace-db-upgrade
```
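To sanity-check the migrations, you can list the tables with psql; the connection strings below simply reuse the local defaults from credentials.env above:

```bash
psql postgresql://postgres:ada_password@localhost:5432/ada_backend -c '\dt'
psql postgresql://postgres:ada_password@localhost:5432/ada_traces -c '\dt'
```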
6) Run backend and worker

```bash
make run-draftnrun-agents-backend
# in another terminal
uv run python -m ada_ingestion_system.worker.main
```

Open:
- Swagger: http://localhost:8000/docs
- Admin: http://localhost:8000/admin
- Grafana (if enabled): http://localhost:3000
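Before opening a browser, you can confirm the backend is serving with a quick request against the Swagger URL listed above:

```bash
curl -sSf -o /dev/null http://localhost:8000/docs && echo "backend is up"
```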
Ready to build production AI?
Start with the Quick Start guide or explore the API. Join the community and help shape the open-source platform for AI agents.