ARCHIVED 2026-03-24 — Migration completed. See `operations/infrastructure-setup-plan.md`, Phase D, for results.

# Baserow Cloud → Self-Hosted Migration Continuation Prompt
Purpose: Copy-paste this into a new Claude Code session to migrate data from Baserow Cloud to the self-hosted instance and update all scripts.
## The Prompt
We're continuing the Solanasis infrastructure work. The tunnel and API setup are complete — now we need to migrate data from Baserow Cloud to our self-hosted instance and update scripts to use the new instance.
## FIRST: Read the planning docs
Before doing ANYTHING, read these files to get full context:
1. `_solanasis/solanasis-docs/operations/infrastructure-setup-plan.md` — Completed infra plan
2. `_solanasis/solanasis-docs/operations/service-inventory.md` — Service inventory with secrets
3. `_solanasis/solanasis-docs/operations/credential-setup-guide.md` — Credential guide (includes Baserow API section)
4. `_solanasis/solanasis-scripts/.env` — Current env vars (has self-hosted Baserow creds)
5. `_solanasis/solanasis-scripts/BASEROW-MIGRATION-PLAN.md` — Original cloud migration plan (reference for table structures)
## CONTEXT: What's already done
### Infrastructure (2026-03-22 to 2026-03-24)
- ERPNext at https://erp.solanasis.com — running, API verified
- Baserow at https://baserow.solanasis.com — running, admin account created (ds@solanasis.com)
- Cloudflare Tunnel `solanasis` (ID: c57c4d9c-e408-4e43-b0d5-190baa70568e) active
- cloudflared updated to 2026.3.0
- All old tunnels deleted
### Baserow API Access (2026-03-24)
- DB Token `claude-bot` created with full CRUD permissions
- `.env` updated: BASEROW_BASE_URL=https://baserow.solanasis.com, BASEROW_DB_TOKEN, BASEROW_EMAIL=ds@solanasis.com, BASEROW_DATABASE_ID=54
- GitHub Secrets updated on solanasis-scripts: BASEROW_BASE_URL, BASEROW_DB_TOKEN, BASEROW_DATABASE_ID, BASEROW_EMAIL
- Self-hosted Baserow workspace ID: 45, database ID: 54
- Currently has 1 placeholder table ("Dmitri Sunshine's table", ID: 263)
### API access pattern (IMPORTANT — localhost requires Host header)
- From server: `http://localhost:8880/api/...` with `Host: baserow.solanasis.com` header
- Cloudflare blocks automated requests through the tunnel URL (error 1010)
- DB Token auth: `Authorization: Token {BASEROW_DB_TOKEN}` (row CRUD only)
- JWT auth: POST to `/api/user/token-auth/` with email/password (for schema operations)
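The access pattern above can be sketched as a small helper. This is a minimal sketch, not code from the repo: the function name `request_parts` is invented for illustration, and the env var name matches the `.env` keys listed in this doc. No network call is made here; it only builds the URL and headers.

```python
import os

LOCAL_BASE = "http://localhost:8880"           # Baserow behind the tunnel, on-server
PUBLIC_BASE = "https://baserow.solanasis.com"  # tunnel URL (Cloudflare may block bots)

def request_parts(path: str, on_server: bool) -> tuple[str, dict]:
    """Build (url, headers) for a Baserow API call.

    On the server we hit localhost:8880 and set the Host header so
    Baserow routes the request correctly; elsewhere we go through the
    tunnel URL. DB-token auth (row CRUD only) uses the `Token` scheme.
    """
    base = LOCAL_BASE if on_server else PUBLIC_BASE
    headers = {"Authorization": f"Token {os.environ['BASEROW_DB_TOKEN']}"}
    if on_server:
        headers["Host"] = "baserow.solanasis.com"
    return base + path, headers
```

Schema operations would instead obtain a JWT from `/api/user/token-auth/` and send `Authorization: JWT {token}`.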
## WHAT NEEDS TO BE DONE
### Task 1: Update scripts to use BASEROW_BASE_URL
These files hardcode `https://api.baserow.io` and need to read from `BASEROW_BASE_URL` env var instead:
**Python scripts (use `os.environ.get("BASEROW_BASE_URL", "https://baserow.solanasis.com")`):**
- `_solanasis/solanasis-scripts/fcto-pipeline/migrate_to_baserow.py` (~line 64)
- `_solanasis/solanasis-scripts/msp-pipeline/migrate_to_baserow.py` (~line 52)
- `_solanasis/solanasis-scripts/foundation-pipeline/migrate_to_baserow.py` (~line 48)
- `_solanasis/solanasis-scripts/foundation-pipeline/enrich_emails.py`
- `_solanasis/solanasis-scripts/foundation-pipeline/add_outreach_columns.py`
- `_solanasis/solanasis-scripts/fcto-pipeline/regenerate_mailto.py`
- `_solanasis/solanasis-scripts/msp-pipeline/regenerate_mailto.py`
**TypeScript client (already supports custom baseUrl param, but ensure env fallback):**
- `_solanasis/solanasis-scripts/baserow/src/lib/baserow-client.ts` (~line 24)
- `_solanasis/solanasis-scripts/baserow/import-contacts.ts` (~line 34)
**IMPORTANT:** When scripts run on the server (localhost), they should use `http://localhost:8880` with `Host: baserow.solanasis.com` header for reliability. When running elsewhere, use `https://baserow.solanasis.com`. Consider adding a `BASEROW_HOST_HEADER` env var or auto-detecting localhost.
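A sketch of the drop-in replacement for the hardcoded `https://api.baserow.io` constant, assuming the proposed `BASEROW_HOST_HEADER` env var (not yet in `.env`) is used for on-server runs:

```python
import os

# Read the base URL from the environment, falling back to the
# self-hosted instance rather than the old Baserow Cloud URL.
BASE_URL = os.environ.get("BASEROW_BASE_URL", "https://baserow.solanasis.com")

# Proposed BASEROW_HOST_HEADER override: set it to baserow.solanasis.com
# when BASEROW_BASE_URL points at http://localhost:8880 on the server.
EXTRA_HEADERS: dict = {}
_host = os.environ.get("BASEROW_HOST_HEADER")
if _host:
    EXTRA_HEADERS["Host"] = _host
```

Each script would then merge `EXTRA_HEADERS` into its existing request headers instead of hardcoding the host.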
### Task 2: Migrate data from Baserow Cloud
The Baserow Cloud instance has these tables that need to be exported and re-imported:
**Personal CRM database (cloud ID: 387807):**
| Table | Cloud ID | Rows | Fields |
|---|---|---|---|
| Tag | — | 51 | 2 (Name, Notes) |
| Location | — | 33 | 3 (Name, State, Notes) |
| Organization | — | 66 | 9 (includes link_row to Tag, Location) |
| People | — | 160 | 19 (includes link_row to Tag, Location, Organization) |
| Meeting Notes | — | 37 | 5 (includes link_row to People) |
**Pipeline tables (may be in separate databases):**
| Table | Rows | Notes |
|---|---|---|
| fCTO Partners | ~50+ | 24 fields, tier scoring |
| MSP Prospects | ~100+ | Geographic + RIA density scoring |
| Foundation Prospects | ~200+ | Tax ID, financials, NTEE codes |
**Migration approach options:**
- **Option A (Recommended):** Export from cloud as CSV/JSON, create schema on self-hosted via JWT API, import data. The migration scripts already handle schema creation — they just need the URL updated.
- **Option B:** Re-run the original migration scripts against the self-hosted instance (they read from CSV files in CSV_SOURCE_DIR). Only works for tables that were originally migrated from CSV.
- **Option C:** Use Baserow's built-in export/import if available in self-hosted.
**Key challenge:** `link_row` fields (foreign keys between tables) use row IDs. When data is re-imported, row IDs change, so link relationships break. The migration scripts handle this by creating lookup tables and linking by name — re-running them should work.
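The name-based relinking can be sketched as follows. This is an illustrative helper, not code from the migration scripts: the field names and the assumed export shape of link_row values (`[{"id": ..., "value": ...}]`, as Baserow's row API returns them) are assumptions.

```python
def remap_links(rows: list[dict], link_field: str, name_to_new_id: dict) -> list[dict]:
    """Replace old link_row IDs with new ones, matching by name.

    `rows` are exported row dicts whose link_row values look like
    [{"id": 12, "value": "Acme"}, ...]; `name_to_new_id` maps the linked
    table's name field to its freshly created row ID on self-hosted.
    Links whose name is missing from the lookup are dropped.
    """
    remapped = []
    for row in rows:
        row = dict(row)  # don't mutate the caller's data
        links = row.get(link_field) or []
        row[link_field] = [
            name_to_new_id[link["value"]]
            for link in links
            if link["value"] in name_to_new_id
        ]
        remapped.append(row)
    return remapped
```

Order matters: import the leaf tables (Tag, Location) first, build their name→ID lookups, then import Organization, People, and Meeting Notes with remapped links.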
### Task 3: Verify and clean up
After migration:
1. Verify row counts match between cloud and self-hosted
2. Verify link_row relationships are intact
3. Delete the placeholder "Dmitri Sunshine's table" (ID: 263)
4. Rename workspace/database if desired ("Dmitri Sunshine's workspace" → "Solanasis")
5. Update any remaining references to Baserow Cloud in docs
6. Commit all script changes and push
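Step 1 above can be sketched as a comparison helper. The counts themselves are assumed to have been fetched already (e.g. from the paginated list-rows endpoint, which includes a total count); this only does the diff:

```python
def count_mismatches(cloud_counts: dict, selfhosted_counts: dict) -> dict:
    """Return {table_name: (cloud_rows, selfhosted_rows)} for mismatches.

    A table missing entirely from the self-hosted side shows up with
    None as its self-hosted count.
    """
    bad = {}
    for name, cloud_n in cloud_counts.items():
        local_n = selfhosted_counts.get(name)
        if local_n != cloud_n:
            bad[name] = (cloud_n, local_n)
    return bad
```

An empty result means every migrated table matches the expected row counts (51 Tag, 33 Location, 66 Organization, 160 People, 37 Meeting Notes, plus the pipeline tables).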
### Task 4 (Optional): Cloudflare WAF exception for API
If scripts need to call through the tunnel URL instead of localhost:
- Add a Cloudflare WAF rule to skip bot protection for `/api/*` paths on `baserow.solanasis.com`
- Or use a custom header + WAF bypass rule
- This is only needed if scripts run from machines OTHER than the server
## APPROACH
1. Read all planning docs first
2. Update Python/TypeScript scripts to use BASEROW_BASE_URL
3. Verify updated scripts can connect to self-hosted instance
4. Plan the data migration (determine which option for each table set)
5. Execute migration
6. Verify data integrity
7. Clean up and commit
## KEY FILE PATHS
- Scripts .env: `_solanasis/solanasis-scripts/.env`
- TypeScript client: `_solanasis/solanasis-scripts/baserow/src/lib/baserow-client.ts`
- Python migration scripts: `_solanasis/solanasis-scripts/{fcto,msp,foundation}-pipeline/migrate_to_baserow.py`
- Migration plan reference: `_solanasis/solanasis-scripts/BASEROW-MIGRATION-PLAN.md`
- Operations docs: `_solanasis/solanasis-docs/operations/`
## GITHUB REPOS (all under dzinreach, all private)
solanasis-scripts (main repo for this task), solanasis-docs