This document provides the step-by-step technical implementation for exporting a Supabase project from one account and deploying it to a new Supabase account.
- Install the latest Supabase CLI:

  ```bash
  npm install -g supabase
  # or with Homebrew on macOS:
  brew install supabase/tap/supabase
  ```

- Log in to both Supabase accounts (old and new) in your terminal:

  ```bash
  supabase login  # with the old account's access token first
  # Switch accounts later with another `supabase login` when needed
  ```

- Clone your existing Next.js repo (the one that already has the frontend + possibly partial Supabase config):

  ```bash
  git clone <your-repo-url> my-project
  cd my-project
  ```
This pulls the schema (tables, RLS policies, functions, triggers) and any Edge Functions, and puts everything under version control.
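After the pull steps below, the repo ends up with a layout roughly like this (the function name is illustrative; `config.toml`, `migrations/`, and `functions/` are the standard outputs of `supabase init` and `supabase db pull`):

```
supabase/
├── config.toml
├── migrations/
│   └── <timestamp>_remote_schema.sql
└── functions/
    └── <function-name>/
        └── index.ts
```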
```bash
# 1. Initialize the Supabase folder structure if it doesn't exist yet
supabase init

# 2. Link to the OLD project (the project ref is in the dashboard URL:
#    https://app.supabase.com/project/<PROJECT_REF>)
supabase link --project-ref <OLD_PROJECT_REF>

# 3. Pull the full remote schema (including auth/storage changes, if any)
supabase db pull
# This creates a migration file like supabase/migrations/<timestamp>_remote_schema.sql

# 4. Pull Edge Functions (if you have any in supabase/functions/)
supabase functions download --project-ref <OLD_PROJECT_REF>
# (some CLI versions require a function name: supabase functions download <name>)

# 5. Pull secrets / env vars (e.g. third-party API keys, SMTP, etc.)
supabase secrets list --project-ref <OLD_PROJECT_REF> > old_secrets.txt
# Manually copy the ones you need and set them on the new project later
```

Commit everything:

```bash
git add supabase/
git commit -m "chore: pull full Supabase schema and functions from old project"
git push
```

Next, dump the data. Get the old project's DB connection string from Dashboard > Settings > Database > Connection string (click "URI"), or use the password version.

```bash
# Dump the data (Supabase-internal schemas are excluded by default,
# so auth/storage rows may need an explicit --schema flag)
supabase db dump --linked -f full_dump_with_data.sql --data-only --use-copy

# Optionally, a schema dump as a backup alongside the data dump:
supabase db dump --linked -f full_dump.sql
```

These files can be huge; keep them out of git (add them to .gitignore).
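That .gitignore step can be sketched as follows (the filenames match the dump commands above):

```bash
# Append the dump filenames to .gitignore
printf '%s\n' 'full_dump.sql' 'full_dump_with_data.sql' >> .gitignore

# Sanity-check: both entries should now be listed
grep -x 'full_dump.sql' .gitignore
grep -x 'full_dump_with_data.sql' .gitignore
```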
Storage objects are not included in the database dump, and Supabase does not have a CLI command for copying them yet, so use a small script built on the official JS client (run from your repo root):

```bash
npm install @supabase/supabase-js
```

Create `migrate-storage.js`:
```js
// migrate-storage.js
const { createClient } = require('@supabase/supabase-js');

const OLD_URL = 'https://<OLD_PROJECT_REF>.supabase.co';
const OLD_SERVICE_KEY = '<old-service-role-key>'; // Dashboard > Settings > API > service_role
const NEW_URL = 'https://<NEW_PROJECT_REF>.supabase.co'; // you can create the new project first
const NEW_SERVICE_KEY = '<new-service-role-key>';

const oldClient = createClient(OLD_URL, OLD_SERVICE_KEY);
const newClient = createClient(NEW_URL, NEW_SERVICE_KEY);

async function migrateStorage() {
  const { data: buckets, error } = await oldClient.storage.listBuckets();
  if (error) throw error;

  for (const bucket of buckets) {
    // Recreate the bucket; ignore "already exists" so the script is re-runnable
    const { error: createError } = await newClient.storage.createBucket(bucket.name, {
      public: bucket.public,
    });
    if (createError && !createError.message.includes('already exists')) throw createError;

    // Recursively walk the bucket. list() returns at most `limit` entries per
    // call, so add offset paging if any folder holds more than 1000 objects.
    async function downloadAndUpload(path = '') {
      const { data: items } = await oldClient.storage.from(bucket.name).list(path, { limit: 1000 });
      for (const item of items) {
        const fullPath = path ? `${path}/${item.name}` : item.name;
        if (item.id === null) {
          await downloadAndUpload(fullPath); // no id means it's a folder
        } else {
          const { data: file } = await oldClient.storage.from(bucket.name).download(fullPath);
          await newClient.storage.from(bucket.name).upload(fullPath, file, { upsert: true });
          console.log(`Copied ${fullPath}`);
        }
      }
    }

    await downloadAndUpload();
  }
  console.log('Storage migration complete!');
}

migrateStorage().catch(console.error);
```

Run it after you create the new project:

```bash
node migrate-storage.js
```

Next, switch the CLI to your new/personal account if needed:
```bash
supabase login  # use your access token
```

Create the new project in the dashboard (or with `supabase projects create`, if your CLI version supports it): go to https://app.supabase.com → New project → note the new PROJECT_REF. Then link your repo to it:
```bash
supabase link --project-ref <NEW_PROJECT_REF>
```

Push the schema and functions, re-set secrets, and restore the data:

```bash
# Push schema + functions
supabase db push  # applies all migrations (including the big remote_schema one)
supabase functions deploy --project-ref <NEW_PROJECT_REF>  # if you have functions

# Re-set secrets on the new project
supabase secrets set KEY1=value1 KEY2=value2 --project-ref <NEW_PROJECT_REF>
# Or load them from a KEY=value file: supabase secrets set --env-file ./supabase/.env
# (copy the values from the old_secrets.txt you exported earlier)

# Restore the data
supabase db reset --linked  # wipes the new DB and reapplies migrations (safe on a fresh project)
psql "<NEW_DB_CONNECTION_STRING>" < full_dump_with_data.sql
# (get the connection string from Dashboard > Settings > Database > Connection string)

# Or, if you dumped into a single full_dump.sql instead:
psql "<NEW_DB_CONNECTION_STRING>" < full_dump.sql
```
Then deploy the frontend to Vercel:

```bash
# Install the Vercel CLI if you don't have it
npm i -g vercel

# Log in to Vercel (your account)
vercel login

# Create & deploy a fresh Vercel project (not importing the old one)
vercel  # it will ask questions; choose "Create a new project"
```

Update the env vars on Vercel (Dashboard > Project Settings > Environment Variables). Add:

```bash
NEXT_PUBLIC_SUPABASE_URL=https://<NEW_PROJECT_REF>.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=<new-anon-public-key>
# plus any other vars you had
```

Deploy:

```bash
git push origin main  # if you made changes
vercel --prod         # force a fresh deploy
```

Finally, run through this checklist:

- Update any hard-coded old project refs in code (search/replace)
- Test auth signup/login (auth users were migrated via data dump)
- Test storage uploads (buckets recreated by script, objects copied)
- Test Edge Functions
- Point your domain/custom domain to the new Vercel project
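The search/replace from the first checklist item can be sketched like this (the refs are placeholders; review the matches before rewriting in bulk):

```bash
OLD_REF='<OLD_PROJECT_REF>'   # substitute the real old project ref
NEW_REF='<NEW_PROJECT_REF>'

# List files that still reference the old project (skip .git)
grep -rl --exclude-dir=.git "$OLD_REF" . || echo "no references to $OLD_REF"

# After reviewing the matches, rewrite in place (GNU sed)
grep -rl --exclude-dir=.git "$OLD_REF" . | xargs -r sed -i "s/$OLD_REF/$NEW_REF/g"
```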
You now have a completely independent copy under your control, fully CLI-driven, with everything (schema, functions, secrets) version-controlled in git.