93 changes: 93 additions & 0 deletions .github/workflows/studio-e2e-test.yml
@@ -0,0 +1,93 @@
name: Studio E2E Tests
on:
  push:
    branches: [master]
    paths:
      - 'packages/pg-meta/**/*'
      - 'apps/studio/**'
      - 'e2e/studio/**'
      - 'pnpm-lock.yaml'
  pull_request:
    paths:
      - 'packages/pg-meta/**/*'
      - 'apps/studio/**'
      - 'e2e/studio/**'
      - 'pnpm-lock.yaml'
      - '.github/workflows/studio-e2e-test.yml'

# Cancel old builds on new commit for same workflow + branch/PR
concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

permissions:
  contents: write

jobs:
  test:
    timeout-minutes: 60
    runs-on: ubuntu-latest
    # Make the job non-blocking
    continue-on-error: true
    # Require approval only for external contributors
    environment: ${{ github.event.pull_request.author_association != 'MEMBER' && 'Studio E2E Tests' || '' }}

    env:
      EMAIL: ${{ secrets.CI_EMAIL }}
      PASSWORD: ${{ secrets.CI_PASSWORD }}
      PROJECT_REF: ${{ secrets.CI_PROJECT_REF }}
      NEXT_PUBLIC_IS_PLATFORM: true
      NEXT_PUBLIC_API_URL: https://api.supabase.green
      VERCEL_ORG_ID: ${{ secrets.VERCEL_ORG_ID }}
      VERCEL_PROJECT_ID: ${{ secrets.VERCEL_STUDIO_HOSTED_PROJECT_ID }}
      NEXT_PUBLIC_HCAPTCHA_SITE_KEY: 10000000-ffff-ffff-ffff-000000000001

    steps:
      - uses: actions/checkout@v4
      - uses: pnpm/action-setup@v4
        name: Install pnpm
        with:
          run_install: false
      - name: Use Node.js
        uses: actions/setup-node@v4
        with:
          node-version-file: '.nvmrc'
          cache: 'pnpm'

      - name: Install dependencies
        run: pnpm i

      - name: Install Vercel CLI
        run: pnpm add --global vercel@latest

      - name: Pull Vercel Environment Information (Preview)
        run: vercel pull --yes --environment=preview --token=${{ secrets.VERCEL_TOKEN }}

      - name: Build Project Artifacts for Vercel
        run: vercel build --token=${{ secrets.VERCEL_TOKEN }}

      - name: Deploy Project to Vercel and Get URL
        id: deploy_vercel
        run: |
          DEPLOY_URL=$(vercel deploy --prebuilt --token=${{ secrets.VERCEL_TOKEN }})
          echo "Vercel Preview URL: $DEPLOY_URL"
          echo "DEPLOY_URL=$DEPLOY_URL" >> $GITHUB_OUTPUT

      - name: Install Playwright Browsers
        run: pnpm -C e2e/studio exec playwright install --with-deps

      - name: Run Playwright tests
        id: playwright
        env:
          AUTHENTICATION: true
          STUDIO_URL: ${{ steps.deploy_vercel.outputs.DEPLOY_URL }}/dashboard
        run: pnpm e2e

      - uses: actions/upload-artifact@v4
        if: always()
        with:
          name: playwright-artifacts
          path: |
            e2e/studio/playwright-report/
            e2e/studio/test-results/
          retention-days: 7
19 changes: 9 additions & 10 deletions apps/docs/content/guides/auth/quickstarts/react.mdx
@@ -27,22 +27,21 @@ hideToc: true
</StepHikeCompact.Step>

<StepHikeCompact.Step step={2}>
<StepHikeCompact.Details title="Create a React app">

<StepHikeCompact.Details title="Create a React app">
Create a React app using [Vite](https://vitejs.dev/).

Create a React app using the `create-react-app` command.
</StepHikeCompact.Details>

</StepHikeCompact.Details>
<StepHikeCompact.Code>

<StepHikeCompact.Code>
```bash name=Terminal
npm create vite@latest my-app -- --template react

```bash name=Terminal
npx create-react-app my-app
```
```

</StepHikeCompact.Code>

</StepHikeCompact.Step>
</StepHikeCompact.Code>
</StepHikeCompact.Step>

<StepHikeCompact.Step step={3}>
<StepHikeCompact.Details title="Install the Supabase client library">
2 changes: 1 addition & 1 deletion apps/docs/content/guides/realtime/postgres-changes.mdx
@@ -1570,7 +1570,7 @@ supabase
schema: 'public',
table: 'colors',
filter: PostgresChangeFilter(
type: PostgresChangeFilterType.lte,
type: PostgresChangeFilterType.inFilter,
column: 'name',
value: ['red', 'blue', 'yellow'],
),
1 change: 1 addition & 0 deletions apps/docs/public/humans.txt
@@ -64,6 +64,7 @@ Kamil Ogórek
Kang Ming Tay
Karan S
Karlo Ison
Katerina Skroumpelou
Kevin Brolly
Kevin Grüneberg
Lakshan Perera
@@ -0,0 +1,153 @@
---
title: 'Persistent Storage and 97% Faster Cold Starts for Edge Functions'
description: 'Mount S3-compatible buckets as persistent file storage in Edge Functions with up to 97% faster cold start times.'
categories:
- product
- launch-week
- edge-functions
tags:
- launch-week
- edge-functions
- storage
date: '2025-07-18T00:00:00'
toc_depth: 3
author: laktek,nyannyacha
image: launch-week-15/day-5-persistent-storage-for-functions/og.jpg
thumb: launch-week-15/day-5-persistent-storage-for-functions/thumb.png
launchweek: '15'
---

Today, we are introducing Persistent Storage and up to 97% faster cold start times for Edge Functions. Previously, Edge Functions only supported ephemeral file storage by writing to the `/tmp` directory. Many common libraries for tasks such as zipping/unzipping files and image transformations are built to work with a persistent file system, so making them work with Edge Functions required extra steps.

The persistent storage option is built on top of the S3 protocol. It allows you to mount any [S3-compatible bucket](https://supabase.com/docs/guides/storage/s3/compatibility), including [Supabase Storage Buckets](https://supabase.com/docs/guides/storage), as a directory for your Edge Functions. You can perform operations such as reading and writing files in the mounted buckets as you would on a POSIX file system.

```tsx
// read from S3 bucket
const data = await Deno.readFile('/s3/my-bucket/results.csv')

// make a directory
await Deno.mkdir('/s3/my-bucket/sub-dir')

// write to S3 bucket
await Deno.writeTextFile('/s3/my-bucket/demo.txt', 'hello world')
```

<div className="video-container mb-8">
  <iframe
    className="w-full"
    src="https://www.youtube-nocookie.com/embed/h3mQrDC4g14"
    title="Persistent Storage and Faster Boot Times for Edge Functions"
    allow="accelerometer; autoplay; clipboard-write; encrypted-media; fullscreen; gyroscope; picture-in-picture; web-share"
    allowFullScreen
  />
</div>

## How to configure

To access an S3 bucket from Edge Functions, set the following environment variables in Edge Function Secrets:

- `S3FS_ENDPOINT_URL`
- `S3FS_REGION`
- `S3FS_ACCESS_KEY_ID`
- `S3FS_SECRET_ACCESS_KEY`

If you are using Supabase Storage, [follow this guide](https://supabase.com/docs/guides/storage/s3/authentication) to enable the S3 protocol and create an access key ID and secret access key.
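
Once those secrets are set, the bucket is available under `/s3/<bucket-name>` inside the function, as in the examples above. Below is a minimal sketch to sanity-check the mount; the bucket name `my-bucket` is a placeholder for your own bucket.

```tsx
// Minimal sketch: list the contents of a mounted bucket to verify the
// S3FS_* secrets are picked up. "my-bucket" is a placeholder bucket name.
Deno.serve(async () => {
  const entries: string[] = []
  for await (const entry of Deno.readDir('/s3/my-bucket')) {
    entries.push(entry.name)
  }
  return new Response(JSON.stringify(entries), {
    headers: { 'Content-Type': 'application/json' },
  })
})
```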

## Use Case: SQLite in Edge Functions

The S3 File System simplifies workflows that involve reading and transforming data stored in an S3 bucket.

For example, imagine you are building an IoT app where a device backs up its SQLite database to S3. You can set up a scheduled Edge Function to read this data and then push the data to your primary Postgres database for aggregates and reporting.

```tsx
// The following example is simplified for readability

import { DB } from 'https://deno.land/x/sqlite/mod.ts'
import { supabase } from '../shared/client.ts'

const today = new Date().toISOString().split('T')[0]
const backupDBPath = `backups/backup-${today}.db`

// Use the S3 file system to read the SQLite DB
const data = Deno.readFileSync(`/s3/${backupDBPath}`)

// Create an in-memory SQLite DB from the data downloaded from S3
// This is faster than reading directly from S3
const db = new DB()
db.deserialize(data)

function calculateStats(rows: IoTData[], date: string): StatsSummary {
  // ....
}

Deno.serve(async (req) => {
  // Assuming IoT data is stored in a table called 'sensor_data'
  const rows = db.queryEntries<IoTData>(`
    SELECT * FROM sensor_data
    WHERE date(timestamp) = date('now', 'localtime')
  `)

  // Calculate statistics
  const stats = calculateStats(rows, today)

  // Insert stats into Supabase
  const { data, error } = await supabase
    .from('iot_daily_stats')
    .insert([stats])

  return new Response('OK')
})
```

## 97% Faster Function Boot Times, Even Under Load

Previously, Edge Functions with large dependencies or doing preparation work at the start (e.g., parsing/loading configs, initializing AI models) would incur a noticeable boot delay. Sometimes these slow neighbors could impact other functions running on the same machine. All JavaScript _workers_ in the Supabase Edge Functions Runtime were cooperatively scheduled on the same [**Tokio thread pool**](https://github.com/tokio-rs/tokio). If one worker had heavy startup logic, such as parsing JavaScript modules or running synchronous operations, it could delay every worker scheduled after it. This led to occasional long-tail latency spikes in high-traffic projects.

To address this issue, we moved workers that are still performing initial script evaluation onto a dedicated blocking pool. This prevents heavy initialization tasks from blocking the shared Tokio threads, significantly reducing boot time spikes for other functions.
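
For illustration, here is a hypothetical sketch of the kind of module-scope work that counts as initial script evaluation in an Edge Function. The `APP_CONFIG` secret and the `iot_daily_stats` query are placeholders, not a prescribed pattern.

```tsx
// Hypothetical example: everything at module scope runs during initial
// script evaluation, i.e. before the function serves its first request.
import { createClient } from 'npm:@supabase/supabase-js@2'

// Parsing configuration and creating a client here is boot-time work.
// APP_CONFIG is a placeholder secret; SUPABASE_URL and SUPABASE_ANON_KEY
// are injected into Edge Functions automatically.
const config = JSON.parse(Deno.env.get('APP_CONFIG') ?? '{}')
const supabase = createClient(
  Deno.env.get('SUPABASE_URL')!,
  Deno.env.get('SUPABASE_ANON_KEY')!
)

Deno.serve(async () => {
  // The handler reuses the client created at boot.
  const { count } = await supabase
    .from('iot_daily_stats')
    .select('*', { count: 'exact', head: true })
  return new Response(JSON.stringify({ config, rows: count }))
})
```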

### The result

Boot times are now more predictable, and cold starts are much faster. Here are the results of a [benchmark](https://github.com/supabase/edge-runtime/blob/develop/k6/specs/mixed.ts) we ran to compare boot times before and after these changes.

| Metric           | Before     | After     | Delta     |
| ---------------- | ---------- | --------- | --------- |
| **Avg**          | **870 ms** | **42 ms** | **95%**   |
| **P95**          | 8,502 ms   | 86 ms     | **99%**   |
| **P99**          | 15,069 ms  | 460 ms    | **97%**   |
| **Worst**        | 24,300 ms  | 1,630 ms  | **93%**   |
| **Spikes > 1 s** | 47%        | 4%        | **43 pp** |

## Support for Synchronous APIs

By offloading expensive compute at function boot onto a separate pool, we were able to enable the use of synchronous file APIs during initial script evaluation. Some libraries only support synchronous file APIs (e.g., SQLite), so this lets you set them up in an Edge Function before it starts processing requests.

You can now safely use the following synchronous Deno APIs (and their Node counterparts) _during_ initial script evaluation:

- `Deno.statSync`
- `Deno.removeSync`
- `Deno.writeFileSync`
- `Deno.writeTextFileSync`
- `Deno.readFileSync`
- `Deno.readTextFileSync`
- `Deno.mkdirSync`
- `Deno.makeTempDirSync`
- `Deno.readDirSync`

**Keep in mind** that the sync APIs are available only during initial script evaluation and aren't supported in callbacks such as HTTP handlers or `setTimeout`.

```tsx
Deno.statSync('...') // ✅

setTimeout(() => {
  Deno.statSync('...') // 💣 ERROR! Deno.statSync is blocklisted on the current context
})

Deno.serve(() => {
  Deno.statSync('...') // 💣 ERROR! Deno.statSync is blocklisted on the current context
})
```
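
As a positive counterpart, here is a minimal sketch that loads a SQLite database synchronously at boot and only starts serving once it is ready. It reuses the SQLite module and the `/s3/my-bucket` mount from the earlier examples; the backup path is a placeholder.

```tsx
import { DB } from 'https://deno.land/x/sqlite/mod.ts'

// Synchronous read at module scope: this runs during initial script
// evaluation, where the sync APIs listed above are allowed.
// "backups/latest.db" is a placeholder path.
const snapshot = Deno.readFileSync('/s3/my-bucket/backups/latest.db')

// Load the snapshot into an in-memory SQLite database before serving.
const db = new DB()
db.deserialize(snapshot)

Deno.serve(() => {
  // Inside the handler we only query the already-initialized database.
  const [[count]] = db.query('SELECT count(*) FROM sensor_data')
  return new Response(`rows: ${count}`)
})
```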

## Try it on Preview Today

These changes will be rolled out along with the Deno 2 upgrade to all clusters within the next two weeks. In the meantime, you can use the Preview cluster if you'd like to try them out today. See [this guide](https://github.com/orgs/supabase/discussions/36814) for how to test your functions in the Preview cluster.
7 changes: 6 additions & 1 deletion apps/www/components/Hero/Hero.tsx
@@ -19,7 +19,12 @@ const Hero = () => {
<AnnouncementBadge
url="/launch-week#main-stage"
badge="LW15"
announcement={`Day 4: ${announcement.launch}`}
announcement={
<>
<span className="hidden md:inline">Day 5: </span>
{announcement.launch}
</>
}
className="lg:-mt-8 mb-4 lg:mb-0"
hasArrow
/>
2 changes: 1 addition & 1 deletion apps/www/components/LaunchWeek/15/LW15MainStage.tsx
@@ -198,7 +198,7 @@ const CardsSlider: React.FC<Props> = ({
ref={swiperRef}
onSwiper={setControlledSwiper}
modules={[Controller, Navigation, A11y]}
initialSlide={3}
initialSlide={4}
spaceBetween={8}
slidesPerView={1.5}
breakpoints={{