diff --git a/.claude/skills/teable-v2-devtools/SKILL.md b/.claude/skills/teable-v2-devtools/SKILL.md
new file mode 100644
index 0000000000..101aeff4ee
--- /dev/null
+++ b/.claude/skills/teable-v2-devtools/SKILL.md
@@ -0,0 +1,609 @@
+---
+name: teable-v2-devtools
+description: Teable v2 developer tools CLI for debugging, inspecting, and generating test data. Combines debug-data and mock-records capabilities into a unified CLI using Effect CLI framework.
+---
+
+# Teable V2 DevTools CLI
+
+## When to Use This Skill
+
+Use this skill when you need to:
+
+- View table/field configuration details
+- Diagnose formula/lookup/rollup issues
+- Understand field dependency relationships
+- Analyze computed field update plans (explain commands)
+- Generate mock/test data for tables
+- **Query records data** (via application layer or direct database access)
+- **Create, update, delete records** (via application layer commands)
+- **Check database schema** (indexes, constraints, columns) for missing or broken indexes
+- **Create tables** (via CLI, without records)
+
+> **Important**: When you need to inspect database data, **prefer DevTools CLI over psql**. DevTools outputs structured TOON format, which is easier for AI analysis and supports comparing application-layer and database-layer results.
+
+## Development Notes (CRITICAL)
+
+### Rebuild After Modifying Dependencies
+
+DevTools CLI depends on multiple v2 packages, and it uses **compiled dist outputs** rather than TypeScript sources. This is because it relies on parameter decorators (`@inject()`), which tsx/esbuild cannot run directly.
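Since only compiled output is consumed, the dependency packages can be rebuilt in one pass. The loop below is a convenience sketch, not a CLI feature; the package list is an assumption based on this skill's stated dependencies, and the loop only prints the commands, so pipe its output to `sh` to actually run the builds.

```shell
# Print one rebuild command per dist-consumed dependency (list is assumed
# from this skill's dependencies; adjust it if the dependency set changes).
for pkg in v2-adapter-table-repository-postgres v2-command-explain v2-debug-data v2-core; do
  echo "pnpm --filter @teable/$pkg build"
done
```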
+ +**If you modify any of the following packages, you must rebuild them before running the CLI:** + +```bash +# If you modify adapter-table-repository-postgres +pnpm --filter @teable/v2-adapter-table-repository-postgres build + +# If you modify command-explain +pnpm --filter @teable/v2-command-explain build + +# If you modify debug-data +pnpm --filter @teable/v2-debug-data build + +# If you modify core +pnpm --filter @teable/v2-core build +``` + +**Recommended: use watch mode for auto-rebuilds** + +```bash +# Start watch mode in a separate terminal +pnpm --filter @teable/v2-adapter-table-repository-postgres dev +``` + +**Common pitfalls:** + +- CLI output doesn’t change after code edits → forgot to rebuild +- console.log/console.error never prints → forgot to rebuild +- newly added types/functions missing → forgot to rebuild + +## Quick Commands + +All commands output TOON format for AI consumption. + +### Debug Commands + +```bash +# View underlying table metadata +pnpm --filter @teable/v2-devtools cli underlying table --table-id tbl... + +# List all tables in a base +pnpm --filter @teable/v2-devtools cli underlying tables --base-id bse... + +# View field configuration (diagnose formula issues) +pnpm --filter @teable/v2-devtools cli underlying field --field-id fld... + +# List all fields in a table +pnpm --filter @teable/v2-devtools cli underlying fields --table-id tbl... + +# View field dependencies (diagnose computed field propagation) +pnpm --filter @teable/v2-devtools cli relations --field-id fld... --direction up --level 2 + +# Explain CreateRecord (analyze computed update plan) +pnpm --filter @teable/v2-devtools cli explain create --table-id tbl... + +# Explain UpdateRecord +pnpm --filter @teable/v2-devtools cli explain update --table-id tbl... --record-id rec... --fields '{"Name":"test"}' + +# Explain DeleteRecords +pnpm --filter @teable/v2-devtools cli explain delete --table-id tbl... 
--record-ids rec1,rec2 +``` + +### Schema Check Commands + +Use these commands to verify database schema integrity, especially when you suspect missing indexes might be causing slow queries. + +```bash +# Check all fields in a table for missing indexes, constraints, columns +pnpm --filter @teable/v2-devtools cli schema table --table-id tbl... + +# Check a specific field for missing schema elements +pnpm --filter @teable/v2-devtools cli schema field --table-id tbl... --field-id fld... +``` + +### Records Query Commands + +```bash +# List records via application layer (stored mode - pre-computed values) +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --limit 100 --offset 0 + +# List records via application layer (computed mode - calculated on-the-fly) +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --mode computed + +# Get single record via application layer +pnpm --filter @teable/v2-devtools cli records get --table-id tbl... --record-id rec... + +# List records directly from underlying PostgreSQL table (raw data) +pnpm --filter @teable/v2-devtools cli underlying records --table-id tbl... --limit 100 + +# Get single record directly from underlying PostgreSQL table +pnpm --filter @teable/v2-devtools cli underlying record --table-id tbl... --record-id rec... +``` + +### Records Mutation Commands + +```bash +# Create a new record +pnpm --filter @teable/v2-devtools cli records create --table-id tbl... --fields '{"Name":"New Record"}' + +# Create a record with typecast (auto-convert values) +pnpm --filter @teable/v2-devtools cli records create --table-id tbl... --fields '{"Name":"Test"}' --typecast + +# Update an existing record +pnpm --filter @teable/v2-devtools cli records update --table-id tbl... --record-id rec... --fields '{"Name":"Updated Name"}' + +# Update with typecast +pnpm --filter @teable/v2-devtools cli records update --table-id tbl... --record-id rec... 
--fields '{"Status":"Done"}' --typecast
+
+# Delete records (comma-separated IDs)
+pnpm --filter @teable/v2-devtools cli records delete --table-id tbl... --record-ids rec1,rec2,rec3
+```
+
+### Mock Data Commands
+
+```bash
+# Generate 100 mock records
+pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 100
+
+# Generate with reproducible seed
+pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 50 --seed 12345
+
+# Dry run (preview without inserting)
+pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 10 --dry-run
+```
+
+### Table Management Commands
+
+```bash
+# Create a simple table with default fields (just a primary Name field)
+pnpm --filter @teable/v2-devtools cli tables create --base-id bse... --name "My Table"
+
+# Create a table with custom fields
+pnpm --filter @teable/v2-devtools cli tables create --base-id bse... --name "Tasks" --fields '[{"type":"singleLineText","name":"Title","isPrimary":true},{"type":"singleSelect","name":"Status","options":{"choices":[{"name":"Todo","color":"grayLight1"},{"name":"Done","color":"greenLight1"}]}},{"type":"date","name":"Due Date"}]'
+```
+
+## Command Reference
+
+### underlying Commands
+
+| Command | Description |
+| --- | --- |
+| `underlying table --table-id <tableId>` | Get raw table metadata |
+| `underlying tables --base-id <baseId>` | List all tables in a base |
+| `underlying field --field-id <fieldId>` | Get field metadata (includes parsed options/meta JSON) |
+| `underlying fields --table-id <tableId>` | List all fields in a table |
+| `underlying records --table-id <tableId>` | List records directly from PostgreSQL (raw data with system columns) |
+| `underlying record --table-id <tableId> --record-id <recordId>` | Get single record directly from PostgreSQL |
+
+### records Commands (Application Layer)
+
+| Command | Description |
+| --- | --- |
+| `records list --table-id <tableId>` | List records via query repository |
+| `records get --table-id <tableId> --record-id <recordId>` | Get single record via query repository |
+| `records create --table-id <tableId> --fields <json>` | Create a new record via command bus |
+| `records update --table-id <tableId> --record-id <recordId> --fields <json>` | Update an existing record via command bus |
+| `records delete --table-id <tableId> --record-ids <ids>` | Delete records via command bus |
+
+**Records Query Options:**
+| Option | Description |
+|--------|-------------|
+| `--table-id <tableId>` | Required: Table ID |
+| `--record-id <recordId>` | Required for get: Record ID |
+| `--limit <n>` | Max records to return (default: 100) |
+| `--offset <n>` | Records to skip (default: 0) |
+| `--mode stored\|computed` | Query mode (default: stored) |
+
+**Mode Explanation:**
+
+- `stored`: Read pre-computed values from the database (fast, uses cached values)
+- `computed`: Calculate field values on-the-fly (slower, always fresh)
+
+**Records Mutation Options:**
+| Option | Description |
+|--------|-------------|
+| `--table-id <tableId>` | Required: Table ID |
+| `--record-id <recordId>` | Required for update: Record ID |
+| `--record-ids <ids>` | Required for delete: Comma-separated record IDs |
+| `--fields <json>` | JSON object of field values (required for update, optional for create) |
+| `--typecast` | Enable typecast mode to auto-convert values (default: false) |
+
+**Typecast Mode:**
+
+When `--typecast` is enabled, the system will attempt to convert input values to the correct field types:
+
+- String "123" → Number 123
+- Link field titles → Link field record IDs
+- Date strings → Date objects
+
+### relations Command
+
+| Option | Description |
+| --- | --- |
+| `--field-id <fieldId>` | Required: Starting field ID |
+| `--direction up\|down\|both` | `up` = who depends on me, `down` = what I depend on (default: both) |
+| `--level <n>` | Max traversal depth (default: unlimited) |
+| `--same-table` | Only traverse same-table relations |
+
+### schema Commands
+
+Use these commands when analyzing slow queries or suspecting missing indexes.
+
+| Command | Description |
+| --- | --- |
+| `schema table --table-id <tableId>` | Check all fields in a table for schema issues |
+| `schema field --table-id <tableId> --field-id <fieldId>` | Check a specific field for schema issues |
+
+**Schema Check Output:**
+
+The output includes a summary with:
+
+- `total`: Total number of schema rules checked
+- `success`: Rules that passed validation
+- `errors`: Critical issues (missing indexes, columns, constraints)
+- `warnings`: Non-critical issues
+
+Each result item includes:
+
+- `fieldId`, `fieldName`: The field being checked
+- `ruleId`: Type of rule (e.g., `index`, `unique_index`, `fk_column`, `fk`)
+- `ruleDescription`: Human-readable description
+- `status`: `success`, `error`, or `warn`
+- `message`: Details about the issue
+- `details.missing`: List of missing schema objects (index names, column names, etc.)
+
+**Rule Types Checked:**
+| Rule Type | Description |
+|-----------|-------------|
+| `column` | Physical column exists |
+| `fk_column` | Foreign key column exists |
+| `index` | Non-unique index exists (for FK lookups) |
+| `unique_index` | Unique index exists (for one-to-one relations) |
+| `fk` | Foreign key constraint exists |
+| `junction_table` | Junction table exists (many-to-many) |
+| `junction_index` | Junction table indexes exist |
+| `junction_fk` | Junction table foreign keys exist |
+| `generated_column` | Generated column (auto-number, created_time, etc.) |
+
+### explain Commands
+
+| Command | Description |
+| --- | --- |
+| `explain create --table-id <tableId>` | Explain CreateRecord command |
+| `explain update --table-id <tableId> --record-id <recordId> --fields <json>` | Explain UpdateRecord command |
+| `explain delete --table-id <tableId> --record-ids <ids>` | Explain DeleteRecords command |
+
+**Explain Options:**
+| Option | Description |
+|--------|-------------|
+| `--table-id <tableId>` | Required: Table ID |
+| `--record-id <recordId>` | Required for update: Record ID |
+| `--record-ids <ids>` | Required for delete: Comma-separated record IDs |
+| `--fields <json>` | JSON object of field values (required for update, optional for create) |
+| `--analyze` | Run EXPLAIN ANALYZE for actual execution stats (default: false) |
+
+### mock Commands
+
+| Option | Description |
+| --- | --- |
+| `--table-id <tableId>` | Required: Table ID to generate records for |
+| `--count <n>` | Required: Number of records to generate |
+| `--seed <n>` | Optional: Seed for reproducible random data |
+| `--batch-size <n>` | Optional: Batch size for insertion (default: 100) |
+| `--dry-run` | Optional: Only show what would be generated, don't insert |
+
+**Supported Field Types for Mock Data:**
+
+| Field Type | Generated Data |
+| --- | --- |
+| SingleLineText | Names/emails/URLs/phones (based on showAs) |
+| LongText | Lorem ipsum paragraphs |
+| Number | Random floats 0-1000 |
+| Rating | Random integers 1 to max rating |
+| SingleSelect | Random selection from options |
+| MultipleSelect | 1-3 random options |
+| Checkbox | Random boolean |
+| Date | Recent date within 365 days |
+| User | Mock user object `{id, title, email}` |
+| Attachment | Mock attachment objects |
+| Link | Random IDs from linked table |
+
+### tables Commands
+
+| Command | Description |
+| --- | --- |
+| `tables create --base-id <baseId> --name <name>` | Create a new table (without records) |
+| `tables describe-schema` | **Output field schema documentation for AI reference** |
+
+> **Important**: Before creating tables, **you must run `tables describe-schema`** to get the full field schema documentation and avoid validation errors.
+
+**tables create Options:**
+| Option | Description |
+|--------|-------------|
+| `--base-id <baseId>` | Required: Base ID where table will be created |
+| `--name <name>` | Required: Table name |
+| `--fields <json>` | Optional: JSON array of field definitions |
+
+**Critical validation rules (must follow):**
+
+1. **SingleSelect/MultipleSelect choices must include a `color` property** - e.g. `{"name": "Todo", "color": "blueLight1"}`
+2. **Link fields must include `foreignTableId` and `lookupFieldId`** - query the target table to get these IDs first
+3. **Each table can only have one field with `isPrimary: true`**
+
+**Field Definition Format:**
+
+```json
+[
+  { "type": "singleLineText", "name": "Title", "isPrimary": true },
+  { "type": "number", "name": "Amount" },
+  { "type": "date", "name": "Due Date" },
+  {
+    "type": "singleSelect",
+    "name": "Status",
+    "options": {
+      "choices": [
+        { "name": "Todo", "color": "grayLight1" },
+        { "name": "Done", "color": "greenLight1" }
+      ]
+    }
+  },
+  { "type": "checkbox", "name": "Completed" }
+]
+```
+
+**Link Field Example:**
+
+```json
+{
+  "type": "link",
+  "name": "Company",
+  "options": {
+    "relationship": "manyOne",
+    "foreignTableId": "tblXXXXXXXX",
+    "lookupFieldId": "fldYYYYYYYY"
+  }
+}
+```
+
+- `relationship`: `oneOne` (1:1), `oneMany` (1:N), `manyOne` (N:1), `manyMany` (N:N)
+- `lookupFieldId`: Primary field ID of the foreign table (usually the first field)
+
+**Supported Field Types:**
+
+- `singleLineText`, `longText`, `number`, `date`, `checkbox`
+- `singleSelect`, `multipleSelect` (requires `options.choices` with color)
+- `rating`, `attachment`, `user`
+- `link` (requires `options.foreignTableId`, `options.lookupFieldId`, `options.relationship`)
+- `formula`, `rollup`, `lookup` (computed fields)
+- `autoNumber`, `createdTime`, `lastModifiedTime`, `createdBy`, `lastModifiedBy`
+
+**Valid Colors for Select Choices:**
+`blueLight2`, `blueLight1`, `blueBright`, `blue`, `blueDark1`,
+`cyanLight2`, `cyanLight1`, `cyanBright`, `cyan`, `cyanDark1`,
+`grayLight2`, `grayLight1`, `grayBright`, `gray`, `grayDark1`,
+`greenLight2`, `greenLight1`, `greenBright`, `green`, `greenDark1`,
+`orangeLight2`, `orangeLight1`, `orangeBright`, `orange`, `orangeDark1`,
+`pinkLight2`, `pinkLight1`, `pinkBright`, `pink`, `pinkDark1`,
+`purpleLight2`, `purpleLight1`, `purpleBright`, `purple`, `purpleDark1`,
+`redLight2`, `redLight1`, `redBright`, `red`, `redDark1`,
+`tealLight2`, `tealLight1`, `tealBright`, `teal`, `tealDark1`,
+`yellowLight2`, `yellowLight1`, `yellowBright`, `yellow`, `yellowDark1`
+
+## Common Diagnostic Scenarios
+
+### Scenario 1: Formula Field Calculation Error
+
+1. View field config: `underlying field --field-id fld...`
+2. Check dependencies: `relations --field-id fld... --direction down`
+3. Verify dependent fields are correct
+
+### Scenario 2: Lookup/Rollup Data Inconsistency
+
+1. View lookup field config: `underlying field --field-id fld...`
+2. Check `lookupOptions`: linkFieldId, foreignTableId, lookupFieldId
+3. Verify the linked link field is correct
+
+### Scenario 3: Field Update Not Propagating
+
+1. Find downstream dependents: `relations --field-id fld... --direction up --level 3`
+2. Check if any dependent field has errors: look for `hasError: true`
+3. View specific field config: `underlying field --field-id <fieldId>`
+
+### Scenario 4: Analyze Computed Update Performance
+
+1. Explain the command: `explain create --table-id tbl...`
+2. Check `computedImpact.updateSteps` for the update plan
+3. 
Cross-check dependencies using `relations` on key fields (formula/link/lookup/rollup) to confirm `reference`-derived edges are present; do not rely solely on explain output. +4. Look at `complexity.score` and `recommendations` +5. Use `--analyze` flag for actual execution timing + +### Scenario 5: Data Inconsistency Between Application and Database + +When data shown in the UI doesn't match what you expect, compare application layer and database layer: + +1. **Query via application layer (stored mode)**: + + ```bash + pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --limit 10 --mode stored + ``` + +2. **Query via application layer (computed mode)**: + + ```bash + pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --limit 10 --mode computed + ``` + +3. **Query directly from database**: + ```bash + pnpm --filter @teable/v2-devtools cli underlying records --table-id tbl... --limit 10 + ``` + +**Compare the results:** + +- If `stored` ≠ `computed`: The stored cache is stale, computed values haven't been persisted +- If `stored` ≠ `underlying`: Application layer transformation issue +- If `computed` ≠ `underlying`: Field calculation logic issue + +### Scenario 6: Creating and Managing Test Records + +When you need to quickly create, update, or delete test records for debugging: + +1. **Create a test record**: + + ```bash + pnpm --filter @teable/v2-devtools cli records create --table-id tbl... --fields '{"Name":"Test Record","Status":"Todo"}' + ``` + +2. **Update the record** (use the recordId from step 1): + + ```bash + pnpm --filter @teable/v2-devtools cli records update --table-id tbl... --record-id rec... --fields '{"Status":"Done"}' + ``` + +3. **Delete test records when done**: + ```bash + pnpm --filter @teable/v2-devtools cli records delete --table-id tbl... --record-ids rec1,rec2 + ``` + +**Tip:** Use `--typecast` when you want to input human-readable values (like link field titles instead of record IDs). 
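The Scenario 5 comparison rules can be scripted once the three dumps are captured to files. The sketch below uses hypothetical names (`classify_drift`, the `/tmp/*.txt` files) and stand-in file contents rather than real CLI output; redirect the three CLI commands above into the files to use it for real.

```shell
# classify_drift STORED COMPUTED UNDERLYING
# Compares three captured record dumps pairwise and reports which layers
# drift, mirroring the Scenario 5 rules. Inputs are stand-in files here.
classify_drift() {
  cmp -s "$1" "$2" || echo "stored != computed: stale stored cache"
  cmp -s "$1" "$3" || echo "stored != underlying: app-layer transformation issue"
  cmp -s "$2" "$3" || echo "computed != underlying: field calculation issue"
}

printf 'a=1\n' > /tmp/stored.txt
printf 'a=2\n' > /tmp/computed.txt
printf 'a=2\n' > /tmp/underlying.txt
classify_drift /tmp/stored.txt /tmp/computed.txt /tmp/underlying.txt
```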
+
+### Scenario 7: Slow Query Performance (Missing Indexes)
+
+When queries are slow, especially for Link fields or tables with many records:
+
+1. **Check schema for the entire table**:
+
+   ```bash
+   pnpm --filter @teable/v2-devtools cli schema table --table-id tbl...
+   ```
+
+2. **Look for errors in the output**, especially:
+
+   - `index:*` rules with `status: error` - missing index on foreign key column
+   - `unique_index:*` rules - missing unique index for one-to-one relations
+   - `junction_index:*` rules - missing indexes on junction tables (many-to-many)
+
+3. **Check a specific Link field**:
+
+   ```bash
+   pnpm --filter @teable/v2-devtools cli schema field --table-id tbl... --field-id fldLinkField
+   ```
+
+4. **Common missing index patterns**:
+   - Link field (one-to-many): Should have `index` on `fld_{fieldId}__id` column
+   - Link field (one-to-one): Should have `unique_index` on `fld_{fieldId}__id` column
+   - Link field (many-to-many): Junction table should have indexes on both FK columns
+
+## Global Options
+
+- `-c, --connection <url>` - Override DATABASE_URL/PRISMA_DATABASE_URL
+- `--help` - Show help message
+
+## Connection
+
+Connection is resolved in the following order:
+
+1. `-c, --connection <url>` command line option
+2. `PRISMA_DATABASE_URL` environment variable
+3. `DATABASE_URL` environment variable
+4. Default: `postgresql://teable:teable@127.0.0.1:5432/teable?schema=public`
+
+## PGlite Mode (Temporary Database)
+
+DevTools supports **pglite** for file-persisted temporary databases. This is useful for testing table creation and other operations without a real PostgreSQL server.
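A fresh session path can be minted with a timestamp, following the `.pglite-data/session-*` naming convention used in this document's examples (the convention is this doc's, not something the CLI enforces):

```shell
# Mint a unique pglite connection string; keep it for the whole session.
SESSION_URL="pglite://.pglite-data/session-$(date +%s)"
echo "$SESSION_URL"
```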
+
+### When to Use PGlite
+
+Use pglite (`pglite://` connection string) when:
+
+- **Creating temporary tables for testing** - no existing database needed
+- **Testing table schema designs** before deploying to production
+- **Isolated experiments** that shouldn't affect real data
+- **No PostgreSQL server available** (local development without Docker)
+
+### When NOT to Use PGlite
+
+Do NOT use pglite when:
+
+- **User provided a real database URL** (postgresql://)
+- **Verifying existing IDs** (tableId, fieldId, recordId, baseId)
+- **Querying production/development data**
+- **Debugging issues with real tables**
+
+### PGlite Connection String Format
+
+```
+pglite://<path-to-data-directory>
+```
+
+Examples:
+
+- `pglite://.pglite-data/session-001` (relative path)
+- `pglite:///absolute/path/to/data` (absolute path)
+
+### Using PGlite
+
+**Step 1: Create a pglite session**
+
+First, create a table with a unique pglite connection string. The CLI will automatically:
+
+- Create the data directory
+- Initialize the database schema
+- Create a space and base
+- Return the generated baseId
+
+```bash
+# Create a new pglite session with a table
+pnpm --filter @teable/v2-devtools cli tables create \
+  --connection "pglite://.pglite-data/session-$(date +%s)" \
+  --base-id "bseXXXXXXXXXXXXX" \
+  --name "Test Table"
+```
+
+> **Note**: For the first command, you need to provide any baseId (it will be created). Check the output for the actual baseId to use in subsequent commands.
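To avoid losing the session between commands, it helps to pin both values in shell variables right after Step 1. The IDs below are placeholders; substitute the baseId the CLI printed. The sketch only echoes the reuse command rather than running it.

```shell
# Pin the session; later commands interpolate the same two values.
PGLITE_URL="pglite://.pglite-data/session-1234567890"   # from Step 1
BASE_ID="bseXXXXXXXXXXXXX"                              # from Step 1 output
echo "pnpm --filter @teable/v2-devtools cli underlying tables --connection \"$PGLITE_URL\" --base-id \"$BASE_ID\""
```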
+ +**Step 2: Reuse the same session** + +In the same conversation/session, **remember and reuse** the same connection string and baseId: + +```bash +# Query tables in the same pglite database +pnpm --filter @teable/v2-devtools cli underlying tables \ + --connection "pglite://.pglite-data/session-1234567890" \ + --base-id "bseXXXXXXXXXXXXX" + +# Create more tables in the same base +pnpm --filter @teable/v2-devtools cli tables create \ + --connection "pglite://.pglite-data/session-1234567890" \ + --base-id "bseXXXXXXXXXXXXX" \ + --name "Another Table" +``` + +### Important Notes for AI + +1. **Remember the session**: Store the pglite connection string and baseId for the entire conversation +2. **Data persists in files**: Data is saved to `.pglite-data/` directory (git-ignored) +3. **Isolated sessions**: Each unique path creates a separate database +4. **First-time init**: The first command to a new pglite path will initialize schema + space + base + +### Data Storage + +PGlite data is stored in: + +``` +packages/v2/devtools/.pglite-data/ +├── session-1234567890/ +│ ├── ... (pglite database files) +├── session-0987654321/ +│ └── ... +``` + +This directory is git-ignored and can be safely deleted to clean up test data. + +## Empty Data Handling + +When queries return no data, the CLI provides clear feedback: + +- `code: EMPTY_RESULT` indicates no data was found +- The error message includes hints about what to check + +**If you see EMPTY_RESULT, report to the user** that the requested data was not found in the database. 
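A minimal sketch of that reporting rule; the captured output here is a stand-in, and only the `code: EMPTY_RESULT` marker is taken from the description above:

```shell
# Surface EMPTY_RESULT from captured CLI output instead of ignoring it.
OUT="code: EMPTY_RESULT"   # stand-in for real captured output
case "$OUT" in
  *EMPTY_RESULT*) echo "Requested data was not found in the database." ;;
  *) echo "Data returned." ;;
esac
```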
diff --git a/.claude/skills/teable-v2-package-guide/SKILL.md b/.claude/skills/teable-v2-package-guide/SKILL.md
new file mode 100644
index 0000000000..53f911bf84
--- /dev/null
+++ b/.claude/skills/teable-v2-package-guide/SKILL.md
@@ -0,0 +1,209 @@
+# Teable V2 Package Creation Guide
+
+## When to Use This Skill
+
+Use this skill when you need to:
+- Create a new v2 package
+- Configure package.json for v2 packages
+- Set up TypeScript configuration
+- Understand the v2 package conventions
+
+## Package Location
+
+All v2 packages live in the `packages/v2/` directory.
+
+## Creating a New Package
+
+### 1. Directory Structure
+
+```
+packages/v2/<package-name>/
+├── src/
+│   └── index.ts           # Main entry point
+├── package.json
+├── tsconfig.json
+├── tsconfig.build.json
+└── tsdown.config.ts       # Optional, for custom build config
+```
+
+### 2. Package.json Configuration
+
+**IMPORTANT: Development-Friendly Exports**
+
+The key to avoiding rebuilds during development is the `exports` configuration:
+
+```json
+{
+  "name": "@teable/v2-<package-name>",
+  "version": "0.0.0",
+  "private": true,
+  "license": "MIT",
+  "type": "module",
+  "sideEffects": false,
+  "main": "dist/index.cjs",
+  "module": "dist/index.js",
+  "types": "src/index.ts",
+  "exports": {
+    ".": {
+      "types": "./src/index.ts",
+      "import": "./src/index.ts",
+      "module": "./dist/index.js",
+      "require": "./dist/index.cjs"
+    }
+  },
+  "files": [
+    "dist",
+    "src"
+  ],
+  "scripts": {
+    "build": "tsdown --tsconfig tsconfig.build.json",
+    "dev": "tsdown --tsconfig tsconfig.build.json --watch",
+    "clean": "rimraf ./dist ./coverage ./tsconfig.tsbuildinfo ./tsconfig.build.tsbuildinfo ./.eslintcache",
+    "lint": "eslint . --ext .ts,.js,.mjs,.cjs,.mts,.cts --cache --cache-location ../../../.cache/eslint/v2-<package-name>.eslintcache",
+    "typecheck": "tsc --project ./tsconfig.json --noEmit",
+    "test-unit": "vitest run --silent",
+    "test-unit-cover": "pnpm test-unit --coverage",
+    "fix-all-files": "eslint . 
--ext .ts,.js,.mjs,.cjs,.mts,.cts --fix" + }, + "dependencies": { + // Add your dependencies here + }, + "devDependencies": { + "@teable/v2-tsdown-config": "workspace:*", + "@teable/eslint-config-bases": "workspace:^", + "@types/node": "22.18.0", + "@vitest/coverage-v8": "4.0.16", + "eslint": "8.57.0", + "prettier": "3.2.5", + "rimraf": "5.0.5", + "tsdown": "0.18.1", + "typescript": "5.4.3", + "vite-tsconfig-paths": "4.3.2", + "vitest": "4.0.16" + } +} +``` + +### Key Export Configuration Explained + +```json +"exports": { + ".": { + "types": "./src/index.ts", // TypeScript types from source + "import": "./src/index.ts", // ESM import uses source directly + "module": "./dist/index.js", // Bundlers use built output + "require": "./dist/index.cjs" // CommonJS uses built output + } +} +``` + +**Why this works:** +- `types` and `import` point to `./src/index.ts` - this allows other packages to import TypeScript source directly during development +- `module` and `require` point to `./dist/` - this is used by bundlers and production builds +- No rebuild needed when you change source code - other packages see changes immediately + +### 3. TypeScript Configuration + +**tsconfig.json** (for development/typechecking): +```json +{ + "extends": "../../../tsconfig.base.json", + "compilerOptions": { + "outDir": "./dist", + "rootDir": "./src", + "types": ["node"] + }, + "include": ["src/**/*"], + "exclude": ["node_modules", "dist"] +} +``` + +**tsconfig.build.json** (for building): +```json +{ + "extends": "./tsconfig.json", + "compilerOptions": { + "declaration": true, + "declarationMap": true, + "sourceMap": true + }, + "exclude": ["node_modules", "dist", "**/*.spec.ts", "**/*.test.ts"] +} +``` + +### 4. 
Build Configuration (Optional)
+
+**tsdown.config.ts**:
+```typescript
+import { defineConfig } from '@teable/v2-tsdown-config';
+
+export default defineConfig();
+```
+
+## Common Patterns
+
+### Workspace Dependencies
+
+Use `workspace:*` for internal dependencies:
+```json
+"dependencies": {
+  "@teable/v2-core": "workspace:*",
+  "@teable/v2-di": "workspace:*"
+}
+```
+
+### Dependency Injection
+
+V2 packages use `@teable/v2-di` for DI:
+```typescript
+import { injectable, inject } from '@teable/v2-di';
+
+@injectable()
+export class MyService {
+  constructor(
+    @inject(someToken) private readonly dependency: SomeDependency
+  ) {}
+}
+```
+
+### Error Handling
+
+Use `neverthrow` for Result types (the success type here is illustrative):
+```typescript
+import { ok, err, type Result } from 'neverthrow';
+import { domainError, type DomainError } from '@teable/v2-core';
+
+function doSomething(input: string | null): Result<string, DomainError> {
+  if (input === null) {
+    return err(domainError.invariant({ message: 'Something went wrong' }));
+  }
+  return ok(input);
+}
+```
+
+## Checklist for New Package
+
+- [ ] Create directory `packages/v2/<package-name>/`
+- [ ] Create `src/index.ts` with exports
+- [ ] Configure `package.json` with correct exports (pointing to source)
+- [ ] Create `tsconfig.json` and `tsconfig.build.json`
+- [ ] Add to root `pnpm-workspace.yaml` if needed
+- [ ] Run `pnpm install` to link workspace dependencies
+- [ ] Verify imports work without building: `pnpm typecheck`
+
+## Troubleshooting
+
+### "Cannot find module" errors
+1. Check that `exports` in `package.json` points to correct paths
+2. Ensure `types` field points to `src/index.ts`
+3. Run `pnpm install` to refresh workspace links
+
+### Changes not reflected in other packages
+1. Verify `exports.import` points to `./src/index.ts` (not `./dist/`)
+2. Check if the consuming package has cached the old build
+3. Restart TypeScript server in your IDE
+
+### Build errors
+1. Ensure `tsconfig.build.json` excludes test files
+2. Check that all dependencies are properly declared
+3. 
Run `pnpm clean` before rebuilding diff --git a/.claude/skills/teable-v2-table-template/SKILL.md b/.claude/skills/teable-v2-table-template/SKILL.md new file mode 100644 index 0000000000..21c211d29f --- /dev/null +++ b/.claude/skills/teable-v2-table-template/SKILL.md @@ -0,0 +1,65 @@ +--- +name: teable-v2-table-template +description: Create or update Teable v2 table templates in packages/v2/table-templates (template seeds, fields, records, and exports). +--- + +# Teable v2 Table Template Skill + +Use this skill when you need to add or modify table templates in the v2 codebase. Templates live in `packages/v2/table-templates/src/index.ts`. + +## Quick workflow + +1. Open the template source file: `packages/v2/table-templates/src/index.ts`. +2. Add a seed builder: + - Single table: create a `createXSeed(): SingleTableSeed`. + - Multi table: create a `createXTemplateSeed(): TemplateSeed`. +3. Use helpers for IDs and select options: + - `createFieldId()` for field IDs + - `createTableId()` if you must predefine table IDs + - `createSelectOption()` for single/multi-select choices +4. Create the template definition: + - Single table: `singleTable(key, name, description, createXSeed, defaultRecordCount)` + - Multi table: `createTemplate(key, name, description, createXTemplateSeed, defaultRecordCount)` +5. Export the template and add it to `tableTemplates` array. +6. If you need a field-only helper, export `createXFields = () => createXSeed().fields;`. + +## Notes and conventions + +- Keep templates in `packages/v2/table-templates/src/index.ts` (this package is the single source of truth). +- The `createInput` generator in `TableTemplateDefinition` handles optional record seeding and name prefixing. You only need to supply seed fields and records. +- Prefer `singleTable(...)` unless the template truly needs multiple tables (e.g., CRM with Companies + Contacts). +- Use string keys that are stable and URL-safe (e.g., `content-calendar`, `bug-triage`). 
+- When seeding records, keep records small and representative; use `normalizeTemplateRecords` behavior to cap or pad. +- New templates should cover as many field types as possible, as long as the business context makes sense (use `allFieldTypesTemplate` for inspiration). + +## Example pattern + +```ts +const createMyTemplateSeed = (): SingleTableSeed => { + const nameFieldId = createFieldId(); + return { + fields: [{ type: 'singleLineText', id: nameFieldId, name: 'Name' }], + records: [{ fields: { [nameFieldId]: 'Example' } }], + }; +}; + +export const myTemplate = singleTable( + 'my-template', + 'My Template', + 'Short description.', + createMyTemplateSeed, + 1 +); + +export const tableTemplates = [ + // ...existing templates, + myTemplate, +] as const; +``` + +## References + +- Source of truth: `packages/v2/table-templates/src/index.ts` +- Package note: `packages/v2/table-templates/ARCHITECTURE.md` +- E2E contract: ensure `creates tables for every template with seeded records` passes in `packages/v2/e2e`. +- Suggested run: `pnpm -C packages/v2/e2e test -- --runInBand --testNamePattern "creates tables for every template with seeded records"` diff --git a/.claude/skills/teable-v2-test-debug/SKILL.md b/.claude/skills/teable-v2-test-debug/SKILL.md new file mode 100644 index 0000000000..4282d0de6f --- /dev/null +++ b/.claude/skills/teable-v2-test-debug/SKILL.md @@ -0,0 +1,73 @@ +--- +name: teable-v2-test-debug +description: Debug Teable v2 tests and failing test cases by prioritizing data reproduction and inspection. Use when asked to debug a test file/spec (unit, integration, or e2e) in packages/v2/*, especially when failures might be caused by table schema, relations, or stored/computed data drift; workflow uses v2-devtools CLI to create a similar table first, then inspects real DB data/relations, and only then reviews code logic. +--- + +# Teable V2 Test Debug + +## Overview + +Follow a data-first debugging workflow for Teable v2 tests. 
The default order is: reproduce data with devtools, inspect real DB data/relations, then analyze code logic. + +## Workflow: Data-first test debugging + +### 1) Capture failure context + +- Identify the failing test name, file path, and the exact assertion that failed. +- Note the expected vs actual values and any IDs shown in logs (base/table/field/record). +- If the failure is e2e or integration, confirm which base or seed data was used. + +### 2) Reproduce with devtools first (create similar table) + +- Use the v2-devtools CLI to create a minimal table that mirrors the test schema. +- Prefer CLI-based table creation and mock data over hand-written SQL. +- If the schema is complex, build only the fields involved in the failing assertion. + +Common commands: + +```bash +# Get field schema documentation before creating tables +pnpm --filter @teable/v2-devtools cli tables describe-schema + +# Create a table with minimal fields +pnpm --filter @teable/v2-devtools cli tables create --base-id bse... --name "Test Table" --fields '[{"type":"singleLineText","name":"Name","isPrimary":true}]' + +# Generate mock records if data shape matters +pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 10 --seed 12345 +``` + +If the v2-devtools skill exists, open `.codex/skills/teable-v2-devtools/SKILL.md` (relative to the repository root) for the full command reference and validation rules. + +### 3) Inspect real DB data and relations + +- Compare application layer vs underlying data first; use stored/computed modes. +- Inspect dependencies and relations to confirm lookup/rollup/formula inputs. +- Validate schema constraints if missing indexes or FK columns are suspected. + +Common commands: + +```bash +# App-layer data (stored/computed) vs underlying +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --mode stored --limit 10 +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... 
--mode computed --limit 10 +pnpm --filter @teable/v2-devtools cli underlying records --table-id tbl... --limit 10 + +# Inspect a field and its dependencies +pnpm --filter @teable/v2-devtools cli underlying field --field-id fld... +pnpm --filter @teable/v2-devtools cli relations --field-id fld... --direction up --level 2 + +# Check schema integrity if queries are slow or failing +pnpm --filter @teable/v2-devtools cli schema table --table-id tbl... +``` + +### 4) Only then review code logic + +- Map the observed data mismatch back to the handler, visitor, or mapper. +- Verify spec/visitor logic before touching application wiring. +- If the bug is only reproducible with real DB data, prefer adjusting fixtures or seeding rather than altering logic. + +### 5) Decide next action + +- If app-layer vs underlying differs, focus on computed/stored pipeline and mappers. +- If dependencies are wrong, fix field definitions or relation setup first. +- If reproduction fails on minimal data, debug core logic with a tight fixture. diff --git a/.codex/skills/teable-v2-devtools/SKILL.md b/.codex/skills/teable-v2-devtools/SKILL.md new file mode 100644 index 0000000000..25dc86cffc --- /dev/null +++ b/.codex/skills/teable-v2-devtools/SKILL.md @@ -0,0 +1,608 @@ +name: teable-v2-devtools +description: Teable v2 developer tools CLI for debugging, inspecting, and generating test data. Combines debug-data and mock-records capabilities into a unified CLI using Effect CLI framework. 
+ +--- + +# Teable V2 DevTools CLI + +## When to Use This Skill + +Use this skill when you need to: + +- View table/field configuration details +- Diagnose formula/lookup/rollup issues +- Understand field dependency relationships +- Analyze computed field update plans (explain commands) +- Generate mock/test data for tables +- **Query records data** (via application layer or direct database access) +- **Create, update, delete records** (via application layer commands) +- **Check database schema** (indexes, constraints, columns) for missing or broken indexes +- **Create tables** (via CLI, without records) + +> **Important**: When you need to inspect database data, **prefer DevTools CLI over psql**. DevTools outputs structured TOON format, which is easier for AI analysis and supports comparing application-layer and database-layer results. + +## Development Notes (CRITICAL) + +### Rebuild After Modifying Dependencies + +DevTools CLI depends on multiple v2 packages, and it uses **compiled dist outputs** rather than TypeScript sources. This is because it relies on parameter decorators (`@inject()`), which tsx/esbuild cannot run directly. 
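Because the CLI loads compiled `dist` output, a quick freshness check can confirm whether a rebuild is needed before you chase a phantom bug. This is a hypothetical helper sketch (the `is_stale` function is illustrative, not part of the devtools CLI; it compares file times against the `dist` directory's mtime, so treat it as a heuristic):

```shell
# Hypothetical helper: print "stale" if any .ts under src/ is newer than dist/.
is_stale() {
  src_dir="$1"
  dist_dir="$2"
  # Never built at all counts as stale.
  [ -d "$dist_dir" ] || { echo "stale"; return 0; }
  if [ -n "$(find "$src_dir" -name '*.ts' -newer "$dist_dir" 2>/dev/null)" ]; then
    echo "stale"
  else
    echo "fresh"
  fi
}

# Example: is_stale packages/v2/core/src packages/v2/core/dist
```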
+ +**If you modify any of the following packages, you must rebuild them before running the CLI:** + +```bash +# If you modify adapter-table-repository-postgres +pnpm --filter @teable/v2-adapter-table-repository-postgres build + +# If you modify command-explain +pnpm --filter @teable/v2-command-explain build + +# If you modify debug-data +pnpm --filter @teable/v2-debug-data build + +# If you modify core +pnpm --filter @teable/v2-core build +``` + +**Recommended: use watch mode for auto-rebuilds** + +```bash +# Start watch mode in a separate terminal +pnpm --filter @teable/v2-adapter-table-repository-postgres dev +``` + +**Common pitfalls:** + +- CLI output doesn’t change after code edits → forgot to rebuild +- console.log/console.error never prints → forgot to rebuild +- newly added types/functions missing → forgot to rebuild + +## Quick Commands + +All commands output TOON format for AI consumption. + +### Debug Commands + +```bash +# View underlying table metadata +pnpm --filter @teable/v2-devtools cli underlying table --table-id tbl... + +# List all tables in a base +pnpm --filter @teable/v2-devtools cli underlying tables --base-id bse... + +# View field configuration (diagnose formula issues) +pnpm --filter @teable/v2-devtools cli underlying field --field-id fld... + +# List all fields in a table +pnpm --filter @teable/v2-devtools cli underlying fields --table-id tbl... + +# View field dependencies (diagnose computed field propagation) +pnpm --filter @teable/v2-devtools cli relations --field-id fld... --direction up --level 2 + +# Explain CreateRecord (analyze computed update plan) +pnpm --filter @teable/v2-devtools cli explain create --table-id tbl... + +# Explain UpdateRecord +pnpm --filter @teable/v2-devtools cli explain update --table-id tbl... --record-id rec... --fields '{"Name":"test"}' + +# Explain DeleteRecords +pnpm --filter @teable/v2-devtools cli explain delete --table-id tbl... 
--record-ids rec1,rec2 +``` + +### Schema Check Commands + +Use these commands to verify database schema integrity, especially when you suspect missing indexes might be causing slow queries. + +```bash +# Check all fields in a table for missing indexes, constraints, columns +pnpm --filter @teable/v2-devtools cli schema table --table-id tbl... + +# Check a specific field for missing schema elements +pnpm --filter @teable/v2-devtools cli schema field --table-id tbl... --field-id fld... +``` + +### Records Query Commands + +```bash +# List records via application layer (stored mode - pre-computed values) +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --limit 100 --offset 0 + +# List records via application layer (computed mode - calculated on-the-fly) +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --mode computed + +# Get single record via application layer +pnpm --filter @teable/v2-devtools cli records get --table-id tbl... --record-id rec... + +# List records directly from underlying PostgreSQL table (raw data) +pnpm --filter @teable/v2-devtools cli underlying records --table-id tbl... --limit 100 + +# Get single record directly from underlying PostgreSQL table +pnpm --filter @teable/v2-devtools cli underlying record --table-id tbl... --record-id rec... +``` + +### Records Mutation Commands + +```bash +# Create a new record +pnpm --filter @teable/v2-devtools cli records create --table-id tbl... --fields '{"Name":"New Record"}' + +# Create a record with typecast (auto-convert values) +pnpm --filter @teable/v2-devtools cli records create --table-id tbl... --fields '{"Name":"Test"}' --typecast + +# Update an existing record +pnpm --filter @teable/v2-devtools cli records update --table-id tbl... --record-id rec... --fields '{"Name":"Updated Name"}' + +# Update with typecast +pnpm --filter @teable/v2-devtools cli records update --table-id tbl... --record-id rec... 
--fields '{"Status":"Done"}' --typecast + +# Delete records (comma-separated IDs) +pnpm --filter @teable/v2-devtools cli records delete --table-id tbl... --record-ids rec1,rec2,rec3 +``` + +### Mock Data Commands + +```bash +# Generate 100 mock records +pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 100 + +# Generate with reproducible seed +pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 50 --seed 12345 + +# Dry run (preview without inserting) +pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 10 --dry-run +``` + +### Table Management Commands + +```bash +# Create a simple table with default fields (just a primary Name field) +pnpm --filter @teable/v2-devtools cli tables create --base-id bse... --name "My Table" + +# Create a table with custom fields (select choices must include a color) +pnpm --filter @teable/v2-devtools cli tables create --base-id bse... --name "Tasks" --fields '[{"type":"singleLineText","name":"Title","isPrimary":true},{"type":"singleSelect","name":"Status","options":{"choices":[{"name":"Todo","color":"grayLight1"},{"name":"Done","color":"greenLight1"}]}},{"type":"date","name":"Due Date"}]' +``` + +## Command Reference + +### underlying Commands + +| Command | Description | +| ---------------------------------------------------- | -------------------------------------------------------------------- | +| `underlying table --table-id <tableId>` | Get raw table metadata | +| `underlying tables --base-id <baseId>` | List all tables in a base | +| `underlying field --field-id <fieldId>` | Get field metadata (includes parsed options/meta JSON) | +| `underlying fields --table-id <tableId>` | List all fields in a table | +| `underlying records --table-id <tableId>` | List records directly from PostgreSQL (raw data with system columns) | +| `underlying record --table-id <tableId> --record-id <recordId>` | Get single record directly from PostgreSQL | + +### records Commands (Application Layer) + +| Command | Description | +| -------------------------------------------------- | 
----------------------------------------- | +| `records list --table-id <tableId>` | List records via query repository | +| `records get --table-id <tableId> --record-id <recordId>` | Get single record via query repository | +| `records create --table-id <tableId> --fields <json>` | Create a new record via command bus | +| `records update --table-id <tableId> --record-id <recordId> --fields <json>` | Update an existing record via command bus | +| `records delete --table-id <tableId> --record-ids <recordIds>` | Delete records via command bus | + +**Records Query Options:** +| Option | Description | +|--------|-------------| +| `--table-id <tableId>` | Required: Table ID | +| `--record-id <recordId>` | Required for get: Record ID | +| `--limit <n>` | Max records to return (default: 100) | +| `--offset <n>` | Records to skip (default: 0) | +| `--mode stored\|computed` | Query mode (default: stored) | + +**Mode Explanation:** + +- `stored`: Read pre-computed values from the database (fast, uses cached values) +- `computed`: Calculate field values on-the-fly (slower, always fresh) + +**Records Mutation Options:** +| Option | Description | +|--------|-------------| +| `--table-id <tableId>` | Required: Table ID | +| `--record-id <recordId>` | Required for update: Record ID | +| `--record-ids <recordIds>` | Required for delete: Comma-separated record IDs | +| `--fields <json>` | JSON object of field values (required for update, optional for create) | +| `--typecast` | Enable typecast mode to auto-convert values (default: false) | + +**Typecast Mode:** + +When `--typecast` is enabled, the system will attempt to convert input values to the correct field types: +- String "123" → Number 123 +- Link field titles → Link field record IDs +- Date strings → Date objects + +### relations Command + +| Option | Description | +| ---------------------------- | ------------------------------------------------------------------- | +| `--field-id <fieldId>` | Required: Starting field ID | +| `--direction up\|down\|both` | `up` = who depends on me, `down` = what I depend on (default: both) | +| `--level <n>` | Max traversal depth (default: unlimited) | 
+| `--same-table` | Only traverse same-table relations | + +### schema Commands + +Use these commands when analyzing slow queries or suspecting missing indexes. + +| Command | Description | +| ---------------------------------------------- | --------------------------------------------- | +| `schema table --table-id <tableId>` | Check all fields in a table for schema issues | +| `schema field --table-id <tableId> --field-id <fieldId>` | Check a specific field for schema issues | + +**Schema Check Output:** + +The output includes a summary with: + +- `total`: Total number of schema rules checked +- `success`: Rules that passed validation +- `errors`: Critical issues (missing indexes, columns, constraints) +- `warnings`: Non-critical issues + +Each result item includes: + +- `fieldId`, `fieldName`: The field being checked +- `ruleId`: Type of rule (e.g., `index`, `unique_index`, `fk_column`, `fk`) +- `ruleDescription`: Human-readable description +- `status`: `success`, `error`, or `warn` +- `message`: Details about the issue +- `details.missing`: List of missing schema objects (index names, column names, etc.) + +**Rule Types Checked:** +| Rule Type | Description | +|-----------|-------------| +| `column` | Physical column exists | +| `fk_column` | Foreign key column exists | +| `index` | Non-unique index exists (for FK lookups) | +| `unique_index` | Unique index exists (for one-to-one relations) | +| `fk` | Foreign key constraint exists | +| `junction_table` | Junction table exists (many-to-many) | +| `junction_index` | Junction table indexes exist | +| `junction_fk` | Junction table foreign keys exist | +| `generated_column` | Generated column (auto-number, created_time, etc.) 
| + +### explain Commands + +| Command | Description | +| ----------------------------------------------------------------- | ----------------------------- | +| `explain create --table-id <tableId>` | Explain CreateRecord command | +| `explain update --table-id <tableId> --record-id <recordId> --fields <json>` | Explain UpdateRecord command | +| `explain delete --table-id <tableId> --record-ids <recordIds>` | Explain DeleteRecords command | + +**Explain Options:** +| Option | Description | +|--------|-------------| +| `--table-id <tableId>` | Required: Table ID | +| `--record-id <recordId>` | Required for update: Record ID | +| `--record-ids <recordIds>` | Required for delete: Comma-separated record IDs | +| `--fields <json>` | JSON object of field values (required for update, optional for create) | +| `--analyze` | Run EXPLAIN ANALYZE for actual execution stats (default: false) | + +### mock Commands + +| Option | Description | +| ------------------ | --------------------------------------------------------- | +| `--table-id <tableId>` | Required: Table ID to generate records for | +| `--count <n>` | Required: Number of records to generate | +| `--seed <n>` | Optional: Seed for reproducible random data | +| `--batch-size <n>` | Optional: Batch size for insertion (default: 100) | +| `--dry-run` | Optional: Only show what would be generated, don't insert | + +**Supported Field Types for Mock Data:** + +| Field Type | Generated Data | +| -------------- | ------------------------------------------ | +| SingleLineText | Names/emails/URLs/phones (based on showAs) | +| LongText | Lorem ipsum paragraphs | +| Number | Random floats 0-1000 | +| Rating | Random integers 1 to max rating | +| SingleSelect | Random selection from options | +| MultipleSelect | 1-3 random options | +| Checkbox | Random boolean | +| Date | Recent date within 365 days | +| User | Mock user object `{id, title, email}` | +| Attachment | Mock attachment objects | +| Link | Random IDs from linked table | + +### tables Commands + +| Command | Description | +| -------------------------------------------- | 
------------------------------------------------------ | +| `tables create --base-id <baseId> --name <name>` | Create a new table (without records) | +| `tables describe-schema` | **Output field schema documentation for AI reference** | + +> **Important**: Before creating tables, **you must run `tables describe-schema`** to get the full field schema documentation and avoid validation errors. + +**tables create Options:** +| Option | Description | +|--------|-------------| +| `--base-id <baseId>` | Required: Base ID where table will be created | +| `--name <name>` | Required: Table name | +| `--fields <json>` | Optional: JSON array of field definitions | + +**Critical validation rules (must follow):** + +1. **SingleSelect/MultipleSelect choices must include a `color` property** - e.g. `{"name": "Todo", "color": "blueLight1"}` +2. **Link fields must include `foreignTableId` and `lookupFieldId`** - query the target table to get these IDs first +3. **Each table can only have one field with `isPrimary: true`** + +**Field Definition Format:** + +```json +[ + { "type": "singleLineText", "name": "Title", "isPrimary": true }, + { "type": "number", "name": "Amount" }, + { "type": "date", "name": "Due Date" }, + { + "type": "singleSelect", + "name": "Status", + "options": { + "choices": [ + { "name": "Todo", "color": "grayLight1" }, + { "name": "Done", "color": "greenLight1" } + ] + } + }, + { "type": "checkbox", "name": "Completed" } +] +``` + +**Link Field Example:** + +```json +{ + "type": "link", + "name": "Company", + "options": { + "relationship": "manyOne", + "foreignTableId": "tblXXXXXXXX", + "lookupFieldId": "fldYYYYYYYY" + } +} +``` + +- `relationship`: `oneOne` (1:1), `oneMany` (1:N), `manyOne` (N:1), `manyMany` (N:N) +- `lookupFieldId`: Primary field ID of the foreign table (usually the first field) + +**Supported Field Types:** + +- `singleLineText`, `longText`, `number`, `date`, `checkbox` +- `singleSelect`, `multipleSelect` (requires `options.choices` with color) +- `rating`, `attachment`, 
`user` +- `link` (requires `options.foreignTableId`, `options.lookupFieldId`, `options.relationship`) +- `formula`, `rollup`, `lookup` (computed fields) +- `autoNumber`, `createdTime`, `lastModifiedTime`, `createdBy`, `lastModifiedBy` + +**Valid Colors for Select Choices:** +`blueLight2`, `blueLight1`, `blueBright`, `blue`, `blueDark1`, +`cyanLight2`, `cyanLight1`, `cyanBright`, `cyan`, `cyanDark1`, +`grayLight2`, `grayLight1`, `grayBright`, `gray`, `grayDark1`, +`greenLight2`, `greenLight1`, `greenBright`, `green`, `greenDark1`, +`orangeLight2`, `orangeLight1`, `orangeBright`, `orange`, `orangeDark1`, +`pinkLight2`, `pinkLight1`, `pinkBright`, `pink`, `pinkDark1`, +`purpleLight2`, `purpleLight1`, `purpleBright`, `purple`, `purpleDark1`, +`redLight2`, `redLight1`, `redBright`, `red`, `redDark1`, +`tealLight2`, `tealLight1`, `tealBright`, `teal`, `tealDark1`, +`yellowLight2`, `yellowLight1`, `yellowBright`, `yellow`, `yellowDark1` + +## Common Diagnostic Scenarios + +### Scenario 1: Formula Field Calculation Error + +1. View field config: `underlying field --field-id fld...` +2. Check dependencies: `relations --field-id fld... --direction down` +3. Verify dependent fields are correct + +### Scenario 2: Lookup/Rollup Data Inconsistency + +1. View lookup field config: `underlying field --field-id fld...` +2. Check `lookupOptions`: linkFieldId, foreignTableId, lookupFieldId +3. Verify the linked link field is correct + +### Scenario 3: Field Update Not Propagating + +1. Find downstream dependents: `relations --field-id fld... --direction up --level 3` +2. Check if any dependent field has errors: look for `hasError: true` +3. View specific field config: `underlying field --field-id <fieldId>` + +### Scenario 4: Analyze Computed Update Performance + +1. Explain the command: `explain create --table-id tbl...` +2. Check `computedImpact.updateSteps` for the update plan +3. 
Cross-check dependencies using `relations` on key fields (formula/link/lookup/rollup) to confirm `reference`-derived edges are present; do not rely solely on explain output. +4. Look at `complexity.score` and `recommendations` +5. Use `--analyze` flag for actual execution timing + +### Scenario 5: Data Inconsistency Between Application and Database + +When data shown in the UI doesn't match what you expect, compare application layer and database layer: + +1. **Query via application layer (stored mode)**: + + ```bash + pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --limit 10 --mode stored + ``` + +2. **Query via application layer (computed mode)**: + + ```bash + pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --limit 10 --mode computed + ``` + +3. **Query directly from database**: + ```bash + pnpm --filter @teable/v2-devtools cli underlying records --table-id tbl... --limit 10 + ``` + +**Compare the results:** + +- If `stored` ≠ `computed`: The stored cache is stale, computed values haven't been persisted +- If `stored` ≠ `underlying`: Application layer transformation issue +- If `computed` ≠ `underlying`: Field calculation logic issue + +### Scenario 6: Creating and Managing Test Records + +When you need to quickly create, update, or delete test records for debugging: + +1. **Create a test record**: + + ```bash + pnpm --filter @teable/v2-devtools cli records create --table-id tbl... --fields '{"Name":"Test Record","Status":"Todo"}' + ``` + +2. **Update the record** (use the recordId from step 1): + + ```bash + pnpm --filter @teable/v2-devtools cli records update --table-id tbl... --record-id rec... --fields '{"Status":"Done"}' + ``` + +3. **Delete test records when done**: + ```bash + pnpm --filter @teable/v2-devtools cli records delete --table-id tbl... --record-ids rec1,rec2 + ``` + +**Tip:** Use `--typecast` when you want to input human-readable values (like link field titles instead of record IDs). 
+ +### Scenario 7: Slow Query Performance (Missing Indexes) + +When queries are slow, especially for Link fields or tables with many records: + +1. **Check schema for the entire table**: + + ```bash + pnpm --filter @teable/v2-devtools cli schema table --table-id tbl... + ``` + +2. **Look for errors in the output**, especially: + + - `index:*` rules with `status: error` - missing index on foreign key column + - `unique_index:*` rules - missing unique index for one-to-one relations + - `junction_index:*` rules - missing indexes on junction tables (many-to-many) + +3. **Check a specific Link field**: + + ```bash + pnpm --filter @teable/v2-devtools cli schema field --table-id tbl... --field-id fldLinkField + ``` + +4. **Common missing index patterns**: + - Link field (one-to-many): Should have `index` on `fld_{fieldId}__id` column + - Link field (one-to-one): Should have `unique_index` on `fld_{fieldId}__id` column + - Link field (many-to-many): Junction table should have indexes on both FK columns + +## Global Options + +- `-c, --connection <url>` - Override DATABASE_URL/PRISMA_DATABASE_URL +- `--help` - Show help message + +## Connection + +Connection is resolved in the following order: + +1. `-c, --connection <url>` command line option +2. `PRISMA_DATABASE_URL` environment variable +3. `DATABASE_URL` environment variable +4. Default: `postgresql://teable:teable@127.0.0.1:5432/teable?schema=public` + +## PGlite Mode (Temporary Database) + +DevTools supports **pglite** for file-persisted temporary databases. This is useful for testing table creation and other operations without a real PostgreSQL server. 
+ +### When to Use PGlite + +Use pglite (`pglite://` connection string) when: + +- **Creating temporary tables for testing** - no existing database needed +- **Testing table schema designs** before deploying to production +- **Isolated experiments** that shouldn't affect real data +- **No PostgreSQL server available** (local development without Docker) + +### When NOT to Use PGlite + +Do NOT use pglite when: + +- **User provided a real database URL** (postgresql://) +- **Verifying existing IDs** (tableId, fieldId, recordId, baseId) +- **Querying production/development data** +- **Debugging issues with real tables** + +### PGlite Connection String Format + +``` +pglite:// +``` + +Examples: + +- `pglite://.pglite-data/session-001` (relative path) +- `pglite:///absolute/path/to/data` (absolute path) + +### Using PGlite + +**Step 1: Create a pglite session** + +First, create a table with a unique pglite connection string. The CLI will automatically: + +- Create the data directory +- Initialize the database schema +- Create a space and base +- Return the generated baseId + +```bash +# Create a new pglite session with a table +pnpm --filter @teable/v2-devtools cli tables create \ + --connection "pglite://.pglite-data/session-$(date +%s)" \ + --base-id "bseXXXXXXXXXXXXX" \ + --name "Test Table" +``` + +> **Note**: For the first command, you need to provide any baseId (it will be created). Check the output for the actual baseId to use in subsequent commands. 
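To make session reuse reliable, capture the connection string once in a shell variable and pass it to every later command. A minimal sketch (the variable names are illustrative, not required by the CLI):

```shell
# Generate one unique pglite connection string for this debugging session.
SESSION_ID="session-$(date +%s)"
PGLITE_CONN="pglite://.pglite-data/${SESSION_ID}"
echo "$PGLITE_CONN"

# Reuse "$PGLITE_CONN" (plus the baseId printed by the first command) in every
# subsequent devtools invocation, e.g.:
#   pnpm --filter @teable/v2-devtools cli tables create \
#     --connection "$PGLITE_CONN" --base-id bseXXXXXXXXXXXXX --name "Test Table"
```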
+ +**Step 2: Reuse the same session** + +In the same conversation/session, **remember and reuse** the same connection string and baseId: + +```bash +# Query tables in the same pglite database +pnpm --filter @teable/v2-devtools cli underlying tables \ + --connection "pglite://.pglite-data/session-1234567890" \ + --base-id "bseXXXXXXXXXXXXX" + +# Create more tables in the same base +pnpm --filter @teable/v2-devtools cli tables create \ + --connection "pglite://.pglite-data/session-1234567890" \ + --base-id "bseXXXXXXXXXXXXX" \ + --name "Another Table" +``` + +### Important Notes for AI + +1. **Remember the session**: Store the pglite connection string and baseId for the entire conversation +2. **Data persists in files**: Data is saved to `.pglite-data/` directory (git-ignored) +3. **Isolated sessions**: Each unique path creates a separate database +4. **First-time init**: The first command to a new pglite path will initialize schema + space + base + +### Data Storage + +PGlite data is stored in: + +``` +packages/v2/devtools/.pglite-data/ +├── session-1234567890/ +│ ├── ... (pglite database files) +├── session-0987654321/ +│ └── ... +``` + +This directory is git-ignored and can be safely deleted to clean up test data. + +## Empty Data Handling + +When queries return no data, the CLI provides clear feedback: + +- `code: EMPTY_RESULT` indicates no data was found +- The error message includes hints about what to check + +**If you see EMPTY_RESULT, report to the user** that the requested data was not found in the database. 
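When scripting around the CLI, the marker can be detected mechanically before deciding what to report. A hedged sketch (the `OUTPUT` value is a stand-in, not real CLI output):

```shell
# Stand-in for output captured from a devtools invocation.
OUTPUT="code: EMPTY_RESULT"

# If the marker appears, report the empty result rather than retrying blindly.
if printf '%s\n' "$OUTPUT" | grep -q 'EMPTY_RESULT'; then
  echo "no data found; report this to the user instead of retrying"
fi
```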
diff --git a/.codex/skills/teable-v2-package-guide/SKILL.md b/.codex/skills/teable-v2-package-guide/SKILL.md new file mode 100644 index 0000000000..53f911bf84 --- /dev/null +++ b/.codex/skills/teable-v2-package-guide/SKILL.md @@ -0,0 +1,209 @@ +# Teable V2 Package Creation Guide + +## When to Use This Skill + +Use this skill when you need to: +- Create a new v2 package +- Configure package.json for v2 packages +- Set up TypeScript configuration +- Understand the v2 package conventions + +## Package Location + +All v2 packages are located in the `packages/v2/` directory. + +## Creating a New Package + +### 1. Directory Structure + +``` +packages/v2/<package-name>/ +├── src/ +│ └── index.ts # Main entry point +├── package.json +├── tsconfig.json +├── tsconfig.build.json +└── tsdown.config.ts # Optional, for custom build config +``` + +### 2. Package.json Configuration + +**IMPORTANT: Development-Friendly Exports** + +The key to avoiding rebuilds during development is the `exports` configuration: + +```json +{ + "name": "@teable/v2-<package-name>", + "version": "0.0.0", + "private": true, + "license": "MIT", + "type": "module", + "sideEffects": false, + "main": "dist/index.cjs", + "module": "dist/index.js", + "types": "src/index.ts", + "exports": { + ".": { + "types": "./src/index.ts", + "import": "./src/index.ts", + "module": "./dist/index.js", + "require": "./dist/index.cjs" + } + }, + "files": [ + "dist", + "src" + ], + "scripts": { + "build": "tsdown --tsconfig tsconfig.build.json", + "dev": "tsdown --tsconfig tsconfig.build.json --watch", + "clean": "rimraf ./dist ./coverage ./tsconfig.tsbuildinfo ./tsconfig.build.tsbuildinfo ./.eslintcache", + "lint": "eslint . --ext .ts,.js,.mjs,.cjs,.mts,.cts --cache --cache-location ../../../.cache/eslint/v2-<package-name>.eslintcache", + "typecheck": "tsc --project ./tsconfig.json --noEmit", + "test-unit": "vitest run --silent", + "test-unit-cover": "pnpm test-unit --coverage", + "fix-all-files": "eslint . 
--ext .ts,.js,.mjs,.cjs,.mts,.cts --fix" + }, + "dependencies": { + // Add your dependencies here + }, + "devDependencies": { + "@teable/v2-tsdown-config": "workspace:*", + "@teable/eslint-config-bases": "workspace:^", + "@types/node": "22.18.0", + "@vitest/coverage-v8": "4.0.16", + "eslint": "8.57.0", + "prettier": "3.2.5", + "rimraf": "5.0.5", + "tsdown": "0.18.1", + "typescript": "5.4.3", + "vite-tsconfig-paths": "4.3.2", + "vitest": "4.0.16" + } +} +``` + +### Key Export Configuration Explained + +```json +"exports": { + ".": { + "types": "./src/index.ts", // TypeScript types from source + "import": "./src/index.ts", // ESM import uses source directly + "module": "./dist/index.js", // Bundlers use built output + "require": "./dist/index.cjs" // CommonJS uses built output + } +} +``` + +**Why this works:** +- `types` and `import` point to `./src/index.ts` - this allows other packages to import TypeScript source directly during development +- `module` and `require` point to `./dist/` - this is used by bundlers and production builds +- No rebuild needed when you change source code - other packages see changes immediately + +### 3. TypeScript Configuration + +**tsconfig.json** (for development/typechecking): +```json +{ + "extends": "../../../tsconfig.base.json", + "compilerOptions": { + "outDir": "./dist", + "rootDir": "./src", + "types": ["node"] + }, + "include": ["src/**/*"], + "exclude": ["node_modules", "dist"] +} +``` + +**tsconfig.build.json** (for building): +```json +{ + "extends": "./tsconfig.json", + "compilerOptions": { + "declaration": true, + "declarationMap": true, + "sourceMap": true + }, + "exclude": ["node_modules", "dist", "**/*.spec.ts", "**/*.test.ts"] +} +``` + +### 4. 
Build Configuration (Optional) + +**tsdown.config.ts**: +```typescript +import { defineConfig } from '@teable/v2-tsdown-config'; + +export default defineConfig(); +``` + +## Common Patterns + +### Workspace Dependencies + +Use `workspace:*` for internal dependencies: +```json +"dependencies": { + "@teable/v2-core": "workspace:*", + "@teable/v2-di": "workspace:*" +} +``` + +### Dependency Injection + +V2 packages use `@teable/v2-di` for DI: +```typescript +import { injectable, inject } from '@teable/v2-di'; + +@injectable() +export class MyService { + constructor( + @inject(someToken) private readonly dependency: SomeDependency + ) {} +} +``` + +### Error Handling + +Use `neverthrow` for Result types: +```typescript +import { ok, err, type Result } from 'neverthrow'; +import { domainError, type DomainError } from '@teable/v2-core'; + +function doSomething(data: string): Result<string, DomainError> { + if (data.length === 0) { + return err(domainError.invariant({ message: 'Something went wrong' })); + } + return ok(data); +} +``` + +## Checklist for New Package + +- [ ] Create directory `packages/v2/<package-name>/` +- [ ] Create `src/index.ts` with exports +- [ ] Configure `package.json` with correct exports (pointing to source) +- [ ] Create `tsconfig.json` and `tsconfig.build.json` +- [ ] Add to root `pnpm-workspace.yaml` if needed +- [ ] Run `pnpm install` to link workspace dependencies +- [ ] Verify imports work without building: `pnpm typecheck` + +## Troubleshooting + +### "Cannot find module" errors +1. Check that `exports` in `package.json` points to correct paths +2. Ensure `types` field points to `src/index.ts` +3. Run `pnpm install` to refresh workspace links + +### Changes not reflected in other packages +1. Verify `exports.import` points to `./src/index.ts` (not `./dist/`) +2. Check if the consuming package has cached the old build +3. Restart TypeScript server in your IDE + +### Build errors +1. Ensure `tsconfig.build.json` excludes test files +2. Check that all dependencies are properly declared +3. 
Run `pnpm clean` before rebuilding diff --git a/.codex/skills/teable-v2-table-template/SKILL.md b/.codex/skills/teable-v2-table-template/SKILL.md new file mode 100644 index 0000000000..21c211d29f --- /dev/null +++ b/.codex/skills/teable-v2-table-template/SKILL.md @@ -0,0 +1,65 @@ +--- +name: teable-v2-table-template +description: Create or update Teable v2 table templates in packages/v2/table-templates (template seeds, fields, records, and exports). +--- + +# Teable v2 Table Template Skill + +Use this skill when you need to add or modify table templates in the v2 codebase. Templates live in `packages/v2/table-templates/src/index.ts`. + +## Quick workflow + +1. Open the template source file: `packages/v2/table-templates/src/index.ts`. +2. Add a seed builder: + - Single table: create a `createXSeed(): SingleTableSeed`. + - Multi table: create a `createXTemplateSeed(): TemplateSeed`. +3. Use helpers for IDs and select options: + - `createFieldId()` for field IDs + - `createTableId()` if you must predefine table IDs + - `createSelectOption()` for single/multi-select choices +4. Create the template definition: + - Single table: `singleTable(key, name, description, createXSeed, defaultRecordCount)` + - Multi table: `createTemplate(key, name, description, createXTemplateSeed, defaultRecordCount)` +5. Export the template and add it to `tableTemplates` array. +6. If you need a field-only helper, export `createXFields = () => createXSeed().fields;`. + +## Notes and conventions + +- Keep templates in `packages/v2/table-templates/src/index.ts` (this package is the single source of truth). +- The `createInput` generator in `TableTemplateDefinition` handles optional record seeding and name prefixing. You only need to supply seed fields and records. +- Prefer `singleTable(...)` unless the template truly needs multiple tables (e.g., CRM with Companies + Contacts). +- Use string keys that are stable and URL-safe (e.g., `content-calendar`, `bug-triage`). 
+- When seeding records, keep records small and representative; use `normalizeTemplateRecords` behavior to cap or pad. +- New templates should cover as many field types as possible, as long as the business context makes sense (use `allFieldTypesTemplate` for inspiration). + +## Example pattern + +```ts +const createMyTemplateSeed = (): SingleTableSeed => { + const nameFieldId = createFieldId(); + return { + fields: [{ type: 'singleLineText', id: nameFieldId, name: 'Name' }], + records: [{ fields: { [nameFieldId]: 'Example' } }], + }; +}; + +export const myTemplate = singleTable( + 'my-template', + 'My Template', + 'Short description.', + createMyTemplateSeed, + 1 +); + +export const tableTemplates = [ + // ...existing templates, + myTemplate, +] as const; +``` + +## References + +- Source of truth: `packages/v2/table-templates/src/index.ts` +- Package note: `packages/v2/table-templates/ARCHITECTURE.md` +- E2E contract: ensure `creates tables for every template with seeded records` passes in `packages/v2/e2e`. +- Suggested run: `pnpm -C packages/v2/e2e test -- --runInBand --testNamePattern "creates tables for every template with seeded records"` diff --git a/.codex/skills/teable-v2-test-debug/SKILL.md b/.codex/skills/teable-v2-test-debug/SKILL.md new file mode 100644 index 0000000000..4282d0de6f --- /dev/null +++ b/.codex/skills/teable-v2-test-debug/SKILL.md @@ -0,0 +1,73 @@ +--- +name: teable-v2-test-debug +description: Debug Teable v2 tests and failing test cases by prioritizing data reproduction and inspection. Use when asked to debug a test file/spec (unit, integration, or e2e) in packages/v2/*, especially when failures might be caused by table schema, relations, or stored/computed data drift; workflow uses v2-devtools CLI to create a similar table first, then inspects real DB data/relations, and only then reviews code logic. +--- + +# Teable V2 Test Debug + +## Overview + +Follow a data-first debugging workflow for Teable v2 tests. 
The default order is: reproduce data with devtools, inspect real DB data/relations, then analyze code logic. + +## Workflow: Data-first test debugging + +### 1) Capture failure context + +- Identify the failing test name, file path, and the exact assertion that failed. +- Note the expected vs actual values and any IDs shown in logs (base/table/field/record). +- If the failure is e2e or integration, confirm which base or seed data was used. + +### 2) Reproduce with devtools first (create similar table) + +- Use the v2-devtools CLI to create a minimal table that mirrors the test schema. +- Prefer CLI-based table creation and mock data over hand-written SQL. +- If the schema is complex, build only the fields involved in the failing assertion. + +Common commands: + +```bash +# Get field schema documentation before creating tables +pnpm --filter @teable/v2-devtools cli tables describe-schema + +# Create a table with minimal fields +pnpm --filter @teable/v2-devtools cli tables create --base-id bse... --name "Test Table" --fields '[{"type":"singleLineText","name":"Name","isPrimary":true}]' + +# Generate mock records if data shape matters +pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 10 --seed 12345 +``` + +If the v2-devtools skill exists, open `/Users/nichenqin/projects/teable/.codex/skills/teable-v2-devtools/SKILL.md` for the full command reference and validation rules. + +### 3) Inspect real DB data and relations + +- Compare application layer vs underlying data first; use stored/computed modes. +- Inspect dependencies and relations to confirm lookup/rollup/formula inputs. +- Validate schema constraints if missing indexes or FK columns are suspected. + +Common commands: + +```bash +# App-layer data (stored/computed) vs underlying +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --mode stored --limit 10 +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... 
--mode computed --limit 10 +pnpm --filter @teable/v2-devtools cli underlying records --table-id tbl... --limit 10 + +# Inspect a field and its dependencies +pnpm --filter @teable/v2-devtools cli underlying field --field-id fld... +pnpm --filter @teable/v2-devtools cli relations --field-id fld... --direction up --level 2 + +# Check schema integrity if queries are slow or failing +pnpm --filter @teable/v2-devtools cli schema table --table-id tbl... +``` + +### 4) Only then review code logic + +- Map the observed data mismatch back to the handler, visitor, or mapper. +- Verify spec/visitor logic before touching application wiring. +- If the bug is only reproducible with real DB data, prefer adjusting fixtures or seeding rather than altering logic. + +### 5) Decide next action + +- If app-layer vs underlying differs, focus on computed/stored pipeline and mappers. +- If dependencies are wrong, fix field definitions or relation setup first. +- If reproduction fails on minimal data, debug core logic with a tight fixture. 
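The stored-vs-computed comparison in steps 3 and 5 can be scripted. Below is a minimal, hypothetical shell helper (not a devtools command; the function name is an illustration) that takes the captured stored-mode and computed-mode outputs and prints which layer to debug next:

```bash
# Hypothetical helper: decide where to debug next based on whether the
# app-layer stored and computed outputs (captured from the `records list`
# commands above) agree. Not part of the devtools CLI.
suggest_focus() {
  local stored="$1" computed="$2"
  if [ "$stored" != "$computed" ]; then
    echo "drift: focus on the computed/stored pipeline and mappers"
  else
    echo "layers agree: debug core logic with a tight fixture"
  fi
}

# Example usage with captured TOON output:
#   suggest_focus "$(cat stored.toon)" "$(cat computed.toon)"
suggest_focus "same" "same"  # prints "layers agree: debug core logic with a tight fixture"
```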
diff --git a/.dockerignore b/.dockerignore index 91cf73c486..b8697aec0d 100644 --- a/.dockerignore +++ b/.dockerignore @@ -45,8 +45,10 @@ tmp # other **/db +!packages/v2/adapter-postgres-state/src/db +!packages/v2/adapter-postgres-state/src/db/** **/.assets **/.temporary **.DS_Store docs -**/*.md \ No newline at end of file +**/*.md diff --git a/.github/workflows/integration-tests.yml b/.github/workflows/integration-tests.yml index 0dae4515f2..3930e83080 100644 --- a/.github/workflows/integration-tests.yml +++ b/.github/workflows/integration-tests.yml @@ -16,6 +16,7 @@ concurrency: jobs: test: + if: github.ref != 'refs/heads/refactor/core' && github.head_ref != 'refactor/core' runs-on: ubuntu-latest name: Integration Tests @@ -61,6 +62,7 @@ jobs: finish: needs: test + if: github.ref != 'refs/heads/refactor/core' && github.head_ref != 'refactor/core' runs-on: ubuntu-latest steps: - name: Coveralls Finished diff --git a/.github/workflows/issue-id-check.yml b/.github/workflows/issue-id-check.yml index caea189d38..8a521b1e6b 100644 --- a/.github/workflows/issue-id-check.yml +++ b/.github/workflows/issue-id-check.yml @@ -15,6 +15,7 @@ concurrency: jobs: check-issue-ids: + if: github.ref != 'refs/heads/refactor/core' && github.head_ref != 'refactor/core' runs-on: ubuntu-latest name: Check Issue IDs diff --git a/.github/workflows/linting.yml b/.github/workflows/linting.yml index 1460568160..02548e5393 100644 --- a/.github/workflows/linting.yml +++ b/.github/workflows/linting.yml @@ -13,6 +13,7 @@ concurrency: cancel-in-progress: true jobs: build: + if: github.ref != 'refs/heads/refactor/core' && github.head_ref != 'refactor/core' runs-on: ubuntu-latest name: Linting and Types diff --git a/.github/workflows/unit-tests.yml b/.github/workflows/unit-tests.yml index eb6e82e2d0..e5c1f2b189 100644 --- a/.github/workflows/unit-tests.yml +++ b/.github/workflows/unit-tests.yml @@ -17,6 +17,7 @@ concurrency: cancel-in-progress: true jobs: test: + if: github.ref != 
'refs/heads/refactor/core' && github.head_ref != 'refactor/core' runs-on: ubuntu-latest name: Unit Tests diff --git a/.github/workflows/v2-benchmark-tests.yml b/.github/workflows/v2-benchmark-tests.yml new file mode 100644 index 0000000000..be0095744e --- /dev/null +++ b/.github/workflows/v2-benchmark-tests.yml @@ -0,0 +1,74 @@ +name: V2 Benchmarks + +on: + workflow_dispatch: + pull_request: + branches: + - develop + paths: + - 'packages/v2/**' + - '.github/workflows/v2-benchmark-tests.yml' + +concurrency: + group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }} + cancel-in-progress: true + +jobs: + bench: + if: github.ref == 'refs/heads/refactor/core' || github.head_ref == 'refactor/core' + runs-on: ubuntu-latest + name: V2 Benchmarks + env: + CI: 1 + TESTCONTAINERS_REUSE_ENABLE: 'false' + + strategy: + matrix: + node-version: [22.18.0] + + steps: + - uses: actions/checkout@v4 + + - name: Use Node.js ${{ matrix.node-version }} + uses: actions/setup-node@v4 + with: + node-version: ${{ matrix.node-version }} + + - name: 📥 Monorepo install + uses: ./.github/actions/pnpm-install + + - name: 🧪 Run v2 benchmarks + run: | + pnpm -C packages/v2/benchmark-node bench + + bench-bun: + if: github.ref == 'refs/heads/refactor/core' || github.head_ref == 'refactor/core' + runs-on: ubuntu-latest + name: V2 Benchmarks (Bun) + env: + CI: 1 + TESTCONTAINERS_REUSE_ENABLE: 'false' + + strategy: + matrix: + node-version: [22.18.0] + + steps: + - uses: actions/checkout@v4 + + - name: Use Node.js ${{ matrix.node-version }} + uses: actions/setup-node@v4 + with: + node-version: ${{ matrix.node-version }} + + - name: Use Bun + uses: oven-sh/setup-bun@v1 + with: + bun-version: 'latest' + + - name: 📥 Monorepo install + uses: ./.github/actions/pnpm-install + + - name: 🧪 Run v2 bun benchmarks + run: | + pnpm -C packages/v2/benchmark-bun bench diff --git a/.github/workflows/v2-core-tests.yml b/.github/workflows/v2-core-tests.yml new file mode 100644 index 
0000000000..16cc1d4ad0 --- /dev/null +++ b/.github/workflows/v2-core-tests.yml @@ -0,0 +1,88 @@ +name: V2 Tests + +on: + pull_request: + branches: + - develop + +concurrency: + group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }} + cancel-in-progress: true + +jobs: + # Unit tests - run each package in parallel + unit-tests: + if: github.ref == 'refs/heads/refactor/core' || github.head_ref == 'refactor/core' + runs-on: ubuntu-latest + name: V2 Unit Tests (${{ matrix.package }}) + env: + CI: 1 + TESTCONTAINERS_REUSE_ENABLE: 'false' + + strategy: + fail-fast: false + max-parallel: 6 + matrix: + package: + - '@teable/v2-adapter-db-postgres-pg' + - '@teable/v2-adapter-repository-postgres' + - '@teable/v2-adapter-table-repository-postgres' + - '@teable/v2-core' + - '@teable/v2-formula-sql-pg' + - '@teable/v2-test-node' + + steps: + - uses: actions/checkout@v4 + + - name: Use Node.js 22.18.0 + uses: actions/setup-node@v4 + with: + node-version: 22.18.0 + + - name: 📥 Monorepo install + uses: ./.github/actions/pnpm-install + with: + filter: ${{ matrix.package }} + + - name: 🧪 Run unit tests (${{ matrix.package }}) + run: | + pnpm -F "${{ matrix.package }}" --if-present test-unit-cover + + # E2E tests - use sharding for parallel execution (the slowest tests) + e2e-tests: + if: github.ref == 'refs/heads/refactor/core' || github.head_ref == 'refactor/core' + runs-on: ubuntu-latest + name: V2 E2E Tests (Shard ${{ matrix.shard }}/4) + env: + CI: 1 + TESTCONTAINERS_REUSE_ENABLE: 'false' + + strategy: + fail-fast: false + matrix: + shard: [1, 2, 3, 4] + + steps: + - uses: actions/checkout@v4 + + - name: Use Node.js 22.18.0 + uses: actions/setup-node@v4 + with: + node-version: 22.18.0 + + - name: 📥 Monorepo install + uses: ./.github/actions/pnpm-install + with: + filter: '@teable/v2-e2e' + + - name: 🧪 Run E2E tests (shard ${{ matrix.shard }}/4) + run: | + pnpm -C packages/v2/e2e test-e2e -- --shard=${{ matrix.shard }}/4 --reporter=json 
--reporter=default --outputFile=e2e-report-${{ matrix.shard }}.json + + - name: 📊 Upload test report + if: always() + uses: actions/upload-artifact@v4 + with: + name: e2e-report-shard-${{ matrix.shard }} + path: packages/v2/e2e/e2e-report-${{ matrix.shard }}.json + retention-days: 7 diff --git a/.gitignore b/.gitignore index cdea770aa2..254d2e09c5 100644 --- a/.gitignore +++ b/.gitignore @@ -29,15 +29,20 @@ node_modules /build /dist/ +# v2 packages build output +packages/v2/**/dist/ + # Cache *.tsbuildinfo **/.eslintcache .cache/* .swc/ +apps/playground/src/routeTree.gen.ts # Misc .DS_Store *.pem +.worktrees/ # Debug npm-debug.log* @@ -70,4 +75,4 @@ pnpm-debug.log* # LocalStorage assets -**/.assets \ No newline at end of file +**/.assets diff --git a/.opencode/command/explain-table-crud.md b/.opencode/command/explain-table-crud.md new file mode 100644 index 0000000000..c3384bc843 --- /dev/null +++ b/.opencode/command/explain-table-crud.md @@ -0,0 +1,59 @@ +--- +description: Explain table CRUD and analyze computed + SQL +--- + +You are debugging a table by running Teable v2 DevTools explain commands. + +Use the provided id: `$1`. +Use the provided database URL: `$2`. + +If a database URL is provided (for example as `$2`), use it for DevTools commands; otherwise assume the default local URL (localhost/.env). +If the table cannot be found, first question whether the database URL or connection is wrong. + +0. Determine whether `$1` is a base id or a table id: + + - RUN `pnpm --filter @teable/v2-devtools cli bases get --base-id $1` + - If the base exists, treat `$1` as a base id. + - If the base does not exist, treat `$1` as a table id. + + If `$1` is a base id, collect all table ids and execute the remaining steps for each table: + + - RUN `pnpm --filter @teable/v2-devtools cli tables list --base-id $1` + - Create a per-table plan (one section per table id) and run steps 1-4 for every table. + +1. 
Pre-analyze table structure and references before explain: + + - RUN `pnpm --filter @teable/v2-devtools cli underlying fields --table-id <id>` + - Identify link fields (`type: link`), lookup/rollup/formula dependencies, and any fields with `reference`/relation hints. + - For link fields, check `options.relationship` in the underlying field output to decide shape: + - `oneMany`/`manyMany` => array of `{ id }` + - `oneOne`/`manyOne` => single `{ id }` + - RUN `pnpm --filter @teable/v2-devtools cli relations --field-id <id> --direction both --level 2` for each link field. + - If a normal field update impacts other tables via relations, list those dependent fields explicitly. + +2. Fetch a sample record and candidate fields: + + - RUN `pnpm --filter @teable/v2-devtools cli records list --table-id <id> --limit 1 --mode stored` + - If no records exist, create one, then re-run list: + - RUN `pnpm --filter @teable/v2-devtools cli mock generate --table-id <id> --count 1` + - RUN `pnpm --filter @teable/v2-devtools cli records get --table-id <id> --record-id <id>` + - Use the `records get` output to see the exact JSON shape for link fields; mirror that shape in the update payload. + - Example (replace with actual link field name and record IDs from `records get`): + - If link field is array-like (one-many, many-many): `"LinkFieldName": [{"id": "recForeign1"}, {"id": "recForeign2"}]` + - If link field is single object (one-one, many-one): `"LinkFieldName": {"id": "recForeign1"}` + - To remove all links, set the link field to `[]` or `null` depending on shape. + - Pick writable fields for update that include: + - At least one link field update (based on link field schema + sample record value shape) + - Any non-link fields that propagate to other tables + +3. Explain CRUD commands with analyze enabled: + + - RUN `pnpm --filter @teable/v2-devtools cli explain create --table-id <id> --analyze` + - RUN `pnpm --filter @teable/v2-devtools cli explain update --table-id <id> --record-id <id> --fields '<json>' --analyze` + - RUN `pnpm --filter @teable/v2-devtools cli explain delete --table-id <id> --record-ids <ids> --analyze` + +4. Prioritized analysis: + - First check for any errors in explain outputs (including SQL explain errors). + - If errors exist: list them clearly, then propose a concrete fix plan. + - If no errors: inspect `computedImpact` (dependency graph + update steps) and cross-check with `relations` to see if any computed dependencies are missing. If `relations` is empty but `dependencyGraph.edges` shows links, call it out. + - Finally analyze SQL performance using `sqlExplains` and `complexity` (highlight slow steps, missing indexes, or high-cost patterns). diff --git a/.opencode/command/git-commit.md b/.opencode/command/git-commit.md new file mode 100644 index 0000000000..6407d3f49a --- /dev/null +++ b/.opencode/command/git-commit.md @@ -0,0 +1,18 @@ +--- +description: Draft git commit command +--- + +# Draft git commit command + +RUN git status --short +RUN git diff +RUN git diff --staged +RUN git log -5 --oneline + +Analyze the changes and propose a concise commit title and body (1–2 sentences) that match repo style. If there are no changes, say so. + +Add relevant untracked files and stage changes, then create the commit with the generated title and body using: +`git add -A` +`git commit -m "<title>" -m "<body>"` + +Do NOT push. diff --git a/.opencode/skills/teable-v2-devtools/SKILL.md b/.opencode/skills/teable-v2-devtools/SKILL.md new file mode 100644 index 0000000000..101aeff4ee --- /dev/null +++ b/.opencode/skills/teable-v2-devtools/SKILL.md @@ -0,0 +1,609 @@ +name: teable-v2-devtools +description: Teable v2 developer tools CLI for debugging, inspecting, and generating test data.
Combines debug-data and mock-records capabilities into a unified CLI using Effect CLI framework. + +--- + +# Teable V2 DevTools CLI + +## When to Use This Skill + +Use this skill when you need to: + +- View table/field configuration details +- Diagnose formula/lookup/rollup issues +- Understand field dependency relationships +- Analyze computed field update plans (explain commands) +- Generate mock/test data for tables +- **Query records data** (via application layer or direct database access) +- **Create, update, delete records** (via application layer commands) +- **Check database schema** (indexes, constraints, columns) for missing or broken indexes +- **Create tables** (via CLI, without records) + +> **Important**: When you need to inspect database data, **prefer DevTools CLI over psql**. DevTools outputs structured TOON format, which is easier for AI analysis and supports comparing application-layer and database-layer results. + +## Development Notes (CRITICAL) + +### Rebuild After Modifying Dependencies + +DevTools CLI depends on multiple v2 packages, and it uses **compiled dist outputs** rather than TypeScript sources. This is because it relies on parameter decorators (`@inject()`), which tsx/esbuild cannot run directly. 
+ +**If you modify any of the following packages, you must rebuild them before running the CLI:** + +```bash +# If you modify adapter-table-repository-postgres +pnpm --filter @teable/v2-adapter-table-repository-postgres build + +# If you modify command-explain +pnpm --filter @teable/v2-command-explain build + +# If you modify debug-data +pnpm --filter @teable/v2-debug-data build + +# If you modify core +pnpm --filter @teable/v2-core build +``` + +**Recommended: use watch mode for auto-rebuilds** + +```bash +# Start watch mode in a separate terminal +pnpm --filter @teable/v2-adapter-table-repository-postgres dev +``` + +**Common pitfalls:** + +- CLI output doesn’t change after code edits → forgot to rebuild +- console.log/console.error never prints → forgot to rebuild +- newly added types/functions missing → forgot to rebuild + +## Quick Commands + +All commands output TOON format for AI consumption. + +### Debug Commands + +```bash +# View underlying table metadata +pnpm --filter @teable/v2-devtools cli underlying table --table-id tbl... + +# List all tables in a base +pnpm --filter @teable/v2-devtools cli underlying tables --base-id bse... + +# View field configuration (diagnose formula issues) +pnpm --filter @teable/v2-devtools cli underlying field --field-id fld... + +# List all fields in a table +pnpm --filter @teable/v2-devtools cli underlying fields --table-id tbl... + +# View field dependencies (diagnose computed field propagation) +pnpm --filter @teable/v2-devtools cli relations --field-id fld... --direction up --level 2 + +# Explain CreateRecord (analyze computed update plan) +pnpm --filter @teable/v2-devtools cli explain create --table-id tbl... + +# Explain UpdateRecord +pnpm --filter @teable/v2-devtools cli explain update --table-id tbl... --record-id rec... --fields '{"Name":"test"}' + +# Explain DeleteRecords +pnpm --filter @teable/v2-devtools cli explain delete --table-id tbl... 
--record-ids rec1,rec2 +``` + +### Schema Check Commands + +Use these commands to verify database schema integrity, especially when you suspect missing indexes might be causing slow queries. + +```bash +# Check all fields in a table for missing indexes, constraints, columns +pnpm --filter @teable/v2-devtools cli schema table --table-id tbl... + +# Check a specific field for missing schema elements +pnpm --filter @teable/v2-devtools cli schema field --table-id tbl... --field-id fld... +``` + +### Records Query Commands + +```bash +# List records via application layer (stored mode - pre-computed values) +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --limit 100 --offset 0 + +# List records via application layer (computed mode - calculated on-the-fly) +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --mode computed + +# Get single record via application layer +pnpm --filter @teable/v2-devtools cli records get --table-id tbl... --record-id rec... + +# List records directly from underlying PostgreSQL table (raw data) +pnpm --filter @teable/v2-devtools cli underlying records --table-id tbl... --limit 100 + +# Get single record directly from underlying PostgreSQL table +pnpm --filter @teable/v2-devtools cli underlying record --table-id tbl... --record-id rec... +``` + +### Records Mutation Commands + +```bash +# Create a new record +pnpm --filter @teable/v2-devtools cli records create --table-id tbl... --fields '{"Name":"New Record"}' + +# Create a record with typecast (auto-convert values) +pnpm --filter @teable/v2-devtools cli records create --table-id tbl... --fields '{"Name":"Test"}' --typecast + +# Update an existing record +pnpm --filter @teable/v2-devtools cli records update --table-id tbl... --record-id rec... --fields '{"Name":"Updated Name"}' + +# Update with typecast +pnpm --filter @teable/v2-devtools cli records update --table-id tbl... --record-id rec... 
--fields '{"Status":"Done"}' --typecast + +# Delete records (comma-separated IDs) +pnpm --filter @teable/v2-devtools cli records delete --table-id tbl... --record-ids rec1,rec2,rec3 +``` + +### Mock Data Commands + +```bash +# Generate 100 mock records +pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 100 + +# Generate with reproducible seed +pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 50 --seed 12345 + +# Dry run (preview without inserting) +pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 10 --dry-run +``` + +### Table Management Commands + +```bash +# Create a simple table with default fields (just a primary Name field) +pnpm --filter @teable/v2-devtools cli tables create --base-id bse... --name "My Table" + +# Create a table with custom fields +pnpm --filter @teable/v2-devtools cli tables create --base-id bse... --name "Tasks" --fields '[{"type":"singleLineText","name":"Title","isPrimary":true},{"type":"singleSelect","name":"Status","options":{"choices":[{"name":"Todo"},{"name":"Done"}]}},{"type":"date","name":"Due Date"}]' +``` + +## Command Reference + +### underlying Commands + +| Command | Description | +| ---------------------------------------------------- | -------------------------------------------------------------------- | +| `underlying table --table-id <id>` | Get raw table metadata | +| `underlying tables --base-id <id>` | List all tables in a base | +| `underlying field --field-id <id>` | Get field metadata (includes parsed options/meta JSON) | +| `underlying fields --table-id <id>` | List all fields in a table | +| `underlying records --table-id <id>` | List records directly from PostgreSQL (raw data with system columns) | +| `underlying record --table-id <id> --record-id <id>` | Get single record directly from PostgreSQL | + +### records Commands (Application Layer) + +| Command | Description | +| 
----------------------------------------------------------------- | ----------------------------------------- | +| `records list --table-id <id>` | List records via query repository | +| `records get --table-id <id> --record-id <id>` | Get single record via query repository | +| `records create --table-id <id> --fields <json>` | Create a new record via command bus | +| `records update --table-id <id> --record-id <id> --fields <json>` | Update an existing record via command bus | +| `records delete --table-id <id> --record-ids <ids>` | Delete records via command bus | + +**Records Query Options:** +| Option | Description | +|--------|-------------| +| `--table-id <id>` | Required: Table ID | +| `--record-id <id>` | Required for get: Record ID | +| `--limit <n>` | Max records to return (default: 100) | +| `--offset <n>` | Records to skip (default: 0) | +| `--mode stored\|computed` | Query mode (default: stored) | + +**Mode Explanation:** + +- `stored`: Read pre-computed values from the database (fast, uses cached values) +- `computed`: Calculate field values on-the-fly (slower, always fresh) + +**Records Mutation Options:** +| Option | Description | +|--------|-------------| +| `--table-id <id>` | Required: Table ID | +| `--record-id <id>` | Required for update: Record ID | +| `--record-ids <ids>` | Required for delete: Comma-separated record IDs | +| `--fields <json>` | JSON object of field values (required for update, optional for create) | +| `--typecast` | Enable typecast mode to auto-convert values (default: false) | + +**Typecast Mode:** + +When `--typecast` is enabled, the system will attempt to convert input values to the correct field types: + +- String "123" → Number 123 +- Link field titles → Link field record IDs +- Date strings → Date objects + +### relations Command + +| Option | Description | +| ---------------------------- | ------------------------------------------------------------------- | +| `--field-id <id>` | Required: Starting field ID | +| 
`--direction up\|down\|both` | `up` = who depends on me, `down` = what I depend on (default: both) | +| `--level <n>` | Max traversal depth (default: unlimited) | +| `--same-table` | Only traverse same-table relations | + +### schema Commands + +Use these commands when analyzing slow queries or suspecting missing indexes. + +| Command | Description | +| ---------------------------------------------- | --------------------------------------------- | +| `schema table --table-id <id>` | Check all fields in a table for schema issues | +| `schema field --table-id <id> --field-id <id>` | Check a specific field for schema issues | + +**Schema Check Output:** + +The output includes a summary with: + +- `total`: Total number of schema rules checked +- `success`: Rules that passed validation +- `errors`: Critical issues (missing indexes, columns, constraints) +- `warnings`: Non-critical issues + +Each result item includes: + +- `fieldId`, `fieldName`: The field being checked +- `ruleId`: Type of rule (e.g., `index`, `unique_index`, `fk_column`, `fk`) +- `ruleDescription`: Human-readable description +- `status`: `success`, `error`, or `warn` +- `message`: Details about the issue +- `details.missing`: List of missing schema objects (index names, column names, etc.) + +**Rule Types Checked:** +| Rule Type | Description | +|-----------|-------------| +| `column` | Physical column exists | +| `fk_column` | Foreign key column exists | +| `index` | Non-unique index exists (for FK lookups) | +| `unique_index` | Unique index exists (for one-to-one relations) | +| `fk` | Foreign key constraint exists | +| `junction_table` | Junction table exists (many-to-many) | +| `junction_index` | Junction table indexes exist | +| `junction_fk` | Junction table foreign keys exist | +| `generated_column` | Generated column (auto-number, created_time, etc.) 
| + +### explain Commands + +| Command | Description | +| ----------------------------------------------------------------- | ----------------------------- | +| `explain create --table-id <id>` | Explain CreateRecord command | +| `explain update --table-id <id> --record-id <id> --fields <json>` | Explain UpdateRecord command | +| `explain delete --table-id <id> --record-ids <ids>` | Explain DeleteRecords command | + +**Explain Options:** +| Option | Description | +|--------|-------------| +| `--table-id <id>` | Required: Table ID | +| `--record-id <id>` | Required for update: Record ID | +| `--record-ids <ids>` | Required for delete: Comma-separated record IDs | +| `--fields <json>` | JSON object of field values (required for update, optional for create) | +| `--analyze` | Run EXPLAIN ANALYZE for actual execution stats (default: false) | + +### mock Commands + +| Option | Description | +| ------------------ | --------------------------------------------------------- | +| `--table-id <id>` | Required: Table ID to generate records for | +| `--count <n>` | Required: Number of records to generate | +| `--seed <n>` | Optional: Seed for reproducible random data | +| `--batch-size <n>` | Optional: Batch size for insertion (default: 100) | +| `--dry-run` | Optional: Only show what would be generated, don't insert | + +**Supported Field Types for Mock Data:** + +| Field Type | Generated Data | +| -------------- | ------------------------------------------ | +| SingleLineText | Names/emails/URLs/phones (based on showAs) | +| LongText | Lorem ipsum paragraphs | +| Number | Random floats 0-1000 | +| Rating | Random integers 1 to max rating | +| SingleSelect | Random selection from options | +| MultipleSelect | 1-3 random options | +| Checkbox | Random boolean | +| Date | Recent date within 365 days | +| User | Mock user object `{id, title, email}` | +| Attachment | Mock attachment objects | +| Link | Random IDs from linked table | + +### tables Commands + +| Command | 
Description | +| -------------------------------------------- | ------------------------------------------------------ | +| `tables create --base-id <id> --name <name>` | Create a new table (without records) | +| `tables describe-schema` | **Output field schema documentation for AI reference** | + +> **Important**: Before creating tables, **you must run `tables describe-schema`** to get the full field schema documentation and avoid validation errors. + +**tables create Options:** +| Option | Description | +|--------|-------------| +| `--base-id <id>` | Required: Base ID where table will be created | +| `--name <name>` | Required: Table name | +| `--fields <json>` | Optional: JSON array of field definitions | + +**Critical validation rules (must follow):** + +1. **SingleSelect/MultipleSelect choices must include a `color` property** - e.g. `{"name": "Todo", "color": "blueLight1"}` +2. **Link fields must include `foreignTableId` and `lookupFieldId`** - query the target table to get these IDs first +3. 
**Each table can only have one field with `isPrimary: true`** + +**Field Definition Format:** + +```json +[ + { "type": "singleLineText", "name": "Title", "isPrimary": true }, + { "type": "number", "name": "Amount" }, + { "type": "date", "name": "Due Date" }, + { + "type": "singleSelect", + "name": "Status", + "options": { + "choices": [ + { "name": "Todo", "color": "grayLight1" }, + { "name": "Done", "color": "greenLight1" } + ] + } + }, + { "type": "checkbox", "name": "Completed" } +] +``` + +**Link Field Example:** + +```json +{ + "type": "link", + "name": "Company", + "options": { + "relationship": "manyOne", + "foreignTableId": "tblXXXXXXXX", + "lookupFieldId": "fldYYYYYYYY" + } +} +``` + +- `relationship`: `oneOne` (1:1), `oneMany` (1:N), `manyOne` (N:1), `manyMany` (N:N) +- `lookupFieldId`: Primary field ID of the foreign table (usually the first field) + +**Supported Field Types:** + +- `singleLineText`, `longText`, `number`, `date`, `checkbox` +- `singleSelect`, `multipleSelect` (requires `options.choices` with color) +- `rating`, `attachment`, `user` +- `link` (requires `options.foreignTableId`, `options.lookupFieldId`, `options.relationship`) +- `formula`, `rollup`, `lookup` (computed fields) +- `autoNumber`, `createdTime`, `lastModifiedTime`, `createdBy`, `lastModifiedBy` + +**Valid Colors for Select Choices:** +`blueLight2`, `blueLight1`, `blueBright`, `blue`, `blueDark1`, +`cyanLight2`, `cyanLight1`, `cyanBright`, `cyan`, `cyanDark1`, +`grayLight2`, `grayLight1`, `grayBright`, `gray`, `grayDark1`, +`greenLight2`, `greenLight1`, `greenBright`, `green`, `greenDark1`, +`orangeLight2`, `orangeLight1`, `orangeBright`, `orange`, `orangeDark1`, +`pinkLight2`, `pinkLight1`, `pinkBright`, `pink`, `pinkDark1`, +`purpleLight2`, `purpleLight1`, `purpleBright`, `purple`, `purpleDark1`, +`redLight2`, `redLight1`, `redBright`, `red`, `redDark1`, +`tealLight2`, `tealLight1`, `tealBright`, `teal`, `tealDark1`, +`yellowLight2`, `yellowLight1`, `yellowBright`, `yellow`, 
`yellowDark1` + +## Common Diagnostic Scenarios + +### Scenario 1: Formula Field Calculation Error + +1. View field config: `underlying field --field-id fld...` +2. Check dependencies: `relations --field-id fld... --direction down` +3. Verify dependent fields are correct + +### Scenario 2: Lookup/Rollup Data Inconsistency + +1. View lookup field config: `underlying field --field-id fld...` +2. Check `lookupOptions`: linkFieldId, foreignTableId, lookupFieldId +3. Verify the linked link field is correct + +### Scenario 3: Field Update Not Propagating + +1. Find downstream dependents: `relations --field-id fld... --direction up --level 3` +2. Check if any dependent field has errors: look for `hasError: true` +3. View specific field config: `underlying field --field-id <dependent-field-id>` + +### Scenario 4: Analyze Computed Update Performance + +1. Explain the command: `explain create --table-id tbl...` +2. Check `computedImpact.updateSteps` for the update plan +3. Cross-check dependencies using `relations` on key fields (formula/link/lookup/rollup) to confirm `reference`-derived edges are present; do not rely solely on explain output. +4. Look at `complexity.score` and `recommendations` +5. Use `--analyze` flag for actual execution timing + +### Scenario 5: Data Inconsistency Between Application and Database + +When data shown in the UI doesn't match what you expect, compare application layer and database layer: + +1. **Query via application layer (stored mode)**: + + ```bash + pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --limit 10 --mode stored + ``` + +2. **Query via application layer (computed mode)**: + + ```bash + pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --limit 10 --mode computed + ``` + +3. **Query directly from database**: + ```bash + pnpm --filter @teable/v2-devtools cli underlying records --table-id tbl... 
--limit 10 + ``` + +**Compare the results:** + +- If `stored` ≠ `computed`: The stored cache is stale, computed values haven't been persisted +- If `stored` ≠ `underlying`: Application layer transformation issue +- If `computed` ≠ `underlying`: Field calculation logic issue + +### Scenario 6: Creating and Managing Test Records + +When you need to quickly create, update, or delete test records for debugging: + +1. **Create a test record**: + + ```bash + pnpm --filter @teable/v2-devtools cli records create --table-id tbl... --fields '{"Name":"Test Record","Status":"Todo"}' + ``` + +2. **Update the record** (use the recordId from step 1): + + ```bash + pnpm --filter @teable/v2-devtools cli records update --table-id tbl... --record-id rec... --fields '{"Status":"Done"}' + ``` + +3. **Delete test records when done**: + ```bash + pnpm --filter @teable/v2-devtools cli records delete --table-id tbl... --record-ids rec1,rec2 + ``` + +**Tip:** Use `--typecast` when you want to input human-readable values (like link field titles instead of record IDs). + +### Scenario 7: Slow Query Performance (Missing Indexes) + +When queries are slow, especially for Link fields or tables with many records: + +1. **Check schema for the entire table**: + + ```bash + pnpm --filter @teable/v2-devtools cli schema table --table-id tbl... + ``` + +2. **Look for errors in the output**, especially: + + - `index:*` rules with `status: error` - missing index on foreign key column + - `unique_index:*` rules - missing unique index for one-to-one relations + - `junction_index:*` rules - missing indexes on junction tables (many-to-many) + +3. **Check a specific Link field**: + + ```bash + pnpm --filter @teable/v2-devtools cli schema field --table-id tbl... --field-id fldLinkField + ``` + +4. 
**Common missing index patterns**: + - Link field (one-to-many): Should have `index` on `fld_{fieldId}__id` column + - Link field (one-to-one): Should have `unique_index` on `fld_{fieldId}__id` column + - Link field (many-to-many): Junction table should have indexes on both FK columns + +## Global Options + +- `-c, --connection <dsn>` - Override DATABASE_URL/PRISMA_DATABASE_URL +- `--help` - Show help message + +## Connection + +Connection is resolved in the following order: + +1. `-c, --connection <dsn>` command line option +2. `PRISMA_DATABASE_URL` environment variable +3. `DATABASE_URL` environment variable +4. Default: `postgresql://teable:teable@127.0.0.1:5432/teable?schema=public` + +## PGlite Mode (Temporary Database) + +DevTools supports **pglite** for file-persisted temporary databases. This is useful for testing table creation and other operations without a real PostgreSQL server. + +### When to Use PGlite + +Use pglite (`pglite://` connection string) when: + +- **Creating temporary tables for testing** - no existing database needed +- **Testing table schema designs** before deploying to production +- **Isolated experiments** that shouldn't affect real data +- **No PostgreSQL server available** (local development without Docker) + +### When NOT to Use PGlite + +Do NOT use pglite when: + +- **User provided a real database URL** (postgresql://) +- **Verifying existing IDs** (tableId, fieldId, recordId, baseId) +- **Querying production/development data** +- **Debugging issues with real tables** + +### PGlite Connection String Format + +``` +pglite://<data-directory-path> +``` + +Examples: + +- `pglite://.pglite-data/session-001` (relative path) +- `pglite:///absolute/path/to/data` (absolute path) + +### Using PGlite + +**Step 1: Create a pglite session** + +First, create a table with a unique pglite connection string. 
The CLI will automatically: + +- Create the data directory +- Initialize the database schema +- Create a space and base +- Return the generated baseId + +```bash +# Create a new pglite session with a table +pnpm --filter @teable/v2-devtools cli tables create \ + --connection "pglite://.pglite-data/session-$(date +%s)" \ + --base-id "bseXXXXXXXXXXXXX" \ + --name "Test Table" +``` + +> **Note**: For the first command, you need to provide any baseId (it will be created). Check the output for the actual baseId to use in subsequent commands. + +**Step 2: Reuse the same session** + +In the same conversation/session, **remember and reuse** the same connection string and baseId: + +```bash +# Query tables in the same pglite database +pnpm --filter @teable/v2-devtools cli underlying tables \ + --connection "pglite://.pglite-data/session-1234567890" \ + --base-id "bseXXXXXXXXXXXXX" + +# Create more tables in the same base +pnpm --filter @teable/v2-devtools cli tables create \ + --connection "pglite://.pglite-data/session-1234567890" \ + --base-id "bseXXXXXXXXXXXXX" \ + --name "Another Table" +``` + +### Important Notes for AI + +1. **Remember the session**: Store the pglite connection string and baseId for the entire conversation +2. **Data persists in files**: Data is saved to `.pglite-data/` directory (git-ignored) +3. **Isolated sessions**: Each unique path creates a separate database +4. **First-time init**: The first command to a new pglite path will initialize schema + space + base + +### Data Storage + +PGlite data is stored in: + +``` +packages/v2/devtools/.pglite-data/ +├── session-1234567890/ +│ ├── ... (pglite database files) +├── session-0987654321/ +│ └── ... +``` + +This directory is git-ignored and can be safely deleted to clean up test data. 
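Since the directory is git-ignored and safe to delete, cleaning up throwaway sessions is a plain `rm` (a sketch; the session directory name below is a placeholder):

```bash
# Remove a single pglite session (directory name is a placeholder)
rm -rf packages/v2/devtools/.pglite-data/session-1234567890

# Or wipe all pglite test data at once
rm -rf packages/v2/devtools/.pglite-data
```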
+ +## Empty Data Handling + +When queries return no data, the CLI provides clear feedback: + +- `code: EMPTY_RESULT` indicates no data was found +- The error message includes hints about what to check + +**If you see EMPTY_RESULT, report to the user** that the requested data was not found in the database. diff --git a/.opencode/skills/teable-v2-package-guide/SKILL.md b/.opencode/skills/teable-v2-package-guide/SKILL.md new file mode 100644 index 0000000000..53f911bf84 --- /dev/null +++ b/.opencode/skills/teable-v2-package-guide/SKILL.md @@ -0,0 +1,209 @@ +# Teable V2 Package Creation Guide + +## When to Use This Skill + +Use this skill when you need to: +- Create a new v2 package +- Configure package.json for v2 packages +- Set up TypeScript configuration +- Understand the v2 package conventions + +## Package Location + +All v2 packages are located in `packages/v2/` directory. + +## Creating a New Package + +### 1. Directory Structure + +``` +packages/v2/<package-name>/ +├── src/ +│ └── index.ts # Main entry point +├── package.json +├── tsconfig.json +├── tsconfig.build.json +└── tsdown.config.ts # Optional, for custom build config +``` + +### 2. 
Package.json Configuration + +**IMPORTANT: Development-Friendly Exports** + +The key to avoiding rebuilds during development is the `exports` configuration: + +```json +{ + "name": "@teable/v2-<package-name>", + "version": "0.0.0", + "private": true, + "license": "MIT", + "type": "module", + "sideEffects": false, + "main": "dist/index.cjs", + "module": "dist/index.js", + "types": "src/index.ts", + "exports": { + ".": { + "types": "./src/index.ts", + "import": "./src/index.ts", + "module": "./dist/index.js", + "require": "./dist/index.cjs" + } + }, + "files": [ + "dist", + "src" + ], + "scripts": { + "build": "tsdown --tsconfig tsconfig.build.json", + "dev": "tsdown --tsconfig tsconfig.build.json --watch", + "clean": "rimraf ./dist ./coverage ./tsconfig.tsbuildinfo ./tsconfig.build.tsbuildinfo ./.eslintcache", + "lint": "eslint . --ext .ts,.js,.mjs,.cjs,.mts,.cts --cache --cache-location ../../../.cache/eslint/v2-<package-name>.eslintcache", + "typecheck": "tsc --project ./tsconfig.json --noEmit", + "test-unit": "vitest run --silent", + "test-unit-cover": "pnpm test-unit --coverage", + "fix-all-files": "eslint . 
--ext .ts,.js,.mjs,.cjs,.mts,.cts --fix" + }, + "dependencies": { + // Add your dependencies here + }, + "devDependencies": { + "@teable/v2-tsdown-config": "workspace:*", + "@teable/eslint-config-bases": "workspace:^", + "@types/node": "22.18.0", + "@vitest/coverage-v8": "4.0.16", + "eslint": "8.57.0", + "prettier": "3.2.5", + "rimraf": "5.0.5", + "tsdown": "0.18.1", + "typescript": "5.4.3", + "vite-tsconfig-paths": "4.3.2", + "vitest": "4.0.16" + } +} +``` + +### Key Export Configuration Explained + +```json +"exports": { + ".": { + "types": "./src/index.ts", // TypeScript types from source + "import": "./src/index.ts", // ESM import uses source directly + "module": "./dist/index.js", // Bundlers use built output + "require": "./dist/index.cjs" // CommonJS uses built output + } +} +``` + +**Why this works:** +- `types` and `import` point to `./src/index.ts` - this allows other packages to import TypeScript source directly during development +- `module` and `require` point to `./dist/` - this is used by bundlers and production builds +- No rebuild needed when you change source code - other packages see changes immediately + +### 3. TypeScript Configuration + +**tsconfig.json** (for development/typechecking): +```json +{ + "extends": "../../../tsconfig.base.json", + "compilerOptions": { + "outDir": "./dist", + "rootDir": "./src", + "types": ["node"] + }, + "include": ["src/**/*"], + "exclude": ["node_modules", "dist"] +} +``` + +**tsconfig.build.json** (for building): +```json +{ + "extends": "./tsconfig.json", + "compilerOptions": { + "declaration": true, + "declarationMap": true, + "sourceMap": true + }, + "exclude": ["node_modules", "dist", "**/*.spec.ts", "**/*.test.ts"] +} +``` + +### 4. 
Build Configuration (Optional) + +**tsdown.config.ts**: +```typescript +import { defineConfig } from '@teable/v2-tsdown-config'; + +export default defineConfig(); +``` + +## Common Patterns + +### Workspace Dependencies + +Use `workspace:*` for internal dependencies: +```json +"dependencies": { + "@teable/v2-core": "workspace:*", + "@teable/v2-di": "workspace:*" +} +``` + +### Dependency Injection + +V2 packages use `@teable/v2-di` for DI: +```typescript +import { injectable, inject } from '@teable/v2-di'; + +@injectable() +export class MyService { + constructor( + @inject(someToken) private readonly dependency: SomeDependency + ) {} +} +``` + +### Error Handling + +Use `neverthrow` for Result types: +```typescript +import { ok, err, Result } from 'neverthrow'; +import { domainError, type DomainError } from '@teable/v2-core'; + +function doSomething(): Result<Data, DomainError> { + if (error) { + return err(domainError.invariant({ message: 'Something went wrong' })); + } + return ok(data); +} +``` + +## Checklist for New Package + +- [ ] Create directory `packages/v2/<package-name>/` +- [ ] Create `src/index.ts` with exports +- [ ] Configure `package.json` with correct exports (pointing to source) +- [ ] Create `tsconfig.json` and `tsconfig.build.json` +- [ ] Add to root `pnpm-workspace.yaml` if needed +- [ ] Run `pnpm install` to link workspace dependencies +- [ ] Verify imports work without building: `pnpm typecheck` + +## Troubleshooting + +### "Cannot find module" errors +1. Check that `exports` in `package.json` points to correct paths +2. Ensure `types` field points to `src/index.ts` +3. Run `pnpm install` to refresh workspace links + +### Changes not reflected in other packages +1. Verify `exports.import` points to `./src/index.ts` (not `./dist/`) +2. Check if the consuming package has cached the old build +3. Restart TypeScript server in your IDE + +### Build errors +1. Ensure `tsconfig.build.json` excludes test files +2. 
Check that all dependencies are properly declared +3. Run `pnpm clean` before rebuilding diff --git a/.opencode/skills/teable-v2-table-template/SKILL.md b/.opencode/skills/teable-v2-table-template/SKILL.md new file mode 100644 index 0000000000..21c211d29f --- /dev/null +++ b/.opencode/skills/teable-v2-table-template/SKILL.md @@ -0,0 +1,65 @@ +--- +name: teable-v2-table-template +description: Create or update Teable v2 table templates in packages/v2/table-templates (template seeds, fields, records, and exports). +--- + +# Teable v2 Table Template Skill + +Use this skill when you need to add or modify table templates in the v2 codebase. Templates live in `packages/v2/table-templates/src/index.ts`. + +## Quick workflow + +1. Open the template source file: `packages/v2/table-templates/src/index.ts`. +2. Add a seed builder: + - Single table: create a `createXSeed(): SingleTableSeed`. + - Multi table: create a `createXTemplateSeed(): TemplateSeed`. +3. Use helpers for IDs and select options: + - `createFieldId()` for field IDs + - `createTableId()` if you must predefine table IDs + - `createSelectOption()` for single/multi-select choices +4. Create the template definition: + - Single table: `singleTable(key, name, description, createXSeed, defaultRecordCount)` + - Multi table: `createTemplate(key, name, description, createXTemplateSeed, defaultRecordCount)` +5. Export the template and add it to `tableTemplates` array. +6. If you need a field-only helper, export `createXFields = () => createXSeed().fields;`. + +## Notes and conventions + +- Keep templates in `packages/v2/table-templates/src/index.ts` (this package is the single source of truth). +- The `createInput` generator in `TableTemplateDefinition` handles optional record seeding and name prefixing. You only need to supply seed fields and records. +- Prefer `singleTable(...)` unless the template truly needs multiple tables (e.g., CRM with Companies + Contacts). 
+- Use string keys that are stable and URL-safe (e.g., `content-calendar`, `bug-triage`). +- When seeding records, keep records small and representative; use `normalizeTemplateRecords` behavior to cap or pad. +- New templates should cover as many field types as possible, as long as the business context makes sense (use `allFieldTypesTemplate` for inspiration). + +## Example pattern + +```ts +const createMyTemplateSeed = (): SingleTableSeed => { + const nameFieldId = createFieldId(); + return { + fields: [{ type: 'singleLineText', id: nameFieldId, name: 'Name' }], + records: [{ fields: { [nameFieldId]: 'Example' } }], + }; +}; + +export const myTemplate = singleTable( + 'my-template', + 'My Template', + 'Short description.', + createMyTemplateSeed, + 1 +); + +export const tableTemplates = [ + // ...existing templates, + myTemplate, +] as const; +``` + +## References + +- Source of truth: `packages/v2/table-templates/src/index.ts` +- Package note: `packages/v2/table-templates/ARCHITECTURE.md` +- E2E contract: ensure `creates tables for every template with seeded records` passes in `packages/v2/e2e`. +- Suggested run: `pnpm -C packages/v2/e2e test -- --runInBand --testNamePattern "creates tables for every template with seeded records"` diff --git a/.opencode/skills/teable-v2-test-debug/SKILL.md b/.opencode/skills/teable-v2-test-debug/SKILL.md new file mode 100644 index 0000000000..4282d0de6f --- /dev/null +++ b/.opencode/skills/teable-v2-test-debug/SKILL.md @@ -0,0 +1,73 @@ +--- +name: teable-v2-test-debug +description: Debug Teable v2 tests and failing test cases by prioritizing data reproduction and inspection. Use when asked to debug a test file/spec (unit, integration, or e2e) in packages/v2/*, especially when failures might be caused by table schema, relations, or stored/computed data drift; workflow uses v2-devtools CLI to create a similar table first, then inspects real DB data/relations, and only then reviews code logic. 
+--- + +# Teable V2 Test Debug + +## Overview + +Follow a data-first debugging workflow for Teable v2 tests. The default order is: reproduce data with devtools, inspect real DB data/relations, then analyze code logic. + +## Workflow: Data-first test debugging + +### 1) Capture failure context + +- Identify the failing test name, file path, and the exact assertion that failed. +- Note the expected vs actual values and any IDs shown in logs (base/table/field/record). +- If the failure is e2e or integration, confirm which base or seed data was used. + +### 2) Reproduce with devtools first (create similar table) + +- Use the v2-devtools CLI to create a minimal table that mirrors the test schema. +- Prefer CLI-based table creation and mock data over hand-written SQL. +- If the schema is complex, build only the fields involved in the failing assertion. + +Common commands: + +```bash +# Get field schema documentation before creating tables +pnpm --filter @teable/v2-devtools cli tables describe-schema + +# Create a table with minimal fields +pnpm --filter @teable/v2-devtools cli tables create --base-id bse... --name "Test Table" --fields '[{"type":"singleLineText","name":"Name","isPrimary":true}]' + +# Generate mock records if data shape matters +pnpm --filter @teable/v2-devtools cli mock generate --table-id tbl... --count 10 --seed 12345 +``` + +If the v2-devtools skill exists, open `/Users/nichenqin/projects/teable/.codex/skills/teable-v2-devtools/SKILL.md` for the full command reference and validation rules. + +### 3) Inspect real DB data and relations + +- Compare application layer vs underlying data first; use stored/computed modes. +- Inspect dependencies and relations to confirm lookup/rollup/formula inputs. +- Validate schema constraints if missing indexes or FK columns are suspected. + +Common commands: + +```bash +# App-layer data (stored/computed) vs underlying +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... 
--mode stored --limit 10 +pnpm --filter @teable/v2-devtools cli records list --table-id tbl... --mode computed --limit 10 +pnpm --filter @teable/v2-devtools cli underlying records --table-id tbl... --limit 10 + +# Inspect a field and its dependencies +pnpm --filter @teable/v2-devtools cli underlying field --field-id fld... +pnpm --filter @teable/v2-devtools cli relations --field-id fld... --direction up --level 2 + +# Check schema integrity if queries are slow or failing +pnpm --filter @teable/v2-devtools cli schema table --table-id tbl... +``` + +### 4) Only then review code logic + +- Map the observed data mismatch back to the handler, visitor, or mapper. +- Verify spec/visitor logic before touching application wiring. +- If the bug is only reproducible with real DB data, prefer adjusting fixtures or seeding rather than altering logic. + +### 5) Decide next action + +- If app-layer vs underlying differs, focus on computed/stored pipeline and mappers. +- If dependencies are wrong, fix field definitions or relation setup first. +- If reproduction fails on minimal data, debug core logic with a tight fixture. 
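The three queries above can be wrapped in a small script that captures each layer to a file for quick diffing (a sketch; `tblXXXXXXXX` is a placeholder table ID and the output paths are arbitrary):

```bash
# Capture stored, computed, and underlying views of the same records, then diff.
TBL=tblXXXXXXXX
OUT=/tmp/teable-debug
mkdir -p "$OUT"
pnpm --filter @teable/v2-devtools cli records list --table-id "$TBL" --mode stored --limit 10 > "$OUT/stored.toon"
pnpm --filter @teable/v2-devtools cli records list --table-id "$TBL" --mode computed --limit 10 > "$OUT/computed.toon"
pnpm --filter @teable/v2-devtools cli underlying records --table-id "$TBL" --limit 10 > "$OUT/underlying.toon"
diff "$OUT/stored.toon" "$OUT/computed.toon" || echo "stored vs computed drift: stale cache suspected"
diff "$OUT/stored.toon" "$OUT/underlying.toon" || echo "app vs db drift: mapper issue suspected"
```

Any non-empty diff points you at the pipeline to inspect per the decision rules above.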
diff --git a/.prettierignore b/.prettierignore index c69d52b86f..4cdf73f11c 100644 --- a/.prettierignore +++ b/.prettierignore @@ -7,3 +7,4 @@ pnpm-lock.yaml **/build **/.tmp **/.cache +apps/playground/src/routeTree.gen.ts diff --git a/.vscode/settings.json b/.vscode/settings.json index 2d00bb34a4..19ac4015e3 100644 --- a/.vscode/settings.json +++ b/.vscode/settings.json @@ -56,7 +56,15 @@ }, { "pattern": "./packages/*/" + }, + { + "pattern": "./packages/v2/*/" } ], - "vitest.maximumConfigs": 10 + "vitest.maximumConfigs": 50, + "vitest.nodeEnv": { + "DOCKER_HOST": "unix:///Users/nichenqin/.colima/default/docker.sock", + "TESTCONTAINERS_DOCKER_SOCKET_OVERRIDE": "/var/run/docker.sock", + "TESTCONTAINERS_HOST_OVERRIDE": "127.0.0.1" + } } \ No newline at end of file diff --git a/agents.md b/agents.md new file mode 100644 index 0000000000..8883f53281 --- /dev/null +++ b/agents.md @@ -0,0 +1,448 @@ +# Teable v2 (DDD) agent guide + +This repo is introducing a new `packages/v2/*` architecture. Keep `v2` strict and boring: **domain first**, **interfaces first**, **Result-only errors**, **specifications for querying**, **builders/factories for creation**. + +## Git hygiene + +- Ignore git changes that you did not make by default; never revert unknown/unrelated modifications unless explicitly instructed. + +## v2 layering (strict) + +`packages/v2/core` is the domain/core. 
+ +- **Allowed dependencies (inside `v2/core`)** + - `neverthrow` for `Result` + - `zod` for validation (`safeParse` only) + - `nanoid` for ID generation + - `ts-pattern` for match pattern + - `@teable/formula` for formula parsing utilities + - `@teable/v2-di` is allowed only in `src/commands/**` (application wiring), not in domain + - Pure TS/JS standard library +- **Forbidden inside `v2/core`** + - No NestJS, Prisma, HTTP, queues, DB clients, file system, env access + - No direct infrastructure code + - No dependency on v1 core (`packages/core` / `@teable/core`) + - No direct `antlr4ts` imports (use `@teable/formula` re-exports) + - No `throw` / exceptions for control flow + +Future adapters live in their own workspace packages under `packages/v2/*` and depend on `@teable/v2-core` (never the other way around). + +## v2 API contracts (HTTP) + +For HTTP-ish integrations, keep framework-independent contracts/mappers in `packages/v2/contract-http`: + +- Define API paths (e.g. `/tables`) as constants. +- Use action-style paths with camelCase action names (e.g. `/tables/create`, `/tables/get`, `/tables/rename`); avoid RESTful nested resources like `/bases/{baseId}/tables/{tableId}`. +- Re-export command input schemas (zod) for route-level validation if needed. +- Keep DTO types + domain-to-DTO mappers here. +- Router packages (e.g. `@teable/v2-contract-http-express`, `@teable/v2-contract-http-fastify`) should be thin adapters that only: + - parse JSON/body + - create a container + - resolve handlers + - call the endpoint executor/mappers from `@teable/v2-contract-http` +- OpenAPI is generated from the ts-rest contract via `@teable/v2-contract-http-openapi`. + +## UI components (frontend) + +- In app UIs (e.g. `apps/playground`), use shadcn wrappers from `apps/playground/src/components/ui/*` (or `@teable/ui-lib`) instead of importing Radix primitives directly. +- If a shadcn wrapper is missing, add it under `apps/playground/src/components/ui` before using the primitive. 
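The action-style path convention from the API contracts section above can be kept as plain constants (a sketch with illustrative names, not the actual `@teable/v2-contract-http` exports):

```typescript
// Action-style endpoint paths: camelCase action verbs, no nested REST resources.
const tablePaths = {
  create: '/tables/create',
  get: '/tables/get',
  rename: '/tables/rename',
} as const;

type TablePath = (typeof tablePaths)[keyof typeof tablePaths];

// A thin router adapter mounts one handler per action path.
const mounted: TablePath = tablePaths.create;
```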
+ +## Dependency injection (DI) + +- Do not import `tsyringe` / `reflect-metadata` directly anywhere; use `@teable/v2-di`. +- Do not use DI inside `v2/core/src/domain/**`; DI is only for application wiring (e.g. `v2/core/src/commands/**`). +- Prefer constructor injection with explicit tokens for ports (interfaces). +- Provide environment-level composition roots as separate packages (e.g. `@teable/v2-container-node`, `@teable/v2-container-browser`) that register all port implementations. + +## Tracing (DDD) + +- Tracing is an application/port concern; never use tracing decorators or tracer interfaces in `packages/v2/core/src/domain/**`. +- Use `@TraceSpan(...)` only in `packages/v2/core/src/commands/**` (and other app handlers) to wrap spans. +- Real tracing happens in adapters by supplying an `ITracer` implementation; core defaults should be no-op. +- Keep span attributes minimal and avoid PII unless explicitly required. + +## Command/Event mediator (bus) + +- Command handlers are registered via `@CommandHandler(Command)` and invoked through `ICommandBus.execute(...)`. +- Event handlers are registered via `@EventHandler(Event)` and invoked through `IEventBus.publish(...)`/`publishMany(...)`. +- Do not resolve handlers directly from the container in business code or adapters; always go through the bus. +- `ICommandBus`/`IEventBus` are ports; default in-memory implementations live in `v2/core/src/ports/memory` and can be swapped by adapters (RxJS/Kafka/etc). +- Containers must register `v2CoreTokens.commandBus` and `v2CoreTokens.eventBus`. + +## Query mediator (bus) + +- Query handlers are registered via `@QueryHandler(Query)` and invoked through `IQueryBus.execute(...)`. +- Do not resolve query handlers directly from the container in business code or adapters; always go through the bus. +- `IQueryBus` is a port; default in-memory implementations live in `v2/core/src/ports/memory`. +- Containers must register `v2CoreTokens.queryBus`. 
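The mediator pattern described above can be sketched as a minimal in-memory bus (conceptual only; the real `ICommandBus` in `@teable/v2-core` is a port registered via `@teable/v2-di` with `@CommandHandler` decorators, and returns `neverthrow` Results):

```typescript
// Conceptual sketch: handlers keyed by command class, dispatched through execute().
type Result<T> = { ok: true; value: T } | { ok: false; error: string };

// Accepts any concrete class constructor as a lookup key.
type CommandCtor<C> = new (...args: never[]) => C;

class InMemoryCommandBus {
  private readonly handlers = new Map<Function, (cmd: unknown) => unknown>();

  register<C extends object, R>(type: CommandCtor<C>, handler: (cmd: C) => R): void {
    this.handlers.set(type, handler as (cmd: unknown) => unknown);
  }

  execute<R>(cmd: object): Result<R> {
    const handler = this.handlers.get(cmd.constructor);
    if (handler === undefined) {
      // Result-only errors: missing handlers are reported, never thrown.
      return { ok: false, error: `No handler registered for ${cmd.constructor.name}` };
    }
    return { ok: true, value: handler(cmd) as R };
  }
}

class CreateTableCommand {
  constructor(readonly name: string) {}
}

const bus = new InMemoryCommandBus();
bus.register(CreateTableCommand, (cmd) => `created:${cmd.name}`);
const result = bus.execute<string>(new CreateTableCommand('Tasks'));
```

Business code only sees `execute(...)`; it never resolves handlers from the container directly, which is exactly the rule stated above.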
+ +## Unit of work (transactions) + +- Cross-repository workflows in commands must be wrapped in `IUnitOfWork.withTransaction(...)`. +- Repositories should reuse `IExecutionContext.transaction` when present (do not start nested transactions). +- Postgres implementation lives in `@teable/v2-adapter-db-postgres-pg` (`PostgresUnitOfWork`, `PostgresUnitOfWorkTransaction`); register it in containers. +- Publish domain events only after transactional work succeeds. + +## Build tooling (v2) + +- v2 packages build with `tsdown` (not `tsc` emit). `tsc` is used only for `typecheck` (`--noEmit`). +- Each v2 package has a local `tsdown.config.ts` that extends the shared base config from `@teable/v2-tsdown-config`. +- Outputs are written to `dist/` (ESM `.js` + `.d.ts`), and workspace deps (`@teable/v2-*`) are kept external (no bundling across packages). + +## Source visibility (v2 packages) + +**All v2 packages must support source visibility** to allow consumers to reference TypeScript sources without building `dist/` outputs. This is required for development workflows, testing, and tools like Vitest/Vite that can consume TypeScript directly. + +**Required configuration:** + +- In `package.json`: + - Set `types` field to `"src/index.ts"` (not `"dist/index.d.ts"`) + - Set `exports["."].types` to `"./src/index.ts"` (not `"./dist/index.d.ts"`) + - Set `exports["."].import` to `"./src/index.ts"` (not `"./dist/index.js"`) to allow Vite/Vitest to use source files directly + - Keep `exports["."].require` pointing to `"./dist/index.cjs"` for CommonJS compatibility + - Include `"src"` in the `files` array (in addition to `"dist"`) +- In `tsconfig.json`: + - Map workspace dependencies to their `src` paths in `compilerOptions.paths` (e.g. 
`"@teable/v2-core": ["../core/src"]`) + - Include those source paths in the `include` array + +**Example `package.json` configuration:** +```json +{ + "types": "src/index.ts", + "exports": { + ".": { + "types": "./src/index.ts", + "import": "./src/index.ts", + "require": "./dist/index.cjs" + } + }, + "files": ["dist", "src"] +} +``` + +**Note:** Since v2 packages are workspace-only (`"private": true`) and not published to npm, pointing `import` to source files is safe. Vite/Vitest can process TypeScript files directly, enabling faster development cycles without requiring `dist/` to be built first. + +## Error handling (non-negotiable) + +- **Never throw in `v2/core`.** +- Use `neverthrow` `Result` everywhere. +- If something isn’t implemented yet: return `err('Not implemented')` (or a typed error string). +- Use `zod.safeParse(...)` and convert failures into `err(...)` (no `parse()`). + +## Type system rules (non-negotiable) + +Inside `v2/core` domain APIs: + +- Do not use raw primitives (`string`, `number`, `boolean`) as domain parameters/returns for domain concepts. +- Use **Value Objects** / **branded types** for IDs, names, and key concepts. +- IDs are **nominal** (not structurally compatible): `FieldId` must not be assignable to `ViewId`. +- Raw primitives are allowed only at the **outer boundary** (DTOs) and must be immediately validated and converted via factories/builders. + +Practical exceptions that are required by the architecture: + +- `neverthrow` error side uses strings (e.g. `Result<T, string>`). +- The Specification interface requires `isSatisfiedBy(...): boolean`. +- Value Objects may expose `toString()` / `toDate()` / `toNumber()` for adapter/serialization boundaries (avoid using these in domain logic). +- Rehydration-only Value Objects (e.g. `DbTableName`, `DbFieldName`) must extend `RehydratedValueObject`; create empty placeholders in domain, set real values only via repository rehydrate, and return `err(...)` when accessed before rehydrate. 
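The nominal-ID rule above can be sketched with branded types (a sketch; the real value objects in `v2/core` are classes created through zod-validated static factories, not bare branded strings):

```typescript
// Brands make structurally identical strings nominally distinct types.
declare const FieldIdBrand: unique symbol;
declare const ViewIdBrand: unique symbol;

type FieldId = string & { readonly [FieldIdBrand]: true };
type ViewId = string & { readonly [ViewIdBrand]: true };

const fieldId = 'fldAbc123' as FieldId;
const viewId = 'viwXyz789' as ViewId;

function openView(id: ViewId): string {
  return `opening ${id}`;
}

// openView(fieldId); // compile error: FieldId is not assignable to ViewId
const message = openView(viewId);
```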
+ +## Builders/factories (non-negotiable) + +- Do not `new Table()` / `new Field()` / `new View()` outside factories/builders. +- Table creation must go through the **TableBuilder** (the public creation API). +- Value Objects are created via static factory methods that validate with `zod`. +- Builder configuration methods should be fluent (return the builder) and must not throw; validation/creation errors are surfaced via `build(): Result<...>`. + +## Specification pattern (required) + +Repositories query via specifications, not ad-hoc filters. + +- Implement `ISpecification` exactly as defined in `v2/core`. +- Provide composable specs (`AndSpec`, `OrSpec`, `NotSpec`). +- `accept(visitor)` is wired for future translation into persistence queries. +- Build specs via entity spec builders (e.g. `Table.specs(baseId)`); do not `new` spec classes directly. +- Each spec targets a single attribute (e.g. `TableByNameSpec` only checks name). `BaseId` is its own spec and is composed via `and/or/not`. +- `and` and `or` must be separated by nesting (use `andGroup`/`orGroup`); never mix them at the same level. BaseId specs are auto-included by the builder unless explicitly disabled. +- Spec visitors rely on `visit(spec)` + type narrowing inside the visitor; avoid per-spec visitor interfaces or `isWith*` guards. + +## Visitor pattern (preferred) + +- For multi-type logic that already has a visitor (fields, specs, etc.), prefer a dedicated visitor file over switch/if chains. +- Keep type-to-value mappings in visitors so new types require explicit visitor updates. + +## Condition/Filter handling (non-negotiable) + +**Record conditions are a core domain concept.** All condition/filter logic MUST use the specification + visitor pattern. Never directly parse or interpret condition objects with switch/if/match chains. + +### Required pattern + +1. **Domain**: Conditions are modeled as `RecordConditionSpec` (specification pattern) +2. 
**Creation**: Use `FieldConditionSpecBuilder` or `RecordConditionSpecBuilder` to create specs +3. **Translation**: Use `ITableRecordConditionSpecVisitor` implementations to translate specs (e.g. to SQL WHERE clauses) + +```typescript +// ✅ CORRECT: Use spec + visitor pattern +const spec = yield* condition.toRecordConditionSpec(table); +const visitor = new TableRecordConditionWhereVisitor(); +const whereClause = yield* spec.accept(visitor); + +// ❌ WRONG: Direct parsing with match/switch +for (const item of condition.filterItems()) { + match(item.operator) + .with('is', () => sql`${col} = ${val}`) + .with('isNot', () => sql`${col} != ${val}`) + // ... duplicates visitor logic +} +``` + +### Why this matters + +- **Single source of truth**: All operator logic lives in one visitor +- **Type safety**: Adding new operators requires updating the visitor interface (compile-time errors) +- **Consistency**: Same behavior across all condition consumers (views, computed fields, API filters) +- **Testability**: Visitor behavior is tested once with full coverage + +### Key files + +- `v2/core/src/domain/table/records/specs/RecordConditionSpec.ts` - Base spec classes +- `v2/core/src/domain/table/records/specs/ITableRecordConditionSpecVisitor.ts` - Visitor interface +- `v2/core/src/domain/table/records/specs/FieldConditionSpecBuilder.ts` - Creates specs from field + operator + value +- Adapter visitors (e.g. 
`TableRecordConditionWhereVisitor`) - Translate to SQL + +## Folder conventions (recommended) + +Inside `packages/v2/core/src`: + +- `domain/` — aggregates, entities, value objects, domain events +- `specification/` — spec framework + visitors +- `ports/` — interfaces/ports (repositories, event bus/publisher, mappers) +- `commands/` — commands + handlers (application use-cases over domain) +- `queries/` — queries + handlers (application read use-cases over domain) + +## Architecture docs (required) + +- When adding or changing a significant folder/module, create or update its `ARCHITECTURE.md`. +- If subfolders change, also update the parent `ARCHITECTURE.md` to keep the folder map accurate. + +## Naming conventions + +- Value Objects: `*Id`, `*Name` (e.g. `TableId`, `FieldName`) +- Commands: `*Command` +- Queries: `*Query` +- Handlers/use-cases: `*Handler` +- Domain events: past tense (e.g. `TableCreated`) +- Specifications: `*Spec` (e.g. `TableByIdSpec`) + +## Adding a new field type + +1. Add a new field subtype under `domain/table/fields/types/`. +2. Add any new value objects/config under the same subtree. +3. Extend the table builder with a new field child-builder: + - add `TableFieldBuilder.<newType>()` (in `domain/table/TableBuilder.ts`) + - implement a `<NewType>FieldBuilder` with fluent `with...()` methods and `done(): TableBuilder` +4. Update `IFieldVisitor` (and any visitors like `NoopFieldVisitor`) to support the new field subtype. +5. Update `CreateTableCommand` input validation to allow the new type. + +## Adding a repository adapter later + +1. Keep the port in `v2/core/src/ports/TableRepository.ts`. +2. Implement the adapter in a separate package (e.g. `packages/v2/adapter-repository-postgres`). +3. Translate Specifications via a visitor (start with the stub visitor in `v2/core`). +4. Map persistence DTOs <-> domain using mapper interfaces from `v2/core/src/ports/mappers`. 
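+
+The specification + visitor rules above can be sketched end-to-end. Class shapes here are illustrative, not the actual v2 API: real specs come from entity spec builders (e.g. `Table.specs(baseId)`, never `new`-ed directly as done here for brevity), `accept(...)` returns a `Result`, and the real visitor lives in the adapter package.
+
+```typescript
+// Single visit(spec) entry point with type narrowing inside the visitor,
+// per the spec-visitor rule above (no per-spec visitor methods).
+interface ISpecVisitor {
+  visit(spec: TableSpec): void;
+}
+
+type TableRow = { name: string; baseId: string };
+
+abstract class TableSpec {
+  abstract isSatisfiedBy(table: TableRow): boolean;
+  accept(visitor: ISpecVisitor): void {
+    visitor.visit(this);
+  }
+}
+
+// Each spec targets a single attribute.
+class TableByNameSpec extends TableSpec {
+  constructor(readonly name: string) { super(); }
+  isSatisfiedBy(t: TableRow): boolean { return t.name === this.name; }
+}
+class TableByBaseIdSpec extends TableSpec {
+  constructor(readonly baseId: string) { super(); }
+  isSatisfiedBy(t: TableRow): boolean { return t.baseId === this.baseId; }
+}
+class AndSpec extends TableSpec {
+  constructor(readonly specs: TableSpec[]) { super(); }
+  isSatisfiedBy(t: TableRow): boolean {
+    return this.specs.every((s) => s.isSatisfiedBy(t));
+  }
+}
+
+// Adapter-side visitor: translates specs into WHERE fragments.
+// (Illustrative only — real adapters must bind values, not interpolate.)
+class WhereVisitor implements ISpecVisitor {
+  clauses: string[] = [];
+  visit(spec: TableSpec): void {
+    if (spec instanceof AndSpec) spec.specs.forEach((s) => s.accept(this));
+    else if (spec instanceof TableByNameSpec) this.clauses.push(`name = '${spec.name}'`);
+    else if (spec instanceof TableByBaseIdSpec) this.clauses.push(`base_id = '${spec.baseId}'`);
+  }
+}
+
+const spec = new AndSpec([new TableByBaseIdSpec('bse1'), new TableByNameSpec('Tasks')]);
+const visitor = new WhereVisitor();
+spec.accept(visitor);
+console.log(visitor.clauses.join(' AND '));
+console.log(spec.isSatisfiedBy({ name: 'Tasks', baseId: 'bse1' }));
+```
+
+The same spec object serves both in-memory satisfaction (`isSatisfiedBy`) and query translation (`accept(visitor)`), which is the point of keeping repositories spec-driven.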
+ +## Testing expectations (minimal) + +## Type-exhaustive testing with `it.each` (required) + +When testing behavior that varies by type (e.g. field types, cell value types, view types), use `it.each` with a **type-safe exhaustive matrix**. This ensures: + +1. All current types are tested +2. TypeScript errors when new types are added but not covered in tests +3. Clear per-type test output + +### Pattern + +```typescript +// 1. Define the type literal from the source of truth +type FieldTypeLiteral = (typeof fieldTypeValues)[number]; + +// 2. Define test case interface +interface InnerFieldTestCase { + type: FieldTypeLiteral; + factory: (id: FieldId, name: FieldName) => Result<Field, DomainError>; + expectedCellValueType: 'string' | 'number' | 'boolean' | 'dateTime'; +} + +// 3. Create exhaustive map - TypeScript errors if any type is missing +const createTestCases = (): Record<FieldTypeLiteral, InnerFieldTestCase> => ({ + singleLineText: { type: 'singleLineText', factory: ..., expectedCellValueType: 'string' }, + number: { type: 'number', factory: ..., expectedCellValueType: 'number' }, + // ... ALL other types must be listed +}); + +// 4. Compile-time exhaustiveness check +const _exhaustiveCheck: Record<FieldTypeLiteral, InnerFieldTestCase> = createTestCases(); +void _exhaustiveCheck; + +// 5. Use it.each for matrix testing +describe('inner field types matrix', () => { + const testCases = Object.values(createTestCases()); + + it.each(testCases)( + 'creates lookup field with $type inner field', + ({ type, factory, expectedCellValueType }) => { + // Test implementation + } + ); +}); +``` + +### When to use + +- Testing field type-specific behavior (e.g. 
LookupField with different inner field types) +- Testing view type-specific rendering/behavior +- Testing cell value type conversions +- Any scenario where behavior varies across a finite set of types + +### Benefits + +- **Type safety**: Adding a new field type to `fieldTypeValues` will cause TypeScript to error in tests until the new type is added to the test matrix +- **Completeness**: Every type variant is explicitly tested +- **Maintainability**: Clear structure for adding new types +- **Readability**: Test output shows which specific type failed + +See `LookupField.spec.ts` for a complete example. + +## Testing strategy (domain → e2e) + +v2 uses a layered test strategy. The same behavior should usually be asserted **once** at the most appropriate layer (avoid duplicating identical assertions across many layers). + +### 1) Domain unit tests (`v2/core` domain) + +**Where** + +- `packages/v2/core/src/domain/**/*.spec.ts` + +**Focus** + +- Value Object validation (`.create(...)` + `zod.safeParse`) +- Aggregate/entity behavior and invariants +- Builder behavior (`Table.builder()...build()`), including default view behavior +- Domain event creation/recording (e.g. `TableCreated`) +- Specification correctness for in-memory satisfaction (`isSatisfiedBy`) + +**Must NOT do** + +- No DI/container, no repositories/ports, no DB, no HTTP, no filesystem, no timeouts +- No infrastructure DTOs (HTTP/persistence) and no framework code + +**What to assert** + +- `Result` is `ok/err` (never exceptions) +- Invariants on returned domain objects (counts, names, IDs are nominal types, etc.) 
+- Domain events are produced and contain essential info (do not snapshot the entire object) + +### 2) Application/use-case tests (`v2/core` commands + DI) + +**Where** + +- Prefer `packages/v2/test-node/src/**/*.spec.ts` (a dedicated test package) + +**Focus** + +- Handler orchestration (build aggregate, call repository, publish events) +- Correct `Result` behavior for ok/err paths +- Command-level validation (invalid input → `err(...)`) +- Correct wiring via DI (handlers resolved from container; do not `new Handler(...)` in tests) + +**Allowed** + +- Fakes/in-memory ports (recommended) OR the node-test container (pglite-backed) when you want a slightly higher-confidence integration without HTTP. + +**What to assert** + +- Handler returns expected status (`ok/err`) and minimal returned data (e.g. created table name) +- Domain events were published (e.g. contains `TableCreated`) +- Repository side-effect happened (either “save called” via fake, or “can be queried back” via `findOne(spec)`) + +### 3) Adapter integration tests (persistence/infra adapters) + +**Where** + +- `packages/v2/adapter-*/src/**/*.spec.ts` + +**Focus** + +- Spec → query translation via Spec Visitors (no ad-hoc where parsing) +- Mapper correctness (persistence DTO ↔︎ domain) +- Repository behavior against a real DB driver + +**Allowed** + +- `pglite` for tests (fast, hermetic) + +**What to assert** + +- Round-trips: save → query by spec → domain object matches essentials +- Visitor builds the expected query constraints (at least for supported specs) + +### 4) Contract tests (`contract-http`) + +**Where** + +- `packages/v2/contract-http/src/**/*.spec.ts` (optional but recommended for mapping-heavy endpoints) + +**Focus** + +- DTO mappers and endpoint executors +- Contract response shapes and status codes + +**What to assert** + +- `execute*Endpoint(...)` returns only the status codes declared in the contract +- Response DTO structure matches schema intent (avoid deep snapshots) + +### 5) Router 
adapter tests (Express/Fastify) + +**Where** + +- `packages/v2/contract-http-express/src/**/*.spec.ts` +- `packages/v2/contract-http-fastify/src/**/*.spec.ts` + +**Focus** + +- Framework glue: request parsing, ts-rest integration, error mapping +- Container creation is correct and lazy (don’t eagerly connect to PG when a custom container is injected) + +**What to assert** + +- Valid request → expected status/result +- Invalid request → 400 (schema validation) + +### 6) E2E tests (`v2/e2e`) + +**Where** + +- `packages/v2/e2e/src/**/*.e2e.spec.ts` + +**Focus** + +- “Over-the-wire” HTTP behavior using the generated ts-rest client +- Cross-package integration: router + contract + container + repository adapter + +**Allowed** + +- Start an in-process server on an ephemeral port (no fixed ports) +- Use the node-test container with `pglite` and ensure proper cleanup (`dispose`) + +**What to assert** + +- HTTP status codes and response DTOs (validate shape, not internal domain objects) +- Minimal business outcome (e.g. table created, includes `TableCreated` event) + +## Quality checks (required for big changes) + +- After substantial changes, run `pnpm -C packages/v2/<pkg> typecheck` and `pnpm -C packages/v2/<pkg> lint` for each affected package and report any failures. +- Resolve any TypeScript errors before marking the task complete. +- Run eslint with auto-fix (`pnpm -C packages/v2/<pkg> lint -- --fix` or `pnpm -C packages/v2/<pkg> fix-all-files`) and format the touched files before completion. +- Every task completion must include formatting/fix steps for touched v2 packages (eslint auto-fix or equivalent). 
+- When working with libraries, check the docs with Ref mcp diff --git a/apps/nestjs-backend/.eslintrc.js b/apps/nestjs-backend/.eslintrc.js index 0fc6a80460..9b9de3453e 100644 --- a/apps/nestjs-backend/.eslintrc.js +++ b/apps/nestjs-backend/.eslintrc.js @@ -34,5 +34,13 @@ module.exports = { '@typescript-eslint/naming-convention': 'off', }, }, + { + // Disable consistent-type-imports for files with decorators (NestJS controllers/services) + // See: https://typescript-eslint.io/blog/changes-to-consistent-type-imports-with-decorators + files: ['src/**/*.controller.ts'], + rules: { + '@typescript-eslint/consistent-type-imports': 'off', + }, + }, ], }; diff --git a/apps/nestjs-backend/package.json b/apps/nestjs-backend/package.json index 8c9fa87940..f6b92de374 100644 --- a/apps/nestjs-backend/package.json +++ b/apps/nestjs-backend/package.json @@ -151,6 +151,7 @@ "@nestjs/terminus": "10.2.3", "@nestjs/websockets": "10.3.5", "@openrouter/ai-sdk-provider": "1.2.3", + "@orpc/nest": "1.13.0", "@opentelemetry/api": "1.9.0", "@opentelemetry/exporter-logs-otlp-http": "0.201.1", "@opentelemetry/exporter-metrics-otlp-http": "0.201.1", @@ -158,6 +159,7 @@ "@opentelemetry/instrumentation-express": "0.50.0", "@opentelemetry/instrumentation-http": "0.201.1", "@opentelemetry/instrumentation-nestjs-core": "0.49.0", + "@opentelemetry/instrumentation-pg": "0.49.0", "@opentelemetry/instrumentation-pino": "0.49.0", "@opentelemetry/resources": "2.0.1", "@opentelemetry/sdk-node": "0.201.1", @@ -172,7 +174,15 @@ "@teable/core": "workspace:^", "@teable/db-main-prisma": "workspace:^", "@teable/openapi": "workspace:^", - "@an-epiphany/websocket-json-stream": "1.2.0", + "@teable/v2-container-node": "workspace:*", + "@teable/v2-contract-http": "workspace:*", + "@teable/v2-contract-http-openapi": "workspace:*", + "@teable/v2-contract-http-implementation": "workspace:*", + "@teable/v2-core": "workspace:*", + "@teable/v2-adapter-db-postgres-pg": "workspace:*", + 
"@teable/v2-adapter-realtime-sharedb": "workspace:*", + "@teable/v2-di": "workspace:*", + "@teamwork/websocket-json-stream": "2.0.0", "@valibot/to-json-schema": "1.3.0", "ai": "6.0.14", "ajv": "8.12.0", @@ -232,6 +242,8 @@ "pg": "8.11.5", "pino-http": "10.5.0", "pino-pretty": "11.0.0", + "react": "18.3.1", + "react-dom": "18.3.1", "redlock": "5.0.0-beta.2", "reflect-metadata": "0.2.1", "rxjs": "7.8.1", diff --git a/apps/nestjs-backend/src/app.module.ts b/apps/nestjs-backend/src/app.module.ts index 9c55197d65..7ca03c8059 100644 --- a/apps/nestjs-backend/src/app.module.ts +++ b/apps/nestjs-backend/src/app.module.ts @@ -45,6 +45,7 @@ import { TemplateOpenApiModule } from './features/template/template-open-api.mod import { TrashModule } from './features/trash/trash.module'; import { UndoRedoModule } from './features/undo-redo/open-api/undo-redo.module'; import { UserModule } from './features/user/user.module'; +import { V2Module } from './features/v2/v2.module'; import { GlobalModule } from './global/global.module'; import { InitBootstrapProvider } from './global/init-bootstrap.provider'; import { LoggerModule } from './logger/logger.module'; @@ -96,6 +97,7 @@ export const appModules = { PluginChartModule, ObservabilityModule, BuiltinAssetsInitModule, + V2Module, ], providers: [InitBootstrapProvider], }; diff --git a/apps/nestjs-backend/src/features/record/open-api/record-open-api-v2.service.ts b/apps/nestjs-backend/src/features/record/open-api/record-open-api-v2.service.ts new file mode 100644 index 0000000000..4ba3873092 --- /dev/null +++ b/apps/nestjs-backend/src/features/record/open-api/record-open-api-v2.service.ts @@ -0,0 +1,88 @@ +import { Injectable, HttpException, HttpStatus } from '@nestjs/common'; +import type { IUpdateRecordRo, IRecord, ICreateRecordsRo } from '@teable/openapi'; +import { + executeCreateRecordsEndpoint, + executeUpdateRecordEndpoint, +} from '@teable/v2-contract-http-implementation/handlers'; +import { v2CoreTokens, ActorId } from 
'@teable/v2-core'; +import type { ICommandBus } from '@teable/v2-core'; +import { ClsService } from 'nestjs-cls'; +import type { IClsStore } from '../../../types/cls'; +import { V2ContainerService } from '../../v2/v2-container.service'; + +@Injectable() +export class RecordOpenApiV2Service { + constructor( + private readonly v2ContainerService: V2ContainerService, + private readonly cls: ClsService<IClsStore> + ) {} + + async updateRecord( + tableId: string, + recordId: string, + updateRecordRo: IUpdateRecordRo + ): Promise<IRecord> { + const container = await this.v2ContainerService.getContainer(); + const commandBus = container.resolve<ICommandBus>(v2CoreTokens.commandBus); + + const userId = this.cls.get('user.id'); + const actorIdResult = ActorId.create(userId); + if (actorIdResult.isErr()) { + throw new HttpException(actorIdResult.error.message, HttpStatus.INTERNAL_SERVER_ERROR); + } + + // Convert v1 input format to v2 format + // v1: { record: { fields: { fieldKey: value } } } + // v2: { tableId, recordId, fields: { fieldId: value } } + const v2Input = { + tableId, + recordId, + fields: updateRecordRo.record.fields, + }; + + const result = await executeUpdateRecordEndpoint( + { actorId: actorIdResult.value }, + v2Input, + commandBus + ); + if (result.status === 200 && result.body.ok) { + // Convert v2 output to v1 format + const v2Record = result.body.data.record; + return { + id: v2Record.id, + fields: v2Record.fields, + } as IRecord; + } + + if (!result.body.ok) { + throw new HttpException(result.body.error.message, result.status); + } + + throw new HttpException('Internal server error', HttpStatus.INTERNAL_SERVER_ERROR); + } + + async createRecords(tableId: string, createRecordsRo: ICreateRecordsRo) { + const container = await this.v2ContainerService.getContainer(); + const commandBus = container.resolve<ICommandBus>(v2CoreTokens.commandBus); + const userId = this.cls.get('user.id'); + const actorIdResult = ActorId.create(userId); + if 
(actorIdResult.isErr()) { + throw new HttpException(actorIdResult.error.message, HttpStatus.INTERNAL_SERVER_ERROR); + } + const result = await executeCreateRecordsEndpoint( + { actorId: actorIdResult.value }, + { tableId, records: createRecordsRo.records }, + commandBus + ); + + if (result.status === 201 && result.body.ok) { + return result.body.data; + } + + if (!result.body.ok) { + throw new HttpException(result.body.error.message, result.status); + } + + throw new HttpException('Internal server error', HttpStatus.INTERNAL_SERVER_ERROR); + } +} diff --git a/apps/nestjs-backend/src/features/record/open-api/record-open-api.controller.ts b/apps/nestjs-backend/src/features/record/open-api/record-open-api.controller.ts index 6441a8d7b6..6082d2a75c 100644 --- a/apps/nestjs-backend/src/features/record/open-api/record-open-api.controller.ts +++ b/apps/nestjs-backend/src/features/record/open-api/record-open-api.controller.ts @@ -15,6 +15,17 @@ import { } from '@nestjs/common'; import { FileInterceptor } from '@nestjs/platform-express'; import { PrismaService } from '@teable/db-main-prisma'; +import { + createRecordsRoSchema, + getRecordQuerySchema, + getRecordsRoSchema, + updateRecordRoSchema, + deleteRecordsQuerySchema, + getRecordHistoryQuerySchema, + updateRecordsRoSchema, + recordInsertOrderRoSchema, + recordGetCollaboratorsRoSchema, +} from '@teable/openapi'; import type { IAutoFillCellVo, IButtonClickVo, @@ -23,26 +34,15 @@ import type { IRecordGetCollaboratorsVo, IRecordStatusVo, IRecordsVo, -} from '@teable/openapi'; -import { - createRecordsRoSchema, - getRecordQuerySchema, - getRecordsRoSchema, - IGetRecordsRo, ICreateRecordsRo, - IGetRecordQuery, - IUpdateRecordRo, - updateRecordRoSchema, - deleteRecordsQuerySchema, IDeleteRecordsQuery, - getRecordHistoryQuerySchema, + IGetRecordQuery, IGetRecordHistoryQuery, - updateRecordsRoSchema, - IUpdateRecordsRo, - recordInsertOrderRoSchema, - IRecordInsertOrderRo, - recordGetCollaboratorsRoSchema, + IGetRecordsRo, 
IRecordGetCollaboratorsRo, + IRecordInsertOrderRo, + IUpdateRecordRo, + IUpdateRecordsRo, } from '@teable/openapi'; import { ClsService } from 'nestjs-cls'; import { EmitControllerEvent } from '../../../event-emitter/decorators/emit-controller-event.decorator'; @@ -56,6 +56,7 @@ import { AllowAnonymous } from '../../auth/decorators/allow-anonymous.decorator' import { Permissions } from '../../auth/decorators/permissions.decorator'; import { RecordService } from '../record.service'; import { FieldKeyPipe } from './field-key.pipe'; +import { RecordOpenApiV2Service } from './record-open-api-v2.service'; import { RecordOpenApiService } from './record-open-api.service'; import { TqlPipe } from './tql.pipe'; @@ -67,7 +68,8 @@ export class RecordOpenApiController { private readonly recordOpenApiService: RecordOpenApiService, private readonly performanceCacheService: PerformanceCacheService, private readonly prismaService: PrismaService, - private readonly cls: ClsService<IClsStore> + private readonly cls: ClsService<IClsStore>, + private readonly recordOpenApiV2Service: RecordOpenApiV2Service ) {} @Permissions('record|update') @@ -124,8 +126,14 @@ export class RecordOpenApiController { @Param('recordId') recordId: string, @Body(new ZodValidationPipe(updateRecordRoSchema)) updateRecordRo: IUpdateRecordRo, @Headers('x-window-id') windowId?: string, - @Headers('x-ai-internal') isAiInternal?: string + @Headers('x-ai-internal') isAiInternal?: string, + @Headers('x-use-v2') useV2?: string ): Promise<IRecord> { + // Use v2 logic when x-use-v2 header is set to 'true' or '1' + if (useV2 === 'true' || useV2 === '1') { + return this.recordOpenApiV2Service.updateRecord(tableId, recordId, updateRecordRo); + } + return await this.recordOpenApiService.updateRecord( tableId, recordId, @@ -178,8 +186,13 @@ export class RecordOpenApiController { async createRecords( @Param('tableId') tableId: string, @Body(new ZodValidationPipe(createRecordsRoSchema)) createRecordsRo: ICreateRecordsRo, - 
@Headers('x-ai-internal') isAiInternal?: string + @Headers('x-ai-internal') isAiInternal?: string, + @Headers('x-use-v2') useV2?: string ): Promise<ICreateRecordsVo> { + if (useV2 === 'true' || useV2 === '1') { + return await this.recordOpenApiV2Service.createRecords(tableId, createRecordsRo); + } + return await this.recordOpenApiService.multipleCreateRecords( tableId, createRecordsRo, diff --git a/apps/nestjs-backend/src/features/record/open-api/record-open-api.module.ts b/apps/nestjs-backend/src/features/record/open-api/record-open-api.module.ts index 1bf591ca36..96d5a5cbaf 100644 --- a/apps/nestjs-backend/src/features/record/open-api/record-open-api.module.ts +++ b/apps/nestjs-backend/src/features/record/open-api/record-open-api.module.ts @@ -5,10 +5,12 @@ import { CalculationModule } from '../../calculation/calculation.module'; import { CollaboratorModule } from '../../collaborator/collaborator.module'; import { FieldCalculateModule } from '../../field/field-calculate/field-calculate.module'; import { TableDomainQueryModule } from '../../table-domain'; +import { V2Module } from '../../v2/v2.module'; import { ViewOpenApiModule } from '../../view/open-api/view-open-api.module'; import { ViewModule } from '../../view/view.module'; import { RecordModifyModule } from '../record-modify/record-modify.module'; import { RecordModule } from '../record.module'; +import { RecordOpenApiV2Service } from './record-open-api-v2.service'; import { RecordOpenApiController } from './record-open-api.controller'; import { RecordOpenApiService } from './record-open-api.service'; @@ -24,9 +26,10 @@ import { RecordOpenApiService } from './record-open-api.service'; ViewModule, ViewOpenApiModule, TableDomainQueryModule, + V2Module, ], controllers: [RecordOpenApiController], - providers: [RecordOpenApiService], + providers: [RecordOpenApiService, RecordOpenApiV2Service], exports: [RecordOpenApiService], }) export class RecordOpenApiModule {} diff --git 
a/apps/nestjs-backend/src/features/record/query-builder/sql-conversion.visitor.ts b/apps/nestjs-backend/src/features/record/query-builder/sql-conversion.visitor.ts index 186303a68f..28d1deab58 100644 --- a/apps/nestjs-backend/src/features/record/query-builder/sql-conversion.visitor.ts +++ b/apps/nestjs-backend/src/features/record/query-builder/sql-conversion.visitor.ts @@ -56,7 +56,7 @@ import type { IDatetimeFormatting, } from '@teable/core'; import type { ITeableToDbFunctionConverter } from '@teable/core/src/formula/function-convertor.interface'; -import type { RootContext, UnaryOpContext } from '@teable/core/src/formula/parser/Formula'; +import type { RootContext, UnaryOpContext } from '@teable/formula'; import type { Knex } from 'knex'; import { match } from 'ts-pattern'; import type { IFieldSelectName } from './field-select.type'; diff --git a/apps/nestjs-backend/src/features/v2/v2-command-bus-tracing.middleware.ts b/apps/nestjs-backend/src/features/v2/v2-command-bus-tracing.middleware.ts new file mode 100644 index 0000000000..1520b03f7b --- /dev/null +++ b/apps/nestjs-backend/src/features/v2/v2-command-bus-tracing.middleware.ts @@ -0,0 +1,47 @@ +import type { + CommandBusNext, + ICommandBusMiddleware, + IExecutionContext, +} from '@teable/v2-core' with { 'resolution-mode': 'import' }; + +const describeError = (error: unknown): string => { + if (error instanceof Error) return error.message || error.name; + if (typeof error === 'string') return error; + try { + return JSON.stringify(error) ?? String(error); + } catch { + return String(error); + } +}; + +export class CommandBusTracingMiddleware implements ICommandBusMiddleware { + async handle<TCommand, TResult>( + context: IExecutionContext, + command: TCommand, + next: CommandBusNext<TCommand, TResult> + ) { + const tracer = context.tracer; + if (!tracer) { + return next(context, command); + } + + const commandName = + (command as { constructor?: { name?: string } }).constructor?.name ?? 
'UnknownCommand'; + const span = tracer.startSpan(`teable.command.${commandName}`, { + command: commandName, + }); + + try { + const result = await next(context, command); + if (result.isErr()) { + span.recordError(result.error.message ?? 'Unknown error'); + } + return result; + } catch (error) { + span.recordError(describeError(error)); + throw error; + } finally { + span.end(); + } + } +} diff --git a/apps/nestjs-backend/src/features/v2/v2-container.service.ts b/apps/nestjs-backend/src/features/v2/v2-container.service.ts new file mode 100644 index 0000000000..1e36ec6f81 --- /dev/null +++ b/apps/nestjs-backend/src/features/v2/v2-container.service.ts @@ -0,0 +1,59 @@ +import type { OnModuleDestroy } from '@nestjs/common'; +import { Injectable } from '@nestjs/common'; +import { ConfigService } from '@nestjs/config'; +import { v2PostgresDbTokens } from '@teable/v2-adapter-db-postgres-pg'; +import { + ShareDbPubSubPublisher, + registerV2ShareDbRealtime, +} from '@teable/v2-adapter-realtime-sharedb'; +import { createV2NodePgContainer } from '@teable/v2-container-node'; +import type { DependencyContainer } from '@teable/v2-di' with { 'resolution-mode': 'import' }; +import { PinoLogger } from 'nestjs-pino'; +import { ShareDbService } from '../../share-db/share-db.service'; +import { CommandBusTracingMiddleware } from './v2-command-bus-tracing.middleware'; +import { PinoLoggerAdapter } from './v2-logger.adapter'; +import { QueryBusTracingMiddleware } from './v2-query-bus-tracing.middleware'; +import { OpenTelemetryTracer } from './v2-tracer.adapter'; + +@Injectable() +export class V2ContainerService implements OnModuleDestroy { + private containerPromise?: Promise<DependencyContainer>; + + constructor( + private readonly configService: ConfigService, + private readonly pinoLogger: PinoLogger, + private readonly shareDbService: ShareDbService + ) {} + + async getContainer(): Promise<DependencyContainer> { + if (!this.containerPromise) { + const connectionString = 
this.configService.getOrThrow<string>('PRISMA_DATABASE_URL'); + const logger = new PinoLoggerAdapter(this.pinoLogger); + const tracer = new OpenTelemetryTracer(); + const commandBusMiddlewares = [new CommandBusTracingMiddleware()]; + const queryBusMiddlewares = [new QueryBusTracingMiddleware()]; + this.containerPromise = createV2NodePgContainer({ + connectionString, + logger, + tracer, + commandBusMiddlewares, + queryBusMiddlewares, + }).then((container) => { + registerV2ShareDbRealtime(container, { + publisher: new ShareDbPubSubPublisher(this.shareDbService.pubsub), + }); + return container; + }); + } + + return this.containerPromise; + } + + async onModuleDestroy(): Promise<void> { + if (!this.containerPromise) return; + + const container = await this.containerPromise; + const db = container.resolve<{ destroy(): Promise<void> }>(v2PostgresDbTokens.db); + await db.destroy(); + } +} diff --git a/apps/nestjs-backend/src/features/v2/v2-logger.adapter.ts b/apps/nestjs-backend/src/features/v2/v2-logger.adapter.ts new file mode 100644 index 0000000000..6e0d870bae --- /dev/null +++ b/apps/nestjs-backend/src/features/v2/v2-logger.adapter.ts @@ -0,0 +1,47 @@ +import type { ILogger, LogContext } from '@teable/v2-core'; +import type { PinoLogger } from 'nestjs-pino'; + +export class PinoLoggerAdapter implements ILogger { + constructor(private readonly logger: PinoLogger) {} + + debug(message: string, context?: LogContext): void { + if (context) { + this.logger.debug(context, message); + return; + } + this.logger.debug(message); + } + + info(message: string, context?: LogContext): void { + if (context) { + this.logger.info(context, message); + return; + } + this.logger.info(message); + } + + warn(message: string, context?: LogContext): void { + if (context) { + this.logger.warn(context, message); + return; + } + this.logger.warn(message); + } + + error(message: string, context?: LogContext): void { + if (context) { + this.logger.error(context, message); + return; + } + 
this.logger.error(message); + } + + child(context: LogContext): ILogger { + this.logger.logger.child(context); + return this; + } + + scope(): ILogger { + throw new Error('Not implemented'); + } +} diff --git a/apps/nestjs-backend/src/features/v2/v2-openapi.controller.ts b/apps/nestjs-backend/src/features/v2/v2-openapi.controller.ts new file mode 100644 index 0000000000..a8ae2f9d09 --- /dev/null +++ b/apps/nestjs-backend/src/features/v2/v2-openapi.controller.ts @@ -0,0 +1,84 @@ +/* eslint-disable @typescript-eslint/naming-convention */ +import { randomBytes } from 'crypto'; +import { Controller, Get, Header, Req, Res } from '@nestjs/common'; +import { ConfigService } from '@nestjs/config'; +import { generateV2OpenApiDocument } from '@teable/v2-contract-http-openapi'; +import { Request, Response } from 'express'; +import type { IBaseConfig } from '../../configs/base.config'; +import { Public } from '../auth/decorators/public.decorator'; + +const V2_BASE_PATH = 'api/v2'; +const OPENAPI_SPEC_PATH = `/${V2_BASE_PATH}/openapi.json`; +const SCALAR_CDN_ORIGIN = 'https://cdn.jsdelivr.net'; + +const buildServerUrl = (baseConfig: IBaseConfig | undefined, req: Request): string | undefined => { + const publicOrigin = baseConfig?.publicOrigin; + if (publicOrigin) return publicOrigin; + + const host = req.get('host'); + if (!host) return undefined; + + return `${req.protocol}://${host}`; +}; + +const buildDocsCsp = (nonce: string): string => + [ + "default-src 'self'", + "base-uri 'self'", + "frame-ancestors 'self'", + "object-src 'none'", + "img-src 'self' data: https:", + "font-src 'self' data: https:", + "style-src 'self' https: 'unsafe-inline'", + "connect-src 'self'", + `script-src 'self' ${SCALAR_CDN_ORIGIN} 'nonce-${nonce}'`, + `script-src-elem 'self' ${SCALAR_CDN_ORIGIN} 'nonce-${nonce}'`, + "script-src-attr 'none'", + ].join('; '); + +const buildScalarHtml = (specUrl: string, nonce: string): string => `<!doctype html> +<html> + <head> + <title>Teable v2 API + + + + +
+ + + + + +`; + +@Public() +@Controller(V2_BASE_PATH) +export class V2OpenApiController { + constructor(private readonly configService: ConfigService) {} + + @Get('openapi.json') + @Header('Content-Type', 'application/json') + async openapi(@Req() req: Request) { + const baseConfig = this.configService.get<IBaseConfig>('base'); + const serverUrl = buildServerUrl(baseConfig, req); + + const serverBaseUrl = serverUrl ? `${serverUrl.replace(/\/$/, '')}/${V2_BASE_PATH}` : undefined; + + return generateV2OpenApiDocument({ + servers: serverBaseUrl ? [{ url: serverBaseUrl }] : undefined, + }); + } + + @Get('docs') + @Header('Content-Type', 'text/html; charset=utf-8') + docs(@Res({ passthrough: true }) res: Response) { + const nonce = randomBytes(16).toString('base64'); + res.setHeader('Content-Security-Policy', buildDocsCsp(nonce)); + return buildScalarHtml(OPENAPI_SPEC_PATH, nonce); + } +} diff --git a/apps/nestjs-backend/src/features/v2/v2-query-bus-tracing.middleware.ts b/apps/nestjs-backend/src/features/v2/v2-query-bus-tracing.middleware.ts new file mode 100644 index 0000000000..616667fc25 --- /dev/null +++ b/apps/nestjs-backend/src/features/v2/v2-query-bus-tracing.middleware.ts @@ -0,0 +1,43 @@ +import type { QueryBusNext, IQueryBusMiddleware, IExecutionContext } from '@teable/v2-core'; + +const describeError = (error: unknown): string => { + if (error instanceof Error) return error.message || error.name; + if (typeof error === 'string') return error; + try { + return JSON.stringify(error) ?? String(error); + } catch { + return String(error); + } +}; + +export class QueryBusTracingMiddleware implements IQueryBusMiddleware { + async handle<TQuery, TResult>( + context: IExecutionContext, + query: TQuery, + next: QueryBusNext<TQuery, TResult> + ) { + const tracer = context.tracer; + if (!tracer) { + return next(context, query); + } + + const queryName = + (query as { constructor?: { name?: string } }).constructor?.name ??
'UnknownQuery'; + const span = tracer.startSpan(`teable.query.${queryName}`, { + query: queryName, + }); + + try { + const result = await next(context, query); + if (result.isErr()) { + span.recordError(result.error.message ?? 'Unknown error'); + } + return result; + } catch (error) { + span.recordError(describeError(error)); + throw error; + } finally { + span.end(); + } + } +} diff --git a/apps/nestjs-backend/src/features/v2/v2-tracer.adapter.ts b/apps/nestjs-backend/src/features/v2/v2-tracer.adapter.ts new file mode 100644 index 0000000000..349ad62d66 --- /dev/null +++ b/apps/nestjs-backend/src/features/v2/v2-tracer.adapter.ts @@ -0,0 +1,47 @@ +import type { Span as ApiSpan } from '@opentelemetry/api'; +import { SpanStatusCode, context as otelContext, trace } from '@opentelemetry/api'; +import type { ISpan, ITracer, SpanAttributeValue, SpanAttributes } from '@teable/v2-core'; + +class OpenTelemetrySpan implements ISpan { + constructor(public readonly span: ApiSpan) {} + + setAttribute(key: string, value: SpanAttributeValue): void { + this.span.setAttribute(key, value); + } + + setAttributes(attributes: SpanAttributes): void { + this.span.setAttributes(attributes); + } + + recordError(message: string): void { + this.span.recordException(message); + this.span.setStatus({ code: SpanStatusCode.ERROR, message }); + } + + end(): void { + this.span.end(); + } +} + +export class OpenTelemetryTracer implements ITracer { + constructor(private readonly name = 'v2-core') {} + + startSpan(name: string, attributes?: SpanAttributes): ISpan { + const tracer = trace.getTracer(this.name); + const span = tracer.startSpan(name, { attributes }, otelContext.active()); + return new OpenTelemetrySpan(span); + } + + async withSpan(span: ISpan, callback: () => Promise): Promise { + if (span instanceof OpenTelemetrySpan) { + return otelContext.with(trace.setSpan(otelContext.active(), span.span), callback); + } + return callback(); + } + + getActiveSpan(): ISpan | undefined { + const span 
= trace.getActiveSpan(); + if (!span) return undefined; + return new OpenTelemetrySpan(span); + } +} diff --git a/apps/nestjs-backend/src/features/v2/v2.controller.ts b/apps/nestjs-backend/src/features/v2/v2.controller.ts new file mode 100644 index 0000000000..81ed802eb9 --- /dev/null +++ b/apps/nestjs-backend/src/features/v2/v2.controller.ts @@ -0,0 +1,80 @@ +/* eslint-disable @typescript-eslint/ban-ts-comment */ +// @ts-nocheck +import { Controller } from '@nestjs/common'; +import { Implement, implement, ORPCError } from '@orpc/nest'; +import { v2Contract } from '@teable/v2-contract-http'; +import { + executeCreateTableEndpoint, + executeGetTableByIdEndpoint, +} from '@teable/v2-contract-http-implementation/handlers'; +import { ActorId, v2CoreTokens } from '@teable/v2-core'; +import type { + IQueryBus, + ICommandBus, + ITracer, +} from '@teable/v2-core' with { 'resolution-mode': 'import' }; +import { ClsService } from 'nestjs-cls'; +import type { IClsStore } from '../../types/cls'; +import { V2ContainerService } from './v2-container.service'; + +@Controller('api/v2') +export class V2Controller { + constructor( + private readonly v2Container: V2ContainerService, + private readonly cls: ClsService + ) {} + + @Implement(v2Contract.tables) + tables() { + return { + create: implement(v2Contract.tables.create).handler(async ({ input }) => { + const container = await this.v2Container.getContainer(); + const commandBus = container.resolve(v2CoreTokens.commandBus); + const tracer = container.resolve(v2CoreTokens.tracer); + const actorIdResult = ActorId.create(this.cls.get('user.id')); + if (actorIdResult.isErr()) { + throw new ORPCError('INTERNAL_SERVER_ERROR', { message: actorIdResult.error }); + } + const result = await executeCreateTableEndpoint( + { actorId: actorIdResult.value, tracer }, + input, + commandBus + ); + + if (result.status === 201) return result.body; + + if (result.status === 400) { + throw new ORPCError('BAD_REQUEST', { message: result.body.error }); + 
} + + throw new ORPCError('INTERNAL_SERVER_ERROR', { message: result.body.error }); + }), + getById: implement(v2Contract.tables.getById).handler(async ({ input }) => { + const container = await this.v2Container.getContainer(); + const queryBus = container.resolve(v2CoreTokens.queryBus); + const tracer = container.resolve(v2CoreTokens.tracer); + const actorIdResult = ActorId.create(this.cls.get('user.id')); + if (actorIdResult.isErr()) { + throw new ORPCError('INTERNAL_SERVER_ERROR', { message: actorIdResult.error }); + } + const result = await executeGetTableByIdEndpoint( + { actorId: actorIdResult.value, tracer }, + input, + queryBus + ); + if (result.status === 200) return result.body; + + if (result.status === 400) { + throw new ORPCError('BAD_REQUEST', { message: result.body.error }); + } + + if (result.status === 404) { + throw new ORPCError('NOT_FOUND', { message: result.body.error }); + } + + // Fallback for any unexpected status from the endpoint executor + throw new ORPCError('INTERNAL_SERVER_ERROR', { message: 'Unexpected response status' }); + }), + }; + } +} diff --git a/apps/nestjs-backend/src/features/v2/v2.module.ts b/apps/nestjs-backend/src/features/v2/v2.module.ts new file mode 100644 index 0000000000..33beacf50d --- /dev/null +++ b/apps/nestjs-backend/src/features/v2/v2.module.ts @@ -0,0 +1,85 @@ +import { Module } from '@nestjs/common'; +import { ORPCModule } from '@orpc/nest'; +import type { Response } from 'express'; +import { ShareDbModule } from '../../share-db/share-db.module'; +import { V2ContainerService } from './v2-container.service'; +import { V2OpenApiController } from './v2-openapi.controller'; + +const isRecord = (value: unknown): value is Record<string, unknown> => + typeof value === 'object' && value !== null; + +const formatIssuePath = (path: unknown): string => { + if (typeof path === 'string') return path; + if (!Array.isArray(path) || path.length === 0) return ''; + + let formatted = ''; + for (const segment of path) { + if (typeof segment === 'number') { + formatted +=
`[${segment}]`; + continue; + } + const text = String(segment); + formatted = formatted ? `${formatted}.${text}` : text; + } + + return formatted; +}; + +const formatIssue = (issue: unknown): string | null => { + if (!isRecord(issue)) return null; + + const message = typeof issue.message === 'string' ? issue.message : ''; + const path = formatIssuePath(issue.path); + + if (message && path) return `${path}: ${message}`; + if (message) return message; + if (path) return path; + return null; +}; + +const formatIssues = (data: unknown): string[] => { + if (!isRecord(data)) return []; + const issues = data.issues; + if (!Array.isArray(issues)) return []; + + return issues.map(formatIssue).filter((issue): issue is string => Boolean(issue)); +}; + +const toErrorMessage = (body: unknown): string => { + if (typeof body === 'string') return body; + if (!isRecord(body)) return 'Unexpected error'; + + const message = typeof body.message === 'string' ? body.message : 'Unexpected error'; + const issues = formatIssues(body.data); + if (issues.length > 0) return `${message}: ${issues.join('; ')}`; + + return message; +}; + +@Module({ + imports: [ + ORPCModule.forRoot({ + context: {}, + sendResponseInterceptors: [ + async ({ response, standardResponse, next }) => { + if (standardResponse.status < 400) return next(); + + const expressResponse = response as Response; + expressResponse.status(standardResponse.status); + for (const [key, value] of Object.entries(standardResponse.headers)) { + if (value !== undefined) { + expressResponse.setHeader(key, value); + } + } + + return { ok: false as const, error: toErrorMessage(standardResponse.body) }; + }, + ], + }), + ShareDbModule, + ], + controllers: [V2OpenApiController], + providers: [V2ContainerService], + exports: [V2ContainerService], +}) +export class V2Module {} diff --git a/apps/nestjs-backend/src/tracing.ts b/apps/nestjs-backend/src/tracing.ts index 7c3f4583e9..9113d4a5c5 100644 --- a/apps/nestjs-backend/src/tracing.ts +++ 
b/apps/nestjs-backend/src/tracing.ts @@ -6,6 +6,7 @@ import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http'; import { ExpressInstrumentation, ExpressLayerType } from '@opentelemetry/instrumentation-express'; import { HttpInstrumentation } from '@opentelemetry/instrumentation-http'; import { NestInstrumentation } from '@opentelemetry/instrumentation-nestjs-core'; +import { PgInstrumentation } from '@opentelemetry/instrumentation-pg'; import { PinoInstrumentation } from '@opentelemetry/instrumentation-pino'; import { resourceFromAttributes } from '@opentelemetry/resources'; import * as opentelemetry from '@opentelemetry/sdk-node'; @@ -124,6 +125,9 @@ const otelSDK = new opentelemetry.NodeSDK({ }), new NestInstrumentation(), new PrismaInstrumentation(), + new PgInstrumentation({ + enhancedDatabaseReporting: true, // Records SQL; ensure sensitive data is scrubbed. + }), new PinoInstrumentation(), ], resource: resourceFromAttributes({ diff --git a/apps/nestjs-backend/src/ws/ws.gateway.ts b/apps/nestjs-backend/src/ws/ws.gateway.ts index 7bd83280d7..f7e47686a2 100644 --- a/apps/nestjs-backend/src/ws/ws.gateway.ts +++ b/apps/nestjs-backend/src/ws/ws.gateway.ts @@ -1,132 +1,45 @@ -import type http from 'http'; -import type { AdaptableWebSocket } from '@an-epiphany/websocket-json-stream'; -import { WebSocketJSONStream } from '@an-epiphany/websocket-json-stream'; -import type { OnModuleDestroy, OnModuleInit } from '@nestjs/common'; -import { Injectable, Logger } from '@nestjs/common'; -import { HttpAdapterHost } from '@nestjs/core'; +import { Logger } from '@nestjs/common'; +import type { OnGatewayConnection, OnGatewayDisconnect, OnGatewayInit } from '@nestjs/websockets'; +import { WebSocketGateway } from '@nestjs/websockets'; +import { ShareDbWebSocketServer } from '@teable/v2-adapter-realtime-sharedb'; import type { Request } from 'express'; -import sockjs from 'sockjs'; +import type WebSocketLib from 'ws'; import { ShareDbService } from 
'../share-db/share-db.service'; -@Injectable() -export class WsGateway implements OnModuleInit, OnModuleDestroy { +@WebSocketGateway({ path: '/socket', perMessageDeflate: true }) +export class WsGateway implements OnGatewayInit, OnGatewayConnection, OnGatewayDisconnect { private logger = new Logger(WsGateway.name); - private sockjsServer: sockjs.Server | null = null; - private readonly activeConnections = new Set(); + private readonly shareDbWebSocket: ShareDbWebSocketServer; - constructor( - private readonly shareDb: ShareDbService, - private readonly httpAdapterHost: HttpAdapterHost - ) {} - - onModuleInit() { - const httpServer = this.httpAdapterHost.httpAdapter.getHttpServer() as http.Server; - - // SockJS server configuration for collaborative data sync (similar to Airtable) - // - transports: Only websocket and xhr-streaming (xhr-polling excluded for performance) - // - response_limit: 1MB to handle large batch operations (table sync, bulk row updates) - this.sockjsServer = sockjs.createServer({ - prefix: '/socket', - transports: ['websocket', 'xhr-streaming'], - response_limit: 2 * 1024 * 1024, // 2MB for large collaborative payloads - log: (severity: string, message: string) => { - if (severity === 'error') { - this.logger.error(message); - } else if (severity === 'info') { - this.logger.log(message); - } else { - this.logger.debug(message); - } - }, - // eslint-disable-next-line @typescript-eslint/naming-convention - } as sockjs.ServerOptions & { transports: string[]; response_limit: number }); - - this.sockjsServer.on('connection', this.handleConnection); - this.sockjsServer.installHandlers(httpServer); - this.logger.log('WsGateway (SockJS) initialized'); + constructor(private readonly shareDb: ShareDbService) { + this.shareDbWebSocket = new ShareDbWebSocketServer(shareDb); } - private handleConnection = (conn: sockjs.Connection) => { - if (!conn) return; + handleDisconnect() { + this.logger.log('ws:on:close'); + } - this.activeConnections.add(conn); - 
this.logger.log(`sockjs:on:connection (active: ${this.activeConnections.size})`); + handleConnection(client: unknown) { + this.logger.log('ws:on:connection', client); + } - - // Handle connection close to clean up tracking - conn.on('close', () => { - this.activeConnections.delete(conn); - this.logger.log(`sockjs:on:close (active: ${this.activeConnections.size})`); + afterInit(server: WebSocketLib.Server) { + this.logger.log('WsGateway afterInit'); + server.on('connection', async (webSocket, request: Request) => { + try { + this.logger.log('ws:on:connection'); + this.shareDbWebSocket.handleConnection(webSocket, request); + } catch (error) { + // Serialize a message; JSON.stringify of a raw Error yields "{}" + webSocket.send(JSON.stringify({ error: error instanceof Error ? error.message : String(error) })); + webSocket.close(); + } }); - - try { - const stream = new WebSocketJSONStream(conn as unknown as AdaptableWebSocket, { - adapterType: 'sockjs-node', - }); - - // Extract request with headers (including cookies for auth) - const request = this.getRequestFromConnection(conn); - - this.shareDb.listen(stream, request); - } catch (error) { - this.logger.error('Connection error', error); - conn.write(JSON.stringify({ error })); - conn.close(); - this.activeConnections.delete(conn); - } - }; - - /** - * Extract HTTP request from SockJS connection.
- * - * SockJS transports provide request access differently: - * - XHR (xhr-polling, xhr-streaming): Full request at _session.recv.request - * - WebSocket: Request stored in faye-websocket driver at _session.recv.ws._driver._request - * - * @see https://github.com/sockjs/sockjs-node/blob/main/lib/transport/response-receiver.js - * @see https://github.com/sockjs/sockjs-node/blob/main/lib/transport/websocket.js - * @see https://github.com/faye/faye-websocket-node (uses websocket-driver internally) - */ - private getRequestFromConnection(conn: sockjs.Connection): Request { - // eslint-disable-next-line @typescript-eslint/no-explicit-any - const recv = (conn as any)?._session?.recv; - - // XHR transports: ResponseReceiver stores full request with cookies - if (recv?.request) { - return recv.request as Request; - } - - // WebSocket transport: FayeWebsocket stores request in driver._request - // Path: recv.ws (FayeWebsocket) -> _driver (Hybi/Base) -> _request (IncomingMessage) - const wsRequest = recv?.ws?._driver?._request; - if (wsRequest) { - return wsRequest as Request; - } - - // Fallback: use connection's url and headers (no cookies) - this.logger.warn( - `Could not find original request for connection (protocol: ${conn.protocol}), falling back to filtered headers` - ); - return { - url: conn.url || '/socket', - headers: conn.headers || {}, - } as unknown as Request; } async onModuleDestroy() { try { - this.logger.log('Starting graceful shutdown...'); - - // Terminate all active connections - for (const conn of this.activeConnections) { - try { - conn.close(); - } catch { - // Ignore errors during connection close - } - } - this.activeConnections.clear(); - - // Close ShareDb + this.logger.log('Starting graceful shutdown...'); + // Close ShareDB using its callback-style close + await new Promise((resolve, reject) => { + this.shareDb.close((err) => { if (err) { @@ -137,12 +50,9 @@ export class WsGateway implements OnModuleInit, OnModuleDestroy { }); }); - // Clean up sockjs server reference
- this.sockjsServer = null; - - this.logger.log('Graceful shutdown completed'); + this.logger.log('Graceful shutdown completed'); } catch (err) { - this.logger.error('Module close error: ' + (err as Error).message, (err as Error)?.stack); + this.logger.error('Module close error: ' + (err as Error).message, (err as Error)?.stack); } } } diff --git a/apps/nestjs-backend/tsconfig.eslint.json b/apps/nestjs-backend/tsconfig.eslint.json index 49d4ca92e9..aa1f6f3b77 100644 --- a/apps/nestjs-backend/tsconfig.eslint.json +++ b/apps/nestjs-backend/tsconfig.eslint.json @@ -3,10 +3,11 @@ "extends": "../../tsconfig.base.json", "compilerOptions": { "target": "es6", - "moduleResolution": "Node", - "module": "CommonJS", + "module": "ESNext", + "moduleResolution": "bundler", "emitDecoratorMetadata": true, "experimentalDecorators": true, + "isolatedModules": false, "noEmit": false, "allowJs": false }, diff --git a/apps/nestjs-backend/tsconfig.json b/apps/nestjs-backend/tsconfig.json index 67e07dc802..0d45e024e8 100644 --- a/apps/nestjs-backend/tsconfig.json +++ b/apps/nestjs-backend/tsconfig.json @@ -2,13 +2,14 @@ "$schema": "https://json.schemastore.org/tsconfig", "extends": "../../tsconfig.base.json", "compilerOptions": { + "module": "ESNext", + "moduleResolution": "bundler", "emitDecoratorMetadata": true, "experimentalDecorators": true, + "isolatedModules": false, "target": "es2022", - "moduleResolution": "Node", "declaration": true, "declarationDir": "./dist", - "module": "CommonJS", "noEmit": false, "sourceMap": true, "allowJs": false, @@ -16,7 +17,12 @@ "paths": { "@teable/core": ["../../packages/core/src"], "@teable/openapi": ["../../packages/openapi/src"], - "@teable/db-main-prisma": ["../../packages/db-main-prisma/src"] + "@teable/db-main-prisma": ["../../packages/db-main-prisma/src"], + "@teable/v2-*": ["../../packages/v2/*/src/index"], + "@teable/v2-contract-http-implementation/handlers": [ + "../../packages/v2/contract-http-implementation/src/handlers/index.ts" +
], + "@teable/formula": ["../../packages/formula/src"] }, "types": ["vitest/globals", "node"] }, diff --git a/apps/nestjs-backend/vitest-bench.config.ts b/apps/nestjs-backend/vitest-bench.config.ts index 56bc66438e..3ab878ec0c 100644 --- a/apps/nestjs-backend/vitest-bench.config.ts +++ b/apps/nestjs-backend/vitest-bench.config.ts @@ -1,5 +1,7 @@ +/* eslint-disable @typescript-eslint/naming-convention */ import swc from 'unplugin-swc'; import tsconfigPaths from 'vite-tsconfig-paths'; +import type { Plugin } from 'vitest/config'; import { configDefaults, defineConfig } from 'vitest/config'; const benchFiles = ['**/test/**/*.bench.{js,ts}']; @@ -10,7 +12,7 @@ export default defineConfig({ jsc: { target: 'es2022', }, - }), + }) as unknown as Plugin, tsconfigPaths(), ], cacheDir: '../../.cache/vitest/nestjs-backend/bench', diff --git a/apps/nestjs-backend/webpack.swc.js b/apps/nestjs-backend/webpack.swc.js index 87bdd2dd4f..9e265edcdd 100644 --- a/apps/nestjs-backend/webpack.swc.js +++ b/apps/nestjs-backend/webpack.swc.js @@ -15,6 +15,21 @@ module.exports = function (options, webpack) { return { ...options, + resolve: { + ...options.resolve, + conditionNames: (() => { + const base = options.resolve?.conditionNames ?? 
['require', 'node', 'default']; + if (base.includes('import')) return base; + const next = [...base]; + const defaultIndex = next.indexOf('default'); + if (defaultIndex === -1) { + next.push('import'); + } else { + next.splice(defaultIndex, 0, 'import'); + } + return next; + })(), + }, entry: { index: ['webpack/hot/poll?100', options.entry], ...workerEntries, @@ -27,7 +42,7 @@ module.exports = function (options, webpack) { devtool: 'eval-cheap-module-source-map', externals: [ nodeExternals({ - allowlist: ['webpack/hot/poll?100', /^@teable/], + allowlist: ['webpack/hot/poll?100', /^@teable/, /^@orpc/], }), ], // ignore tests hot reload diff --git a/apps/nextjs-app/tsconfig.json b/apps/nextjs-app/tsconfig.json index 0054204429..e2e1354678 100644 --- a/apps/nextjs-app/tsconfig.json +++ b/apps/nextjs-app/tsconfig.json @@ -31,7 +31,8 @@ "@teable/db-main-prisma": ["../../../packages/db-main-prisma/src/index"], "@teable/core": ["../../../packages/core/src/index"], "@teable/openapi": ["../../../packages/openapi/src/index"], - "@teable/icons": ["../../../packages/icons/src/index"] + "@teable/icons": ["../../../packages/icons/src/index"], + "@teable/formula": ["../../../packages/formula/src/index"] }, "plugins": [ { diff --git a/apps/playground/.cta.json b/apps/playground/.cta.json new file mode 100644 index 0000000000..eb9b9d20e2 --- /dev/null +++ b/apps/playground/.cta.json @@ -0,0 +1,12 @@ +{ + "projectName": "playground", + "mode": "file-router", + "typescript": true, + "tailwind": true, + "packageManager": "pnpm", + "git": false, + "addOnOptions": {}, + "version": 1, + "framework": "react-cra", + "chosenAddOns": ["start", "shadcn", "oRPC", "nitro", "tanstack-query"] +} diff --git a/apps/playground/.cursorrules b/apps/playground/.cursorrules new file mode 100644 index 0000000000..2015960219 --- /dev/null +++ b/apps/playground/.cursorrules @@ -0,0 +1,7 @@ +# shadcn instructions + +Use the latest version of Shadcn to install new components, like this command to add a 
button component: + +```bash +pnpm dlx shadcn@latest add button +``` diff --git a/apps/playground/.env.development b/apps/playground/.env.development new file mode 100644 index 0000000000..d9cac1307d --- /dev/null +++ b/apps/playground/.env.development @@ -0,0 +1,13 @@ +DATABASE_URL=postgres://teable:teable@localhost:5432/teable +OTEL_SERVICE_NAME=teable-playground +OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318/v1/traces +OTEL_EXPORTER_OTLP_HEADERS= +BUILD_VERSION=local +LOG_LEVEL=debug +VITE_OTEL_SERVICE_NAME=teable-playground-web +VITE_OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318/v1/traces +VITE_OTEL_EXPORTER_OTLP_HEADERS= +VITE_BUILD_VERSION=local +VITE_APP_URL=http://localhost:3000 +VITE_LOG_LEVEL=debug +PLAYGROUND_TRACE_LINK_BASE_URL=http://localhost:16686 diff --git a/apps/playground/.gitignore b/apps/playground/.gitignore new file mode 100644 index 0000000000..6221ecbd0e --- /dev/null +++ b/apps/playground/.gitignore @@ -0,0 +1,13 @@ +node_modules +.DS_Store +dist +dist-ssr +*.local +count.txt +.env +.nitro +.tanstack +.wrangler +.output +.vinxi +todos.json diff --git a/apps/playground/.vscode/settings.json b/apps/playground/.vscode/settings.json new file mode 100644 index 0000000000..00b5278e58 --- /dev/null +++ b/apps/playground/.vscode/settings.json @@ -0,0 +1,11 @@ +{ + "files.watcherExclude": { + "**/routeTree.gen.ts": true + }, + "search.exclude": { + "**/routeTree.gen.ts": true + }, + "files.readonlyInclude": { + "**/routeTree.gen.ts": true + } +} diff --git a/apps/playground/README.md b/apps/playground/README.md new file mode 100644 index 0000000000..8f81e27fc6 --- /dev/null +++ b/apps/playground/README.md @@ -0,0 +1,297 @@ +Welcome to your new TanStack app! + +## Playground Hook Notes (AI) + +- Prefer hooks from `usehooks-ts` before writing custom hooks. +- Clipboard: use `useCopyToClipboard`. +- Debounce: use `useDebounceValue` or `useDebounceCallback`. 
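+
+For example, a minimal debounced-search sketch using `useDebounceValue` (the `Search` and `Results` components are hypothetical; 300ms is an arbitrary delay):
+
+```tsx
+import { useDebounceValue } from "usehooks-ts";
+
+function Search() {
+  // setQuery receives every keystroke; debouncedQuery trails it by 300ms,
+  // so anything keyed off debouncedQuery runs at most once per pause.
+  const [debouncedQuery, setQuery] = useDebounceValue("", 300);
+
+  return (
+    <div>
+      <input onChange={(e) => setQuery(e.target.value)} />
+      <Results query={debouncedQuery} />
+    </div>
+  );
+}
+```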
+ +# Getting Started + +To run this application: + +```bash +pnpm install +pnpm start +``` + +# Building For Production + +To build this application for production: + +```bash +pnpm build +``` + +## Testing + +This project uses [Vitest](https://vitest.dev/) for testing. You can run the tests with: + +```bash +pnpm test +``` + +## Styling + +This project uses [Tailwind CSS](https://tailwindcss.com/) for styling. + +## Shadcn + +Add components using the latest version of [Shadcn](https://ui.shadcn.com/). + +```bash +pnpm dlx shadcn@latest add button +``` + +## Routing + +This project uses [TanStack Router](https://tanstack.com/router). The initial setup is a file-based router, which means that the routes are managed as files in `src/routes`. + +### Adding A Route + +To add a new route to your application just add a new file in the `./src/routes` directory. + +TanStack will automatically generate the content of the route file for you. + +Now that you have two routes you can use a `Link` component to navigate between them. + +### Adding Links + +To use SPA (Single Page Application) navigation you will need to import the `Link` component from `@tanstack/react-router`. + +```tsx +import { Link } from "@tanstack/react-router"; +``` + +Then anywhere in your JSX you can use it like so: + +```tsx +<Link to="/about">About</Link> +``` + +This will create a link that will navigate to the `/about` route. + +More information on the `Link` component can be found in the [Link documentation](https://tanstack.com/router/v1/docs/framework/react/api/router/linkComponent). + +### Using A Layout + +In the File Based Routing setup the layout is located in `src/routes/__root.tsx`. Anything you add to the root route will appear in all the routes. The route content will appear in the JSX where you use the `<Outlet />` component.
+
+Here is an example layout that includes a header:
+
+```tsx
+import { Outlet, createRootRoute } from "@tanstack/react-router";
+import { TanStackRouterDevtools } from "@tanstack/react-router-devtools";
+
+import { Link } from "@tanstack/react-router";
+
+export const Route = createRootRoute({
+  component: () => (
+    <>
+      <header>
+        <nav>
+          <Link to="/">Home</Link> <Link to="/about">About</Link>
+        </nav>
+      </header>
+      <Outlet />
+      <TanStackRouterDevtools />
+    </>
+  ),
+});
+```
+
+The `<TanStackRouterDevtools />` component is not required so you can remove it if you don't want it in your layout.
+
+More information on layouts can be found in the [Layouts documentation](https://tanstack.com/router/latest/docs/framework/react/guide/routing-concepts#layouts).
+
+## Data Fetching
+
+There are multiple ways to fetch data in your application. You can use TanStack Query to fetch data from a server. But you can also use the `loader` functionality built into TanStack Router to load the data for a route before it's rendered.
+
+For example:
+
+```tsx
+const peopleRoute = createRoute({
+  getParentRoute: () => rootRoute,
+  path: "/people",
+  loader: async () => {
+    const response = await fetch("https://swapi.dev/api/people");
+    return response.json() as Promise<{
+      results: {
+        name: string;
+      }[];
+    }>;
+  },
+  component: () => {
+    const data = peopleRoute.useLoaderData();
+    return (
+      <ul>
+        {data.results.map((person) => (
+          <li key={person.name}>{person.name}</li>
+        ))}
+      </ul>
+    );
+  },
+});
+```
+
+Loaders simplify your data fetching logic dramatically. Check out more information in the [Loader documentation](https://tanstack.com/router/latest/docs/framework/react/guide/data-loading#loader-parameters).
+
+### React-Query
+
+React-Query is an excellent addition or alternative to route loading and integrating it into your application is a breeze.
+
+First add your dependencies:
+
+```bash
+pnpm add @tanstack/react-query @tanstack/react-query-devtools
+```
+
+Next we'll need to create a query client and provider. We recommend putting those in `main.tsx`.
+
+```tsx
+import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
+
+// ...
+
+const queryClient = new QueryClient();
+
+// ...
+
+if (!rootElement.innerHTML) {
+  const root = ReactDOM.createRoot(rootElement);
+
+  root.render(
+    <QueryClientProvider client={queryClient}>
+      <App />
+    </QueryClientProvider>
+  );
+}
+```
+
+You can also add TanStack Query Devtools to the root route (optional).
+
+```tsx
+import { ReactQueryDevtools } from "@tanstack/react-query-devtools";
+
+const rootRoute = createRootRoute({
+  component: () => (
+    <>
+      <Outlet />
+      <ReactQueryDevtools />
+    </>
+  ),
+});
+```
+
+Now you can use `useQuery` to fetch your data.
+
+```tsx
+import { useQuery } from "@tanstack/react-query";
+
+import "./App.css";
+
+function App() {
+  const { data } = useQuery({
+    queryKey: ["people"],
+    queryFn: () =>
+      fetch("https://swapi.dev/api/people")
+        .then((res) => res.json())
+        .then((data) => data.results as { name: string }[]),
+    initialData: [],
+  });
+
+  return (
+    <div>
+      <ul>
+        {data.map((person) => (
+          <li key={person.name}>{person.name}</li>
+        ))}
+      </ul>
+    </div>
+ ); +} + +export default App; +``` + +You can find out everything you need to know on how to use React-Query in the [React-Query documentation](https://tanstack.com/query/latest/docs/framework/react/overview). + +## State Management + +Another common requirement for React applications is state management. There are many options for state management in React. TanStack Store provides a great starting point for your project. + +First you need to add TanStack Store as a dependency: + +```bash +pnpm add @tanstack/store +``` + +Now let's create a simple counter in the `src/App.tsx` file as a demonstration. + +```tsx +import { useStore } from "@tanstack/react-store"; +import { Store } from "@tanstack/store"; +import "./App.css"; + +const countStore = new Store(0); + +function App() { + const count = useStore(countStore); + return ( +
+    <div>
+      <button onClick={() => countStore.setState((n) => n + 1)}>
+        Increment - {count}
+      </button>
+    </div>
+ ); +} + +export default App; +``` + +One of the many nice features of TanStack Store is the ability to derive state from other state. That derived state will update when the base state updates. + +Let's check this out by doubling the count using derived state. + +```tsx +import { useStore } from "@tanstack/react-store"; +import { Store, Derived } from "@tanstack/store"; +import "./App.css"; + +const countStore = new Store(0); + +const doubledStore = new Derived({ + fn: () => countStore.state * 2, + deps: [countStore], +}); +doubledStore.mount(); + +function App() { + const count = useStore(countStore); + const doubledCount = useStore(doubledStore); + + return ( +
+    <div>
+      <button onClick={() => countStore.setState((n) => n + 1)}>
+        Increment - {count}
+      </button>
+      <div>Doubled - {doubledCount}</div>
+    </div>
+ ); +} + +export default App; +``` + +We use the `Derived` class to create a new store that is derived from another store. The `Derived` class has a `mount` method that will start the derived store updating. + +Once we've created the derived store we can use it in the `App` component just like we would any other store using the `useStore` hook. + +You can find out everything you need to know on how to use TanStack Store in the [TanStack Store documentation](https://tanstack.com/store/latest). + +# Demo files + +Files prefixed with `demo` can be safely deleted. They are there to provide a starting point for you to play around with the features you've installed. + +# Learn More + +You can learn more about all of the offerings from TanStack in the [TanStack documentation](https://tanstack.com). diff --git a/apps/playground/components.json b/apps/playground/components.json new file mode 100644 index 0000000000..58bb3a2732 --- /dev/null +++ b/apps/playground/components.json @@ -0,0 +1,21 @@ +{ + "$schema": "https://ui.shadcn.com/schema.json", + "style": "new-york", + "rsc": false, + "tsx": true, + "tailwind": { + "config": "", + "css": "src/styles.css", + "baseColor": "zinc", + "cssVariables": true, + "prefix": "" + }, + "aliases": { + "components": "@/components", + "utils": "@/lib/utils", + "ui": "@/components/ui", + "lib": "@/lib", + "hooks": "@/hooks" + }, + "iconLibrary": "lucide" +} diff --git a/apps/playground/package.json b/apps/playground/package.json new file mode 100644 index 0000000000..1ec595c511 --- /dev/null +++ b/apps/playground/package.json @@ -0,0 +1,120 @@ +{ + "name": "playground", + "private": true, + "type": "module", + "scripts": { + "dev": "vite dev", + "build": "vite build", + "preview": "vite preview", + "test": "vitest run" + }, + "dependencies": { + "@electric-sql/pglite": "0.3.14", + "@hookform/resolvers": "3.3.4", + "@opentelemetry/api": "1.9.0", + "@opentelemetry/exporter-trace-otlp-http": "0.201.1", + "@opentelemetry/instrumentation": 
"0.201.1", + "@opentelemetry/instrumentation-fetch": "0.201.1", + "@opentelemetry/instrumentation-pg": "0.49.0", + "@opentelemetry/resources": "2.0.1", + "@opentelemetry/sdk-node": "0.201.1", + "@opentelemetry/sdk-trace-base": "2.0.1", + "@opentelemetry/sdk-trace-web": "2.0.1", + "@opentelemetry/semantic-conventions": "1.34.0", + "@orpc/client": "^1.13.0", + "@orpc/contract": "^1.13.0", + "@orpc/experimental-pino": "^1.13.0", + "@orpc/otel": "^1.13.0", + "@orpc/server": "^1.13.0", + "@orpc/tanstack-query": "^1.13.0", + "@radix-ui/react-alert-dialog": "1.0.5", + "@radix-ui/react-checkbox": "1.0.4", + "@radix-ui/react-context-menu": "2.1.5", + "@radix-ui/react-dialog": "1.0.5", + "@radix-ui/react-dropdown-menu": "2.0.6", + "@radix-ui/react-icons": "1.3.0", + "@radix-ui/react-label": "2.0.2", + "@radix-ui/react-popover": "1.0.7", + "@radix-ui/react-radio-group": "1.1.3", + "@radix-ui/react-scroll-area": "1.0.5", + "@radix-ui/react-select": "2.0.0", + "@radix-ui/react-separator": "1.0.3", + "@radix-ui/react-slider": "1.2.2", + "@radix-ui/react-slot": "1.0.2", + "@radix-ui/react-switch": "1.0.3", + "@radix-ui/react-tabs": "1.0.4", + "@radix-ui/react-tooltip": "1.0.7", + "@tailwindcss/vite": "^4.0.6", + "@tanstack/react-devtools": "^0.7.0", + "@tanstack/react-form": "^0.41.2", + "@tanstack/react-query": "^5.66.5", + "@tanstack/react-query-devtools": "^5.84.2", + "@tanstack/react-router": "^1.132.0", + "@tanstack/react-router-devtools": "^1.132.0", + "@tanstack/react-router-ssr-query": "^1.131.7", + "@tanstack/react-start": "^1.132.0", + "@tanstack/react-table": "8.11.7", + "@tanstack/router-plugin": "^1.132.0", + "@tanstack/zod-form-adapter": "^0.41.2", + "@teable/v2-adapter-db-postgres-pg": "workspace:^", + "@teable/v2-adapter-db-postgres-pglite": "workspace:^", + "@teable/v2-adapter-logger-pino": "workspace:^", + "@teable/v2-adapter-realtime-broadcastchannel": "workspace:^", + "@teable/v2-adapter-realtime-sharedb": "workspace:^", + "@teable/v2-container-browser": 
"workspace:^", + "@teable/v2-container-node": "workspace:^", + "@teable/v2-contract-http": "workspace:^", + "@teable/v2-contract-http-implementation": "workspace:^", + "@teable/v2-core": "workspace:^", + "@teable/v2-di": "workspace:^", + "@teable/v2-postgres-schema": "workspace:^", + "@teable/v2-table-templates": "workspace:^", + "@toon-format/toon": "2.1.0", + "class-variance-authority": "^0.7.1", + "clsx": "^2.1.1", + "cmdk": "^1.0.0", + "date-fns": "4.1.0", + "kysely": "0.28.9", + "lucide-react": "^0.544.0", + "nitro": "latest", + "nprogress": "0.2.0", + "nuqs": "^2.5.0", + "papaparse": "5.5.3", + "pino": "10.1.0", + "react": "^19.2.0", + "react-day-picker": "9.5.1", + "react-dom": "^19.2.0", + "react-hook-form": "7.51.1", + "react-json-view-lite": "^2.5.0", + "sharedb": "4.1.2", + "sonner": "1.7.3", + "sql-formatter": "^15.4.10", + "tailwind-merge": "^3.0.2", + "tailwindcss": "^4.0.6", + "ts-pattern": "5.1.1", + "tw-animate-css": "^1.3.6", + "usehooks-ts": "3.1.1", + "vite-tsconfig-paths": "^5.1.4", + "ws": "8.18.3", + "zod": "^4.1.8" + }, + "devDependencies": { + "@tanstack/devtools-vite": "^0.3.11", + "@testing-library/dom": "^10.4.0", + "@testing-library/react": "^16.2.0", + "@types/node": "^22.10.2", + "@types/nprogress": "0.2.3", + "@types/papaparse": "5.3.15", + "@types/react": "^19.2.0", + "@types/react-dom": "^19.2.0", + "@types/sharedb": "3.3.10", + "@types/ws": "8.5.12", + "@vitejs/plugin-react": "^5.0.4", + "jsdom": "^27.0.0", + "pino-pretty": "11.0.0", + "typescript": "^5.7.2", + "vite": "^7.1.7", + "vitest": "^3.0.5", + "web-vitals": "^5.1.0" + } +} diff --git a/apps/playground/public/favicon.ico b/apps/playground/public/favicon.ico new file mode 100644 index 0000000000..90916a8413 Binary files /dev/null and b/apps/playground/public/favicon.ico differ diff --git a/apps/playground/public/favicon.svg b/apps/playground/public/favicon.svg new file mode 100644 index 0000000000..f99282a062 --- /dev/null +++ b/apps/playground/public/favicon.svg @@ -0,0 +1 
@@ + \ No newline at end of file diff --git a/apps/playground/src/components/playground/ComputedTasksPanel.tsx b/apps/playground/src/components/playground/ComputedTasksPanel.tsx new file mode 100644 index 0000000000..7e0e47695b --- /dev/null +++ b/apps/playground/src/components/playground/ComputedTasksPanel.tsx @@ -0,0 +1,520 @@ +import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query'; +import { + AlertTriangle, + CheckCircle2, + Clock, + Loader2, + Play, + RefreshCw, + RotateCcw, + Trash2, + XCircle, +} from 'lucide-react'; +import { useState } from 'react'; +import { toast } from 'sonner'; + +import { Badge } from '@/components/ui/badge'; +import { Button } from '@/components/ui/button'; +import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card'; +import { + Dialog, + DialogContent, + DialogDescription, + DialogHeader, + DialogTitle, +} from '@/components/ui/dialog'; +import { ScrollArea } from '@/components/ui/scroll-area'; +import { Tabs, TabsContent, TabsList, TabsTrigger } from '@/components/ui/tabs'; +import { + Table, + TableBody, + TableCell, + TableHead, + TableHeader, + TableRow, +} from '@/components/ui/table'; + +type OutboxTask = { + id: string; + baseId: string; + seedTableId: string; + status: string; + changeType: string; + attempts: number; + maxAttempts: number; + lastError: string | null; + planHash: string; + runId: string; + createdAt: string; + updatedAt: string; + nextRunAt: string; + seedCount: number; +}; + +type DeadLetter = { + id: string; + baseId: string; + seedTableId: string; + status: string; + changeType: string; + attempts: number; + maxAttempts: number; + lastError: string | null; + planHash: string; + runId: string; + failedAt: string; + createdAt: string; + traceData: unknown | null; + seedCount: number; +}; + +type TraceStep = { + stepIndex: number; + level: number; + tableId: string; + fieldId: string; + sql?: string; + paramCount?: number; + dirtyRecordCount?: 
number; + durationMs?: number; + error?: string; +}; + +type TraceData = { + requestId?: string; + taskId?: string; + steps?: TraceStep[]; + attempts?: number; + totalDurationMs?: number; + finalError?: string; +}; + +const formatDate = (dateStr: string) => { + return new Date(dateStr).toLocaleString(); +}; + +const StatusBadge = ({ status }: { status: string }) => { + switch (status) { + case 'pending': + return ( + + + Pending + + ); + case 'processing': + return ( + + + Processing + + ); + case 'completed': + return ( + + + Completed + + ); + case 'failed': + return ( + + + Failed + + ); + default: + return {status}; + } +}; + +export function ComputedTasksPanel() { + const queryClient = useQueryClient(); + const [selectedDeadLetter, setSelectedDeadLetter] = useState(null); + + const outboxQuery = useQuery({ + queryKey: ['computed-tasks', 'outbox'], + queryFn: async () => { + const response = await fetch('/api/computed-tasks/outbox'); + if (!response.ok) { + throw new Error('Failed to fetch outbox tasks'); + } + return response.json() as Promise<{ items: OutboxTask[]; total: number }>; + }, + refetchInterval: 5000, + }); + + const deadLettersQuery = useQuery({ + queryKey: ['computed-tasks', 'dead-letters'], + queryFn: async () => { + const response = await fetch('/api/computed-tasks/dead-letters'); + if (!response.ok) { + throw new Error('Failed to fetch dead letters'); + } + return response.json() as Promise<{ items: DeadLetter[]; total: number }>; + }, + refetchInterval: 10000, + }); + + const retryMutation = useMutation({ + mutationFn: async (taskId: string) => { + const response = await fetch(`/api/computed-tasks/${taskId}/retry-now`, { + method: 'POST', + }); + if (!response.ok) { + throw new Error('Failed to retry task'); + } + return response.json(); + }, + onSuccess: () => { + toast.success('Task queued for immediate retry'); + void queryClient.invalidateQueries({ queryKey: ['computed-tasks'] }); + }, + onError: (error) => { + toast.error(`Failed to 
retry: ${error.message}`); + }, + }); + + const replayMutation = useMutation({ + mutationFn: async (taskId: string) => { + const response = await fetch(`/api/computed-tasks/dead-letters/${taskId}/replay`, { + method: 'POST', + }); + if (!response.ok) { + throw new Error('Failed to replay dead letter'); + } + return response.json(); + }, + onSuccess: () => { + toast.success('Dead letter replayed successfully'); + void queryClient.invalidateQueries({ queryKey: ['computed-tasks'] }); + }, + onError: (error) => { + toast.error(`Failed to replay: ${error.message}`); + }, + }); + + const deleteMutation = useMutation({ + mutationFn: async (taskId: string) => { + const response = await fetch(`/api/computed-tasks/dead-letters/${taskId}`, { + method: 'DELETE', + }); + if (!response.ok) { + throw new Error('Failed to delete dead letter'); + } + return response.json(); + }, + onSuccess: () => { + toast.success('Dead letter deleted'); + void queryClient.invalidateQueries({ queryKey: ['computed-tasks'] }); + }, + onError: (error) => { + toast.error(`Failed to delete: ${error.message}`); + }, + }); + + const handleRefresh = () => { + void queryClient.invalidateQueries({ queryKey: ['computed-tasks'] }); + }; + + const outboxTasks = outboxQuery.data?.items ?? []; + const deadLetters = deadLettersQuery.data?.items ?? []; + + return ( +
+
+
+

Computed Update Tasks

+

+ Monitor and manage computed field update tasks +

+
+ +
+ +
+ + + Pending + + +
+ {outboxTasks.filter((t) => t.status === 'pending').length} +
+
+
+ + + Processing + + +
+ {outboxTasks.filter((t) => t.status === 'processing').length} +
+
+
+ + + Dead Letters + + +
{deadLetters.length}
+
+
+
+ + + + + Outbox + {outboxTasks.length > 0 && ( + + {outboxTasks.length} + + )} + + + Dead Letters + {deadLetters.length > 0 && ( + + {deadLetters.length} + + )} + + + + + + + Outbox Tasks + + Tasks waiting to be processed or currently processing + + + + {outboxQuery.isLoading ? ( +
+ +
+ ) : outboxTasks.length === 0 ? ( +
No pending tasks
+ ) : ( + + + + + Run ID + Status + Change Type + Seeds + Attempts + Next Run + Error + Actions + + + + {outboxTasks.map((task) => ( + + {task.runId} + + + + + {task.changeType} + + {task.seedCount} + + {task.attempts}/{task.maxAttempts} + + {formatDate(task.nextRunAt)} + + {task.lastError} + + + + + + ))} + +
+
+ )} +
+
+
+ + + + + Dead Letters + Failed tasks that exceeded maximum retry attempts + + + {deadLettersQuery.isLoading ? ( +
+ +
+ ) : deadLetters.length === 0 ? ( +
No dead letters
+ ) : ( + + + + + Run ID + Change Type + Seeds + Attempts + Failed At + Error + Actions + + + + {deadLetters.map((dl) => ( + + {dl.runId} + + {dl.changeType} + + {dl.seedCount} + {dl.attempts} + {formatDate(dl.failedAt)} + + {dl.lastError} + + +
+ {dl.traceData && ( + + )} + + +
+
+
+ ))} +
+
+
+ )} +
+
+
+
+ + setSelectedDeadLetter(null)}> + + + Task Trace + Run ID: {selectedDeadLetter?.runId} + + + + + + +
+ ); +} + +function TraceView({ traceData }: { traceData: TraceData | null }) { + if (!traceData) { + return
No trace data available
; + } + + return ( +
+
+
+ Request ID: + {traceData.requestId ?? 'N/A'} +
+
+ Total Duration: + {traceData.totalDurationMs ? `${traceData.totalDurationMs}ms` : 'N/A'} +
+
+ Attempts: + {traceData.attempts ?? 'N/A'} +
+
+ + {traceData.finalError && ( +
+

Final Error

+

{traceData.finalError}

+
+ )} + + {traceData.steps && traceData.steps.length > 0 && ( +
+

Execution Steps

+ + + + # + Level + Table ID + Field ID + Duration + Status + + + + {traceData.steps.map((step, index) => ( + + {step.stepIndex} + {step.level} + {step.tableId} + {step.fieldId} + {step.durationMs ? `${step.durationMs}ms` : '-'} + + {step.error ? ( + Error + ) : ( + + OK + + )} + + + ))} + +
+
+ )} +
+ ); +} diff --git a/apps/playground/src/components/playground/CreateTableDropdown.tsx b/apps/playground/src/components/playground/CreateTableDropdown.tsx new file mode 100644 index 0000000000..5fd2385480 --- /dev/null +++ b/apps/playground/src/components/playground/CreateTableDropdown.tsx @@ -0,0 +1,297 @@ +import type { TableTemplateDefinition } from '@teable/v2-table-templates'; +import type { VariantProps } from 'class-variance-authority'; +import { FileUp, Loader2, Plus } from 'lucide-react'; +import { useEffect, useMemo, useState } from 'react'; + +import { cn } from '@/lib/utils'; +import { Badge } from '@/components/ui/badge'; +import { Button, buttonVariants } from '@/components/ui/button'; +import { + Dialog, + DialogContent, + DialogDescription, + DialogFooter, + DialogHeader, + DialogTitle, + DialogTrigger, +} from '@/components/ui/dialog'; +import { ScrollArea } from '@/components/ui/scroll-area'; +import { Switch } from '@/components/ui/switch'; +import { ImportCsvDialog } from './ImportCsvDialog'; + +type CreateTableDropdownProps = { + templates: ReadonlyArray; + isCreating: boolean; + onSelect: (template: TableTemplateDefinition, options: { includeRecords: boolean }) => void; + onImportCsv?: (data: { tableName: string; csvData?: string; csvUrl?: string }) => Promise; + label?: string; + variant?: VariantProps['variant']; + size?: VariantProps['size']; + className?: string; +}; + +export function CreateTableDropdown({ + templates, + isCreating, + onSelect, + onImportCsv, + label = 'Create table', + variant = 'default', + size = 'sm', + className, +}: CreateTableDropdownProps) { + const [open, setOpen] = useState(false); + const [selectedKey, setSelectedKey] = useState(templates[0]?.key ?? ''); + const selectedTemplate = useMemo( + () => templates.find((template) => template.key === selectedKey) ?? templates[0] ?? null, + [selectedKey, templates] + ); + const [includeRecords, setIncludeRecords] = useState((templates[0]?.defaultRecordCount ?? 
0) > 0); + const [seedSelectionLocked, setSeedSelectionLocked] = useState(false); + const [pendingClose, setPendingClose] = useState(false); + const [createStarted, setCreateStarted] = useState(false); + const isBusy = isCreating || pendingClose; + + useEffect(() => { + if (!templates.length) { + setSelectedKey(''); + return; + } + if (!selectedTemplate) { + setSelectedKey(templates[0]!.key); + } + }, [selectedTemplate, templates]); + + useEffect(() => { + if (!selectedTemplate) return; + if (!seedSelectionLocked) { + setIncludeRecords((selectedTemplate.defaultRecordCount ?? 0) > 0); + } + }, [seedSelectionLocked, selectedTemplate?.key]); + + useEffect(() => { + if (!pendingClose) { + if (createStarted) { + setCreateStarted(false); + } + return; + } + if (isCreating) { + if (!createStarted) { + setCreateStarted(true); + } + return; + } + if (createStarted) { + setOpen(false); + setPendingClose(false); + setCreateStarted(false); + } + }, [createStarted, isCreating, pendingClose]); + + const supportsRecords = (selectedTemplate?.defaultRecordCount ?? 0) > 0; + const selectedTables = selectedTemplate?.tables ?? []; + + const handleCreate = () => { + if (!selectedTemplate) return; + onSelect(selectedTemplate, { + includeRecords: includeRecords && supportsRecords, + }); + setPendingClose(true); + }; + + return ( +
+ { + if (isBusy) return; + setOpen(nextOpen); + if (!nextOpen) { + setPendingClose(false); + setCreateStarted(false); + } + }} + > + + + + + + Create table + + Pick a template and optionally seed it with example records. + + +
+
+
+ Templates +
+ +
+ {templates.map((template) => { + const selected = template.key === selectedTemplate?.key; + const seedCount = template.defaultRecordCount ?? 0; + const tableCount = template.tables.length; + return ( + + ); + })} +
+
+
+ +
+ +
+
+
+ {selectedTemplate?.name ?? 'Select a template'} +
+ {selectedTemplate ? ( +
+ {selectedTemplate.description} +
+ ) : null} +
+ +
+
Tables
+
+ {selectedTables.map((table) => ( +
+
+
+ {table.name} +
+
+ + {table.fieldCount} fields + + {table.defaultRecordCount > 0 ? ( + + {table.defaultRecordCount} records + + ) : null} +
+
+ {table.description ? ( +
+ {table.description} +
+ ) : null} +
+ ))} +
+
+ +
+
+
+
Seed records
+
+ {supportsRecords + ? 'Add sample records from the selected template.' + : 'This template ships without sample records.'} +
+
+ { + setSeedSelectionLocked(true); + setIncludeRecords(checked); + }} + disabled={!supportsRecords || isBusy} + /> +
+
+
+
+
+
+ + + + +
+
+ {onImportCsv && ( + + + Import CSV + + } + /> + )} +
+ ); +} diff --git a/apps/playground/src/components/playground/ExplainResultPanel.tsx b/apps/playground/src/components/playground/ExplainResultPanel.tsx new file mode 100644 index 0000000000..857c7095fe --- /dev/null +++ b/apps/playground/src/components/playground/ExplainResultPanel.tsx @@ -0,0 +1,1082 @@ +import { useMemo, useState, useCallback } from 'react'; +import type { IExplainResultDto } from '@teable/v2-contract-http'; +import { format } from 'sql-formatter'; +import { encode } from '@toon-format/toon'; +import { + ChevronDown, + ChevronRight, + Clock, + Database, + GitBranch, + AlertTriangle, + Zap, + Layers, + Lock, + Table2, + Copy, + Code, + LayoutDashboard, +} from 'lucide-react'; +import { cn } from '@/lib/utils'; +import { Badge } from '@/components/ui/badge'; +import { Button } from '@/components/ui/button'; +import { ScrollArea } from '@/components/ui/scroll-area'; +import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'; +import { + DropdownMenu, + DropdownMenuContent, + DropdownMenuItem, + DropdownMenuTrigger, +} from '@/components/ui/dropdown-menu'; +import { Tabs, TabsList, TabsTrigger } from '@/components/ui/tabs'; +import { getFieldTypeIcon } from '@/lib/fieldTypeIcons'; +import { + maskPlaygroundDbUrl, + readPlaygroundDbUrl, + readPlaygroundDbUrlFromEnv, +} from '@/lib/playground/databaseUrl'; +import { usePlaygroundEnvironment } from '@/lib/playground/environment'; + +interface ExplainResultPanelProps { + result: IExplainResultDto; + className?: string; +} + +type ComputedUpdateReason = NonNullable; + +function ComplexityScoreCard({ level, score }: { level: string; score: number }) { + const config: Record = { + trivial: { + bg: 'bg-gradient-to-br from-green-50 to-green-100', + border: 'border-green-200', + text: 'text-green-700', + label: 'Trivial', + }, + low: { + bg: 'bg-gradient-to-br from-blue-50 to-blue-100', + border: 'border-blue-200', + text: 'text-blue-700', + label: 'Low', + }, + medium: { + bg: 
'bg-gradient-to-br from-yellow-50 to-yellow-100', + border: 'border-yellow-200', + text: 'text-yellow-700', + label: 'Medium', + }, + high: { + bg: 'bg-gradient-to-br from-orange-50 to-orange-100', + border: 'border-orange-200', + text: 'text-orange-700', + label: 'High', + }, + very_high: { + bg: 'bg-gradient-to-br from-red-50 to-red-100', + border: 'border-red-200', + text: 'text-red-700', + label: 'Very High', + }, + }; + + const c = config[level] ?? config.medium; + + return ( +
+
+ + + Complexity + +
+
{score}
+
{c.label}
+
+ ); +} + +function SqlBlock({ sql, parameters }: { sql: string; parameters: readonly unknown[] }) { + const [copied, setCopied] = useState(false); + + const formattedSql = useMemo(() => { + try { + // Skip formatting for comments + if (sql.startsWith('--')) { + return sql; + } + return format(sql, { + language: 'postgresql', + tabWidth: 2, + keywordCase: 'upper', + linesBetweenQueries: 1, + }); + } catch { + return sql; + } + }, [sql]); + + const handleCopy = useCallback(async () => { + const textToCopy = + parameters.length > 0 + ? `${formattedSql}\n\n-- Parameters: ${JSON.stringify(parameters)}` + : formattedSql; + await navigator.clipboard.writeText(textToCopy); + setCopied(true); + setTimeout(() => setCopied(false), 2000); + }, [formattedSql, parameters]); + + return ( +
+ + {copied && ( + Copied! + )} +
+
{formattedSql}
+
+ {parameters.length > 0 && ( +
+
Parameters:
+
+
{JSON.stringify(parameters)}
+
+
+ )} +
+ ); +} + +function ExplainOutputBlock({ + output, + isAnalyze, +}: { + output: { + plan: unknown; + planningTimeMs?: number; + executionTimeMs?: number; + actualRows?: number; + estimatedRows?: number; + estimatedCost?: number; + }; + isAnalyze: boolean; +}) { + const formattedPlan = useMemo(() => { + return JSON.stringify(output.plan, null, 2); + }, [output.plan]); + + return ( +
+
{formattedPlan}
+
+ {output.planningTimeMs !== undefined && ( + Planning: {output.planningTimeMs.toFixed(2)}ms + )} + {output.executionTimeMs !== undefined && ( + Execution: {output.executionTimeMs.toFixed(2)}ms + )} + {output.estimatedRows !== undefined && Est. rows: {output.estimatedRows}} + {output.actualRows !== undefined && isAnalyze && ( + Actual rows: {output.actualRows} + )} + {output.estimatedCost !== undefined && ( + Est. cost: {output.estimatedCost.toFixed(2)} + )} +
+
+ ); +} + +function ComputedReasonBlock({ reason }: { reason: ComputedUpdateReason }) { + const [open, setOpen] = useState(false); + + return ( +
+ + {open && ( +
+
+ {reason.notes.length > 0 && ( +
{reason.notes.join(' ')}
+ )} +
+
Triggered By
+
+ {reason.seedFields.length > 0 ? ( + reason.seedFields.map((seed) => { + const Icon = getFieldTypeIcon(seed.fieldType); + return ( +
+ + {seed.fieldName} + ({seed.fieldType}) + + {seed.impact === 'link_relation' ? 'link' : 'value'} + +
+ ); + }) + ) : ( + No seed fields + )} +
+
+
+
Updates
+
+ {reason.targetFields.length > 0 ? ( + reason.targetFields.map((target) => { + const Icon = getFieldTypeIcon(target.fieldType); + return ( +
+
+ + {target.fieldName} + ({target.fieldType}) +
+
+ {target.dependencies.length > 0 ? ( + target.dependencies.map((dep, index) => ( +
+ + {dep.fromTableName}.{dep.fromFieldName} + + ({dep.fromFieldType}) + + {dep.kind} + + {dep.semantic && ( + + {dep.semantic} + + )} + {dep.isSeed && ( + + seed + + )} +
+ )) + ) : ( + No direct dependencies + )} +
+
+ ); + }) + ) : ( + No computed targets + )} +
+
+
+
+ )} +
+ ); +} + +function StatCard({ + icon: Icon, + label, + value, + subValue, +}: { + icon: React.ElementType; + label: string; + value: string | number; + subValue?: string; +}) { + return ( +
+
+ + {label} +
+
{value}
+ {subValue &&
{subValue}
} +
+ ); +} + +function SqlStepCard({ + sqlInfo, + index, + computedLocks, +}: { + sqlInfo: IExplainResultDto['sqlExplains'][number]; + index: number; + computedLocks: IExplainResultDto['computedLocks']; +}) { + const [copied, setCopied] = useState(false); + + const handleCopyStep = useCallback(async () => { + const toonData = encode(sqlInfo); + await navigator.clipboard.writeText(toonData); + setCopied(true); + setTimeout(() => setCopied(false), 2000); + }, [sqlInfo]); + + return ( +
+
+
+ + Step {index + 1} + + + {sqlInfo.stepDescription} + + + + + + {copied ? 'Copied!' : 'Copy step (Toon)'} + +
+ {computedLocks && ( + + +
+ + + Stage locks: {computedLocks.mode} · {computedLocks.recordLockCount} records ·{' '} + {computedLocks.tableLockCount} tables + +
+
+ +
+
Computed update locks
+
{computedLocks.reason}
+
+ Applied once per update; the same lock set covers all steps. +
+
+
+
+ )} +
+
+
+
+
SQL
+ +
+ {(sqlInfo.explainError || + sqlInfo.explainAnalyze || + (sqlInfo.explainOnly && !sqlInfo.explainAnalyze)) && ( +
+ {sqlInfo.explainError ? ( +
+
+ +
+
Explain Error
+
{sqlInfo.explainError}
+
+
+
+ ) : sqlInfo.explainAnalyze ? ( + <> +
+ EXPLAIN ANALYZE +
+ + + ) : ( + <> +
EXPLAIN
+ + + )} +
+ )} +
+ {sqlInfo.computedReason && } +
+
+ ); +} + +function generateOptimizationPromptText(result: IExplainResultDto, dbUrl: string | null): string { + const lines: string[] = []; + + lines.push('# SQL Command EXPLAIN Analysis'); + lines.push(`DB URL: ${dbUrl ?? '(not set)'}`); + lines.push(''); + lines.push( + 'You are a database performance engineer. Analyze this execution plan and identify performance issues, bottlenecks, and concrete optimization steps.' + ); + lines.push( + 'Focus on expensive steps, sequential scans, row-estimate mismatch, missing indexes, lock contention, and computed field fan-out.' + ); + lines.push( + 'Return a prioritized list of issues, quick wins, deeper schema/index changes, and any trade-offs or risks.' + ); + lines.push(''); + + // Command Info + lines.push('## Command'); + lines.push(`- Type: ${result.command.type}`); + lines.push(`- Table: ${result.command.tableName}`); + lines.push(`- Change Type: ${result.command.changeType}`); + if (result.command.changedFieldNames && result.command.changedFieldNames.length > 0) { + lines.push(`- Changed Fields: ${result.command.changedFieldNames.join(', ')}`); + } + lines.push(''); + + // Complexity + lines.push('## Complexity Assessment'); + lines.push(`- Score: ${result.complexity.score}/100`); + lines.push(`- Level: ${result.complexity.level}`); + if (result.complexity.factors.length > 0) { + lines.push('- Factors:'); + for (const f of result.complexity.factors) { + lines.push(` - ${f.name}: ${f.value} (contribution: +${f.contribution})`); + } + } + if (result.complexity.recommendations.length > 0) { + lines.push('- Recommendations:'); + for (const rec of result.complexity.recommendations) { + lines.push(` - ${rec}`); + } + } + lines.push(''); + + // Computed Impact + if (result.computedImpact && result.computedImpact.updateSteps.length > 0) { + lines.push('## Computed Field Impact'); + lines.push(`- Seed Record Count: ${result.computedImpact.seedRecordCount}`); + lines.push( + `- Dependency Graph: 
${result.computedImpact.dependencyGraph.fieldCount} fields, ${result.computedImpact.dependencyGraph.edgeCount} edges` + ); + lines.push('- Update Steps:'); + for (const step of result.computedImpact.updateSteps) { + lines.push( + ` - Level ${step.level}: ${step.tableName} - fields: ${step.fieldNames.join(', ')}` + ); + } + lines.push(''); + } + + // SQL Statements + if (result.sqlExplains.length > 0) { + lines.push('## SQL Statements'); + for (let i = 0; i < result.sqlExplains.length; i++) { + const sqlInfo = result.sqlExplains[i]; + lines.push(''); + lines.push(`### Step ${i + 1}: ${sqlInfo.stepDescription}`); + if (sqlInfo.explainError) { + lines.push(`Explain Error: ${sqlInfo.explainError}`); + } + lines.push('```sql'); + lines.push(sqlInfo.sql); + lines.push('```'); + if (sqlInfo.parameters.length > 0) { + lines.push(`Parameters: ${JSON.stringify(sqlInfo.parameters)}`); + } + if (sqlInfo.explainAnalyze) { + lines.push(''); + lines.push('EXPLAIN ANALYZE:'); + lines.push('```json'); + lines.push(JSON.stringify(sqlInfo.explainAnalyze.plan, null, 2)); + lines.push('```'); + if (sqlInfo.explainAnalyze.planningTimeMs !== undefined) { + lines.push(`- Planning Time: ${sqlInfo.explainAnalyze.planningTimeMs}ms`); + } + if (sqlInfo.explainAnalyze.executionTimeMs !== undefined) { + lines.push(`- Execution Time: ${sqlInfo.explainAnalyze.executionTimeMs}ms`); + } + } else if (sqlInfo.explainOnly) { + lines.push(''); + lines.push('EXPLAIN:'); + lines.push('```json'); + lines.push(JSON.stringify(sqlInfo.explainOnly.plan, null, 2)); + lines.push('```'); + } + } + lines.push(''); + } + + if (result.computedLocks) { + lines.push('## Computed Locks'); + lines.push(`- Mode: ${result.computedLocks.mode}`); + lines.push(`- Reason: ${result.computedLocks.reason}`); + lines.push(`- Record Locks: ${result.computedLocks.recordLockCount}`); + lines.push(`- Table Locks: ${result.computedLocks.tableLockCount}`); + lines.push(''); + } + + // Timing + lines.push('## Timing'); + lines.push(`- 
Total: ${result.timing.totalMs}ms`); + if (result.timing.dependencyGraphMs > 0) { + lines.push(`- Dependency Graph: ${result.timing.dependencyGraphMs}ms`); + } + if (result.timing.planningMs > 0) { + lines.push(`- Planning: ${result.timing.planningMs}ms`); + } + if (result.timing.sqlExplainMs > 0) { + lines.push(`- SQL Explain: ${result.timing.sqlExplainMs}ms`); + } + + return lines.join('\n'); +} + +function JsonViewPanel({ result }: { result: IExplainResultDto }) { + const [copied, setCopied] = useState(false); + + const formattedJson = useMemo(() => { + return JSON.stringify(result, null, 2); + }, [result]); + + const handleCopy = useCallback(async () => { + await navigator.clipboard.writeText(formattedJson); + setCopied(true); + setTimeout(() => setCopied(false), 2000); + }, [formattedJson]); + + return ( +
+
+
+ +

JSON Analysis Result

+
+ +
+ +
+          {formattedJson}
+        
+
+
+ ); +} + +export function ExplainResultPanel({ result, className }: ExplainResultPanelProps) { + const env = usePlaygroundEnvironment(); + const [viewMode, setViewMode] = useState<'visual' | 'json'>('visual'); + const [impactOpen, setImpactOpen] = useState(true); + const [locksOpen, setLocksOpen] = useState(false); + const [linkLocksOpen, setLinkLocksOpen] = useState(false); + const [copiedKey, setCopiedKey] = useState<'raw' | 'optimized' | null>(null); + + const totalSteps = result.computedImpact?.updateSteps.length ?? 0; + const totalRecords = + result.computedImpact?.affectedRecordEstimates.reduce((sum, e) => sum + e.estimatedCount, 0) ?? + 0; + + const resolveDbUrl = useCallback(() => { + if (env.kind === 'sandbox') return env.pgliteConnectionString; + return readPlaygroundDbUrl() ?? readPlaygroundDbUrlFromEnv(); + }, [env]); + + const displayDbUrl = useMemo(() => { + const dbUrl = resolveDbUrl(); + if (!dbUrl) return null; + return maskPlaygroundDbUrl(dbUrl); + }, [resolveDbUrl]); + + // Calculate total execution time from SQL explains + const totalExecutionTime = useMemo(() => { + let total = 0; + let hasData = false; + for (const sql of result.sqlExplains) { + if (sql.explainAnalyze?.executionTimeMs !== undefined) { + total += sql.explainAnalyze.executionTimeMs; + hasData = true; + } + } + return hasData ? total : null; + }, [result.sqlExplains]); + + const totalPlanningTime = useMemo(() => { + let total = 0; + let hasData = false; + for (const sql of result.sqlExplains) { + if (sql.explainAnalyze?.planningTimeMs !== undefined) { + total += sql.explainAnalyze.planningTimeMs; + hasData = true; + } + } + return hasData ? 
total : null; + }, [result.sqlExplains]); + + const explainErrorCount = useMemo( + () => result.sqlExplains.filter((sql) => sql.explainError).length, + [result.sqlExplains] + ); + + const copyWithFeedback = useCallback(async (text: string, key: 'raw' | 'optimized') => { + await navigator.clipboard.writeText(text); + setCopiedKey(key); + setTimeout(() => setCopiedKey(null), 2000); + }, []); + + const handleCopyRaw = useCallback(async () => { + const dbUrl = resolveDbUrl(); + const rawPayload = encode(result); + await copyWithFeedback(`DB URL: ${dbUrl ?? '(not set)'}\n${rawPayload}`, 'raw'); + }, [copyWithFeedback, resolveDbUrl, result]); + + const handleCopyOptimized = useCallback(async () => { + const dbUrl = resolveDbUrl(); + await copyWithFeedback(generateOptimizationPromptText(result, dbUrl), 'optimized'); + }, [copyWithFeedback, resolveDbUrl, result]); + + return ( +
+ {/* View Mode Tabs */} +
+ setViewMode(v as 'visual' | 'json')}> + + + + Visual + + + + JSON + + + +
+ + {viewMode === 'json' ? ( + + ) : ( +
+ {/* Left Panel - Overview */} + +
+ {/* Complexity Score */} + + + {/* Command Info */} +
+
+ + {result.command.type} +
+
+ + + {result.command.tableName} + +
+ {result.command.changedFieldNames && + result.command.changedFieldNames.length > 0 && ( +
+
Changed Fields
+
+ {result.command.changedFieldNames.map((name, i) => { + const fieldType = + result.command.changedFieldTypes?.[i] || 'singleLineText'; + const Icon = getFieldTypeIcon(fieldType); + return ( +
+ + {name} + ({fieldType}) +
+ ); + })} +
+
+ )} +
+ + {/* Stats Grid */} +
+ 0 + ? `SQL: ${result.timing.sqlExplainMs}ms` + : undefined + } + /> + 0 ? `~${totalRecords} records` : undefined} + /> + {explainErrorCount > 0 && ( + + )} +
+ + {/* Execution Time - only show when ANALYZE data is available */} + {totalExecutionTime !== null && ( +
+
+ + SQL Execution Time +
+
+ {totalExecutionTime.toFixed(2)}ms +
+ {totalPlanningTime !== null && ( +
+ Planning: {totalPlanningTime.toFixed(2)}ms +
+ )} +
+ {result.sqlExplains.map( + (sql, i) => + sql.explainAnalyze?.executionTimeMs !== undefined && ( +
+ + Step {i + 1} + + + {sql.explainAnalyze.executionTimeMs.toFixed(2)}ms + +
+ ) + )} +
+
+ )} + + {/* Recommendations */} + {result.complexity.recommendations.length > 0 && ( +
+
+ + Recommendations +
+
    + {result.complexity.recommendations.map((rec, i) => ( +
  • {rec}
  • + ))} +
+
+ )} + + {/* Complexity Factors */} + {result.complexity.factors.length > 0 && ( +
+
+ Complexity Factors +
+
+ {result.complexity.factors.map((f, i) => ( +
+ {f.name} +
+ {f.value} + + +{f.contribution} + +
+
+ ))} +
+
+ )} + + {/* Computed Impact */} + {result.computedImpact && result.computedImpact.updateSteps.length > 0 && ( +
+ + {impactOpen && ( +
+ {result.computedImpact.updateSteps.map((step, i) => ( +
+ + L{step.level} + +
+
{step.tableName}
+
+ {step.fieldNames.join(', ')} +
+
+
+ ))} +
+ )} +
+ )} + + {/* Computed Locks */} + {result.computedLocks && ( +
+ + {locksOpen && ( +
+
{result.computedLocks.reason}
+ {result.computedLocks.tableLocks.length > 0 && ( +
+
+ Table Locks +
+ {result.computedLocks.tableLocks.map((lock) => ( +
+ {lock.tableName} + {lock.key} +
+ ))} +
+ )} + {result.computedLocks.recordLocks.length > 0 && ( +
+
+ Record Locks +
+ {result.computedLocks.recordLocks.map((lock) => ( +
+ {lock.tableName} + + {lock.recordId} + +
+ ))} +
+ )} + {result.computedLocks.statements.length > 0 && ( +
+
+ Lock SQL +
+ {result.computedLocks.statements.map((statement, index) => ( +
+
+ {statement.tableName} + {statement.recordId ? ` · ${statement.recordId}` : ''} +
+ +
+ ))} +
+ )} +
+ )} +
+ )} + + {/* Link Record Locks */} + {result.linkLocks && result.linkLocks.mode === 'active' && ( +
+ + {linkLocksOpen && ( +
+
{result.linkLocks.reason}
+ {result.linkLocks.locks.length > 0 && ( +
+
+ Foreign Records +
+ {result.linkLocks.locks.map((lock) => ( +
+ + {lock.foreignTableName ?? lock.foreignTableId} + + + {lock.foreignRecordId} + + + {lock.key} + +
+ ))} +
+ )} + {result.linkLocks.sql && ( +
+
+ Lock SQL +
+ +
+ )} +
+ )} +
+ )} +
+
+ + {/* Right Panel - SQL */} + +
+
+
+
+ +

SQL Statements

+ + {result.sqlExplains.length} + +
+
+ DB URL:{' '} + {displayDbUrl ?? '(not set)'} +
+
+ + + + + + + + {copiedKey === 'raw' ? 'Copied raw!' : 'Copy raw (Toon)'} + + + + {copiedKey === 'optimized' ? 'Copied optimized!' : 'Copy optimized prompt'} + + + +
+ + {result.sqlExplains.length === 0 ? ( +
+ No SQL statements to display +
+ ) : ( +
+ {result.sqlExplains.map((sqlInfo, i) => ( + + ))} +
+ )} +
+
+
+ )} +
+ ); +} diff --git a/apps/playground/src/components/playground/FieldCreateDialog.tsx b/apps/playground/src/components/playground/FieldCreateDialog.tsx new file mode 100644 index 0000000000..e6beb956d5 --- /dev/null +++ b/apps/playground/src/components/playground/FieldCreateDialog.tsx @@ -0,0 +1,51 @@ +import { useQueryState, parseAsBoolean } from 'nuqs'; +import { Plus } from 'lucide-react'; +import { Button } from '@/components/ui/button'; +import { + Dialog, + DialogContent, + DialogDescription, + DialogHeader, + DialogTitle, + DialogTrigger, +} from '@/components/ui/dialog'; +import { FieldForm } from './FieldForm'; + +interface FieldCreateDialogProps { + baseId: string; + tableId: string; + onSuccess?: () => void; +} + +export function FieldCreateDialog({ baseId, tableId, onSuccess }: FieldCreateDialogProps) { + const [isOpen, setIsOpen] = useQueryState( + 'createField', + parseAsBoolean.withDefault(false).withOptions({ clearOnDefault: true }) + ); + + return ( + + + + + + + Create Field + Add a new field to your table. 
+ + setIsOpen(false)} + onSuccess={() => { + setIsOpen(false); + onSuccess?.(); + }} + /> + + + ); +} diff --git a/apps/playground/src/components/playground/FieldForm.tsx b/apps/playground/src/components/playground/FieldForm.tsx new file mode 100644 index 0000000000..e55216491b --- /dev/null +++ b/apps/playground/src/components/playground/FieldForm.tsx @@ -0,0 +1,388 @@ +import { + useForm, + type ReactFormApi, + standardSchemaValidator, + type Validator, + type StandardSchemaV1, +} from '@tanstack/react-form'; +import { toast } from 'sonner'; +import { useMutation, useQuery } from '@tanstack/react-query'; +import { useRef } from 'react'; +import { createTanstackQueryUtils } from '@orpc/tanstack-query'; +import { + ROLLUP_FUNCTIONS, + TIME_ZONE_LIST, + checkFieldNotNullValidationEnabled, + checkFieldUniqueValidationEnabled, + isComputedFieldType, + type ITableFieldInput, + tableFieldInputSchema, +} from '@teable/v2-core'; +import type { IListTablesOkResponseDto, ITableDto } from '@teable/v2-contract-http'; +import { Button } from '@/components/ui/button'; +import { Input } from '@/components/ui/input'; +import { Label } from '@/components/ui/label'; +import { + Select, + SelectContent, + SelectItem, + SelectTrigger, + SelectValue, +} from '@/components/ui/select'; +import { useOrpcClient } from '@/lib/orpc/OrpcClientContext'; +import { FieldFormOptions } from './FieldFormOptions'; + +interface FieldFormProps { + baseId: string; + tableId: string; + onCancel: () => void; + onSuccess: () => void; +} + +type FieldOptionsValue = Extract['options']; +type FieldFormValues = Omit & { options?: FieldOptionsValue }; +type FieldFormValidator = Validator>; +type LinkFieldOptions = Extract['options']; +type RollupFieldConfig = Extract['config']; +type RollupFieldOptions = Extract['options']; +type LookupFieldOptions = Extract['options']; +type FieldType = ITableFieldInput['type']; + +export type FieldFormApi = ReactFormApi; + +export function FieldForm({ baseId, tableId, 
onCancel, onSuccess }: FieldFormProps) { + const orpc = createTanstackQueryUtils(useOrpcClient()); + const validatorAdapter = standardSchemaValidator() as FieldFormValidator; + + const tablesQuery = useQuery>( + orpc.tables.list.queryOptions({ + input: { baseId }, + select: (response) => response.data.tables, + }) + ); + + const createFieldMutation = useMutation( + orpc.tables.createField.mutationOptions({ + onSuccess: () => { + onSuccess(); + }, + onError: (error: any) => { + toast.error(error.message || 'Failed to create field'); + }, + }) + ); + + const typeDrafts = useRef< + Partial< + Record< + FieldType, + { + options?: FieldOptionsValue; + config?: RollupFieldConfig; + notNull?: boolean; + unique?: boolean; + } + > + > + >({}); + + const defaultLinkOptions = (): LinkFieldOptions => { + const tables = tablesQuery.data ?? []; + const candidates = tables.filter((table) => table.id !== tableId); + const target = candidates[0]; + if (!target) { + return { relationship: 'manyMany' } as any; + } + const lookupField = target.fields.find((field) => field.isPrimary) ?? target.fields[0]; + if (!lookupField) { + return { relationship: 'manyMany', foreignTableId: target.id } as any; + } + return { + relationship: 'manyMany', + foreignTableId: target.id, + lookupFieldId: lookupField.id, + } as any; + }; + + const defaultRollupConfig = (): RollupFieldConfig => { + const tables = tablesQuery.data ?? []; + const currentTable = tables.find((table) => table.id === tableId); + const linkField = currentTable?.fields.find((field) => field.type === 'link') as + | Extract + | undefined; + if (!linkField || linkField.type !== 'link') { + return {} as any; + } + const foreignTableId = linkField.options?.foreignTableId; + const foreignTable = tables.find((table) => table.id === foreignTableId); + const lookupField = + foreignTable?.fields.find((field) => field.isPrimary) ?? foreignTable?.fields[0]; + return { + linkFieldId: linkField.id, + foreignTableId: foreignTableId ?? 
'', + lookupFieldId: lookupField?.id ?? linkField.options?.lookupFieldId ?? '', + } as any; + }; + + const defaultRollupOptions = (): RollupFieldOptions => { + return { + expression: ROLLUP_FUNCTIONS[0], + timeZone: TIME_ZONE_LIST[0], + } as any; + }; + + const defaultLookupOptions = (): LookupFieldOptions => { + const tables = tablesQuery.data ?? []; + const currentTable = tables.find((table) => table.id === tableId); + const linkField = currentTable?.fields.find((field) => field.type === 'link') as + | Extract + | undefined; + if (!linkField || linkField.type !== 'link') { + return {} as any; + } + const foreignTableId = linkField.options?.foreignTableId; + const foreignTable = tables.find((table) => table.id === foreignTableId); + const lookupField = + foreignTable?.fields.find((field) => field.isPrimary) ?? foreignTable?.fields[0]; + return { + linkFieldId: linkField.id, + foreignTableId: foreignTableId ?? '', + lookupFieldId: lookupField?.id ?? linkField.options?.lookupFieldId ?? '', + } as any; + }; + + const defaultFormulaOptions = () => { + return { + expression: '1', + } as any; + }; + + const defaultConditionalRollupConfig = () => { + const tables = tablesQuery.data ?? []; + const candidates = tables.filter((table) => table.id !== tableId); + const foreignTable = candidates[0]; + const lookupField = + foreignTable?.fields.find((field) => field.isPrimary) ?? foreignTable?.fields[0]; + return { + foreignTableId: foreignTable?.id ?? '', + lookupFieldId: lookupField?.id ?? '', + condition: { filter: null }, + } as any; + }; + + const defaultConditionalRollupOptions = () => { + return { + expression: ROLLUP_FUNCTIONS[0], + timeZone: TIME_ZONE_LIST[0], + } as any; + }; + + const defaultConditionalLookupOptions = () => { + const tables = tablesQuery.data ?? []; + const candidates = tables.filter((table) => table.id !== tableId); + const foreignTable = candidates[0]; + const lookupField = + foreignTable?.fields.find((field) => field.isPrimary) ?? 
foreignTable?.fields[0]; + return { + foreignTableId: foreignTable?.id ?? '', + lookupFieldId: lookupField?.id ?? '', + condition: { filter: null }, + } as any; + }; + + const getDefaultValuesForType = ( + type: FieldType + ): { options?: FieldOptionsValue; config?: RollupFieldConfig } => { + switch (type) { + case 'link': + return { options: defaultLinkOptions() }; + case 'formula': + return { options: defaultFormulaOptions() }; + case 'rollup': + return { options: defaultRollupOptions(), config: defaultRollupConfig() }; + case 'lookup': + return { options: defaultLookupOptions() }; + case 'conditionalRollup': + return { + options: defaultConditionalRollupOptions(), + config: defaultConditionalRollupConfig(), + }; + case 'conditionalLookup': + return { options: defaultConditionalLookupOptions() }; + default: + return { options: {} }; + } + }; + + const form = useForm({ + defaultValues: { + type: 'singleLineText', + name: '', + options: {}, + } as FieldFormValues, + validatorAdapter, + validators: { + onChange: tableFieldInputSchema, + onBlur: tableFieldInputSchema, + }, + onSubmit: async ({ value }: { value: FieldFormValues }) => { + createFieldMutation.mutate({ + baseId, + tableId, + field: value, + } as any); + }, + }); + + return ( +
{ + e.preventDefault(); + e.stopPropagation(); + form.handleSubmit(); + }} + className="space-y-6" + > + ( +
+ + field.handleChange(e.target.value)} + placeholder="Enter field name" + /> + {field.state.meta.errors ? ( +

{field.state.meta.errors.join(', ')}

+ ) : null} +
+ )} + /> + + ( +
+ + +
+ )} + /> + + state.values.type} + children={(type) => ( + + )} + /> + +
+ + [state.canSubmit, state.isSubmitting] as const} + children={([canSubmit, isSubmitting]) => ( + + )} + /> +
+ + ); +} diff --git a/apps/playground/src/components/playground/FieldFormOptions.tsx b/apps/playground/src/components/playground/FieldFormOptions.tsx new file mode 100644 index 0000000000..42ca563e31 --- /dev/null +++ b/apps/playground/src/components/playground/FieldFormOptions.tsx @@ -0,0 +1,180 @@ +import { match } from 'ts-pattern'; +import { SingleLineTextOptions } from './field-options/SingleLineTextOptions'; +import { NumberOptions } from './field-options/NumberOptions'; +import { RatingOptions } from './field-options/RatingOptions'; +import { SelectOptions } from './field-options/SelectOptions'; +import { CheckboxOptions } from './field-options/CheckboxOptions'; +import { DateOptions } from './field-options/DateOptions'; +import { UserOptions } from './field-options/UserOptions'; +import { ButtonOptions } from './field-options/ButtonOptions'; +import { FormulaOptions } from './field-options/FormulaOptions'; +import { LinkOptions } from './field-options/LinkOptions'; +import { RollupOptions } from './field-options/RollupOptions'; +import { LookupOptions } from './field-options/LookupOptions'; +import { ConditionalRollupOptions } from './field-options/ConditionalRollupOptions'; +import { ConditionalLookupOptions } from './field-options/ConditionalLookupOptions'; +import type { FieldFormApi } from './FieldForm'; +import { + checkFieldNotNullValidationEnabled, + checkFieldUniqueValidationEnabled, + isComputedFieldType, + type ITableFieldInput, +} from '@teable/v2-core'; +import type { ITableDto } from '@teable/v2-contract-http'; +import { Label } from '@/components/ui/label'; +import { Switch } from '@/components/ui/switch'; + +interface FieldFormOptionsProps { + type: ITableFieldInput['type']; + form: FieldFormApi; + tableId: string; + tables: ReadonlyArray; + isTablesLoading: boolean; +} + +export function FieldFormOptions({ + type, + form, + tableId, + tables, + isTablesLoading, +}: FieldFormOptionsProps) { + const isComputed = isComputedFieldType(type); + 
const notNullEnabled = checkFieldNotNullValidationEnabled(type, { isComputed }); + const uniqueEnabled = checkFieldUniqueValidationEnabled(type, { isComputed }); + const validationHint = isComputed + ? 'Computed fields do not support not-null or unique validation.' + : 'No validation options for this field type.'; + + return ( +
+

Field Options

+ {match(type) + .with('singleLineText', () => ) + .with('longText', () => ( +

No options for long text.

+ )) + .with('number', () => ) + .with('rating', () => ) + .with('singleSelect', () => ) + .with('multipleSelect', () => ) + .with('checkbox', () => ) + .with('attachment', () => ( +

No options for attachment.

+ )) + .with('date', () => ) + .with('createdTime', () => ( +

No options for created time.

+ )) + .with('lastModifiedTime', () => ( +

No options for last modified time.

+ )) + .with('user', () => ) + .with('createdBy', () => ( +

No options for created by.

+ )) + .with('lastModifiedBy', () => ( +

No options for last modified by.

+ )) + .with('autoNumber', () => ( +

No options for auto number.

+ )) + .with('button', () => ) + .with('formula', () => ) + .with('link', () => ( + + )) + .with('rollup', () => ( + + )) + .with('lookup', () => ( + + )) + .with('conditionalRollup', () => ( + + )) + .with('conditionalLookup', () => ( + + )) + .exhaustive()} +
+

Validation

+ {notNullEnabled || uniqueEnabled ? ( +
+ {notNullEnabled ? ( + ( +
+
+ +

+ Require a value in every record. +

+
+ + field.handleChange(checked ? true : (undefined as any)) + } + /> +
+ )} + /> + ) : null} + {uniqueEnabled ? ( + ( +
+
+ +

+ Prevent duplicate values across records. +

+
+ + field.handleChange(checked ? true : (undefined as any)) + } + /> +
+ )} + /> + ) : null} +
+ ) : ( +

{validationHint}

+ )} +
+
+ ); +} diff --git a/apps/playground/src/components/playground/ImportCsvDialog.tsx b/apps/playground/src/components/playground/ImportCsvDialog.tsx new file mode 100644 index 0000000000..905c616854 --- /dev/null +++ b/apps/playground/src/components/playground/ImportCsvDialog.tsx @@ -0,0 +1,388 @@ +import { FileUp, Globe, Loader2, Upload } from 'lucide-react'; +import Papa from 'papaparse'; +import { useCallback, useRef, useState } from 'react'; + +import { Button } from '@/components/ui/button'; +import { + Dialog, + DialogContent, + DialogDescription, + DialogFooter, + DialogHeader, + DialogTitle, + DialogTrigger, +} from '@/components/ui/dialog'; +import { Input } from '@/components/ui/input'; +import { Label } from '@/components/ui/label'; +import { Tabs, TabsContent, TabsList, TabsTrigger } from '@/components/ui/tabs'; + +type ImportCsvDialogProps = { + onImport: (data: { tableName: string; csvData?: string; csvUrl?: string }) => Promise; + trigger?: React.ReactNode; +}; + +export function ImportCsvDialog({ onImport, trigger }: ImportCsvDialogProps) { + const [open, setOpen] = useState(false); + const [importMode, setImportMode] = useState<'file' | 'url'>('file'); + const [file, setFile] = useState(null); + const [csvUrl, setCsvUrl] = useState(''); + const [tableName, setTableName] = useState(''); + const [preview, setPreview] = useState<{ + headers: string[]; + rows: Record[]; + totalRows: number; + } | null>(null); + const [isImporting, setIsImporting] = useState(false); + const [isLoadingPreview, setIsLoadingPreview] = useState(false); + const [error, setError] = useState(null); + const fileInputRef = useRef(null); + + const reset = useCallback(() => { + setFile(null); + setCsvUrl(''); + setTableName(''); + setPreview(null); + setError(null); + if (fileInputRef.current) { + fileInputRef.current.value = ''; + } + }, []); + + const handleFileChange = useCallback((event: React.ChangeEvent) => { + const selectedFile = event.target.files?.[0]; + if (!selectedFile) 
return;
+
+      setFile(selectedFile);
+      setError(null);
+
+      // Auto-derive the table name from the file name
+      const nameWithoutExt = selectedFile.name.replace(/\.csv$/i, '');
+      setTableName(nameWithoutExt);
+
+      // Parse a CSV preview
+      Papa.parse<Record<string, string>>(selectedFile, {
+        header: true,
+        skipEmptyLines: 'greedy',
+        preview: 5, // preview only the first 5 rows
+        complete: (results) => {
+          if (results.errors.length > 0) {
+            setError(`CSV parse error: ${results.errors[0].message}`);
+            return;
+          }
+          setPreview({
+            headers: results.meta.fields ?? [],
+            rows: results.data,
+            totalRows: -1, // total row count unknown
+          });
+        },
+        error: (err) => {
+          setError(`Failed to parse CSV: ${err.message}`);
+        },
+      });
+    }, []);
+
+  const handleUrlPreview = useCallback(async () => {
+    if (!csvUrl.trim()) return;
+
+    setIsLoadingPreview(true);
+    setError(null);
+    setPreview(null);
+
+    try {
+      // Validate URL
+      new URL(csvUrl);
+
+      // Try to fetch a preview from the URL (first 5 rows only),
+      // using PapaParse's preview mode
+      Papa.parse<Record<string, string>>(csvUrl, {
+        download: true,
+        header: true,
+        skipEmptyLines: 'greedy',
+        preview: 5,
+        complete: (results) => {
+          if (results.errors.length > 0) {
+            setError(`CSV parse error: ${results.errors[0].message}`);
+            setIsLoadingPreview(false);
+            return;
+          }
+          setPreview({
+            headers: results.meta.fields ??
[],
+            rows: results.data,
+            totalRows: -1,
+          });
+
+          // Auto-derive the table name from the URL
+          if (!tableName) {
+            try {
+              const url = new URL(csvUrl);
+              const pathParts = url.pathname.split('/');
+              const filename = pathParts[pathParts.length - 1] || 'imported';
+              setTableName(filename.replace(/\.csv$/i, ''));
+            } catch {
+              setTableName('imported');
+            }
+          }
+          setIsLoadingPreview(false);
+        },
+        error: (err) => {
+          setError(`Failed to load CSV from URL: ${err.message}`);
+          setIsLoadingPreview(false);
+        },
+      });
+    } catch {
+      setError('Invalid URL format');
+      setIsLoadingPreview(false);
+    }
+  }, [csvUrl, tableName]);
+
+  const handleImport = useCallback(async () => {
+    if (!tableName.trim()) return;
+
+    if (importMode === 'file' && !file) return;
+    if (importMode === 'url' && !csvUrl.trim()) return;
+
+    setIsImporting(true);
+    setError(null);
+
+    try {
+      if (importMode === 'file' && file) {
+        // Read file as text
+        const csvData = await file.text();
+        if (!csvData.trim()) {
+          throw new Error('CSV file is empty');
+        }
+
+        await onImport({
+          tableName: tableName.trim(),
+          csvData,
+        });
+      } else if (importMode === 'url') {
+        // Pass the URL through so the backend can stream it
+        await onImport({
+          tableName: tableName.trim(),
+          csvUrl: csvUrl.trim(),
+        });
+      }
+
+      setOpen(false);
+      reset();
+    } catch (err) {
+      setError(err instanceof Error ? err.message : 'Import failed');
+    } finally {
+      setIsImporting(false);
+    }
+  }, [file, csvUrl, tableName, importMode, onImport, reset]);
+
+  const handleOpenChange = useCallback(
+    (nextOpen: boolean) => {
+      setOpen(nextOpen);
+      if (!nextOpen) {
+        reset();
+      }
+    },
+    [reset]
+  );
+
+  return ( + + + {trigger ?? ( + + )} + + + + Import from CSV + + Upload a CSV file to create a new table. All columns will be created as text fields. + + + +
+ { + setImportMode(v as 'file' | 'url'); + reset(); + }} + > + + + + File Upload + + + + From URL + + + + + {/* File Input */} +
+ +
fileInputRef.current?.click()} + > + +
+ + {file ? ( +
+

{file.name}

+

+ {(file.size / 1024).toFixed(1)} KB +

+
+ ) : ( +
+

Click to upload CSV file

+

or drag and drop

+
+ )} +
+
+
+ + {/* Table Name Input for File */} + {file && ( +
+ + setTableName(e.target.value)} + placeholder="Enter table name" + /> +
+ )} +
+ + + {/* URL Input */} +
+ +
+ setCsvUrl(e.target.value)} + placeholder="https://example.com/data.csv" + className="flex-1" + /> + +
+

+ Enter a public URL to a CSV file. Large files will be streamed. +

+
+ + {/* Table Name Input for URL */} + {preview && ( +
+ + setTableName(e.target.value)} + placeholder="Enter table name" + /> +
+ )} +
+
+ + {/* Preview */} + {preview && ( +
+ +
+ + + + {preview.headers.map((header, i) => ( + + ))} + + + + {preview.rows.map((row, rowIndex) => ( + + {preview.headers.map((header, colIndex) => ( + + ))} + + ))} + +
+ {header || `Column ${i + 1}`} +
+ {row[header] || '-'} +
+
+

Showing first 5 rows

+
+ )} + + {/* Error */} + {error && ( +
+ {error} +
+ )} +
+ + + + + +
+
+ ); +} diff --git a/apps/playground/src/components/playground/LinkFieldLabel.tsx b/apps/playground/src/components/playground/LinkFieldLabel.tsx new file mode 100644 index 0000000000..0e8b7637d9 --- /dev/null +++ b/apps/playground/src/components/playground/LinkFieldLabel.tsx @@ -0,0 +1,51 @@ +import type { LinkRelationshipValue } from '@teable/v2-core'; +import { Badge } from '@/components/ui/badge'; +import { cn } from '@/lib/utils'; + +type LinkFieldLabelProps = { + name: string; + fieldId: string; + relationship: string; + isOneWay?: boolean; + className?: string; + badgeClassName?: string; +}; + +const relationshipLabels: Record = { + manyMany: 'many-many', + oneMany: 'one-many', + manyOne: 'many-one', + oneOne: 'one-one', +}; + +const formatRelationshipLabel = (relationship: string): string => + relationshipLabels[relationship as LinkRelationshipValue] ?? relationship; + +export function LinkFieldLabel({ + name, + fieldId, + relationship, + isOneWay = false, + className, + badgeClassName, +}: LinkFieldLabelProps) { + const relationshipLabel = formatRelationshipLabel(relationship); + const directionLabel = isOneWay ? 
'one-way' : 'two-way'; + const badgeClasses = cn('h-4 px-1 text-[9px] font-normal uppercase', badgeClassName); + + return ( + + {name} + + {relationshipLabel} + + + {directionLabel} + + + ); +} diff --git a/apps/playground/src/components/playground/LogPanel.tsx b/apps/playground/src/components/playground/LogPanel.tsx new file mode 100644 index 0000000000..3af929033e --- /dev/null +++ b/apps/playground/src/components/playground/LogPanel.tsx @@ -0,0 +1,428 @@ +import { useCallback, useEffect, useMemo, useRef, useState } from 'react'; +import { + AlertCircle, + AlertTriangle, + Bug, + ChevronDown, + ChevronUp, + Circle, + Info, + Maximize2, + Minimize2, + Pause, + Play, + Search, + Terminal, + Trash2, + Wifi, + WifiOff, + X, +} from 'lucide-react'; +import { parseAsBoolean, useQueryState } from 'nuqs'; +import { Badge } from '@/components/ui/badge'; +import { Button } from '@/components/ui/button'; +import { Input } from '@/components/ui/input'; +import { ScrollArea } from '@/components/ui/scroll-area'; +import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip'; +import { cn } from '@/lib/utils'; +import { useLogStream, type LogEntry, type LogLevel } from '@/hooks/useLogStream'; + +type LogLevelConfig = { + icon: typeof Info; + className: string; + badgeClassName: string; + label: string; +}; + +const LOG_LEVEL_CONFIG: Record = { + debug: { + icon: Bug, + className: 'text-slate-500', + badgeClassName: 'bg-slate-500/10 text-slate-600 border-slate-500/30', + label: 'DEBUG', + }, + info: { + icon: Info, + className: 'text-blue-500', + badgeClassName: 'bg-blue-500/10 text-blue-600 border-blue-500/30', + label: 'INFO', + }, + warn: { + icon: AlertTriangle, + className: 'text-amber-500', + badgeClassName: 'bg-amber-500/10 text-amber-600 border-amber-500/30', + label: 'WARN', + }, + error: { + icon: AlertCircle, + className: 'text-red-500', + badgeClassName: 'bg-red-500/10 text-red-600 border-red-500/30', + label: 'ERROR', + }, +}; + +const 
ALL_LEVELS: ReadonlyArray = ['debug', 'info', 'warn', 'error']; + +type PanelSize = 'normal' | 'large'; + +const PANEL_SIZE_CONFIG: Record = { + normal: { width: 'w-[600px]', height: 'h-[320px]' }, + large: { width: 'w-[900px]', height: 'h-[500px]' }, +}; + +const LOG_PANEL_SIZE_KEY = 'teable-log-panel-size'; + +const readStoredSize = (): PanelSize => { + if (typeof window === 'undefined') return 'normal'; + const stored = localStorage.getItem(LOG_PANEL_SIZE_KEY); + if (stored === 'large' || stored === 'normal') return stored; + return 'normal'; +}; + +const storeSize = (size: PanelSize): void => { + if (typeof window === 'undefined') return; + localStorage.setItem(LOG_PANEL_SIZE_KEY, size); +}; + +type LogPanelProps = { + className?: string; + defaultLevels?: ReadonlyArray; +}; + +export function LogPanel({ + className, + defaultLevels = ['debug', 'info', 'warn', 'error'], +}: LogPanelProps) { + // Use URL query param for expanded state + const [expanded, setExpanded] = useQueryState('logs', parseAsBoolean.withDefault(false)); + + // Use localStorage for panel size + const [panelSize, setPanelSize] = useState(() => readStoredSize()); + const [enabledLevels, setEnabledLevels] = useState>(new Set(defaultLevels)); + const [searchQuery, setSearchQuery] = useState(''); + const [autoScroll, setAutoScroll] = useState(true); + const scrollRef = useRef(null); + + const sizeConfig = PANEL_SIZE_CONFIG[panelSize]; + const isLarge = panelSize === 'large'; + + const toggleSize = useCallback(() => { + setPanelSize((prev) => { + const next = prev === 'normal' ? 
'large' : 'normal'; + storeSize(next); + return next; + }); + }, []); + + const { logs, status, paused, pause, resume, clear } = useLogStream({ + enabled: expanded, + }); + + // Filter logs by level and search query + const filteredLogs = useMemo(() => { + return logs.filter((log) => { + if (!enabledLevels.has(log.level)) return false; + if (searchQuery.trim()) { + const query = searchQuery.toLowerCase(); + const matchesMessage = log.message.toLowerCase().includes(query); + const matchesContext = log.context + ? JSON.stringify(log.context).toLowerCase().includes(query) + : false; + if (!matchesMessage && !matchesContext) return false; + } + return true; + }); + }, [logs, enabledLevels, searchQuery]); + + // Count logs by level + const levelCounts = useMemo(() => { + const counts: Record = { debug: 0, info: 0, warn: 0, error: 0 }; + for (const log of logs) { + counts[log.level]++; + } + return counts; + }, [logs]); + + // Auto-scroll to bottom + useEffect(() => { + if (autoScroll && scrollRef.current) { + const viewport = scrollRef.current.querySelector('[data-slot="scroll-area-viewport"]'); + if (viewport) { + viewport.scrollTop = viewport.scrollHeight; + } + } + }, [filteredLogs, autoScroll]); + + const toggleLevel = useCallback((level: LogLevel) => { + setEnabledLevels((prev) => { + const next = new Set(prev); + if (next.has(level)) { + next.delete(level); + } else { + next.add(level); + } + return next; + }); + }, []); + + const isConnected = status === 'connected'; + + if (!expanded) { + return ( +
+ + + + + Open log panel + +
+ ); + } + + return ( +
+ {/* Header */} +
+
+ + Logs + + {filteredLogs.length} + +
+ +
+ + {/* Connection Status */} + + +
+ {isConnected ? ( + <> + + + + ) : ( + + )} +
+
+ + {isConnected ? 'Connected to log stream' : `Status: ${status}`} + +
+ + {/* Actions */} +
+ + + + + {paused ? 'Resume' : 'Pause'} + + + + + + + Clear logs + + + + + + + {autoScroll ? 'Auto-scroll on' : 'Auto-scroll off'} + + + + + + + {isLarge ? 'Shrink panel' : 'Expand panel'} + + + +
+
+ + {/* Filters */} +
+ {/* Search */} +
+ + setSearchQuery(e.target.value)} + className="h-7 pl-7 text-xs" + /> +
+ + {/* Level Filters */} +
+ {ALL_LEVELS.map((level) => { + const config = LOG_LEVEL_CONFIG[level]; + const Icon = config.icon; + const isActive = enabledLevels.has(level); + const count = levelCounts[level]; + + return ( + + + + + {isActive ? `Hide ${level}` : `Show ${level}`} + + ); + })} +
+
+ + {/* Log List */} + +
+ {filteredLogs.length === 0 ? ( +
+ + No logs to display + {!isConnected && Waiting for connection...} +
+ ) : ( + filteredLogs.map((log) => ) + )} +
+
+ + {/* Footer */} + {paused && ( +
+ + Log reception paused +
+ )} +
+ ); +} + +type LogRowProps = { + log: LogEntry; +}; + +function LogRow({ log }: LogRowProps) { + const [expanded, setExpanded] = useState(false); + const config = LOG_LEVEL_CONFIG[log.level]; + const Icon = config.icon; + const hasContext = log.context && Object.keys(log.context).length > 0; + + const timestamp = useMemo(() => { + const date = new Date(log.timestamp); + return date.toLocaleTimeString('en-US', { + hour12: false, + hour: '2-digit', + minute: '2-digit', + second: '2-digit', + fractionalSecondDigits: 3, + }); + }, [log.timestamp]); + + return ( +
+
+ + {timestamp} + + {config.label} + + {log.message} + {hasContext && ( + + )} +
+ {expanded && hasContext && ( +
+
+            {JSON.stringify(log.context, null, 2)}
+          
+
+ )} +
+ ); +} diff --git a/apps/playground/src/components/playground/MetaCheckPanel.tsx b/apps/playground/src/components/playground/MetaCheckPanel.tsx new file mode 100644 index 0000000000..4d003c53ba --- /dev/null +++ b/apps/playground/src/components/playground/MetaCheckPanel.tsx @@ -0,0 +1,481 @@ +import { useCallback, useEffect, useRef, useState } from 'react'; + +import { PLAYGROUND_DB_URL_QUERY_PARAM, readPlaygroundDbUrl } from '@/lib/playground/databaseUrl'; +import { + CheckCircle2, + XCircle, + AlertTriangle, + Loader2, + Play, + RefreshCcw, + FileSearch, + Link2, + Table2, +} from 'lucide-react'; + +import { Badge } from '@/components/ui/badge'; +import { Button } from '@/components/ui/button'; +import { LinkFieldLabel } from '@/components/playground/LinkFieldLabel'; +import { getFieldTypeIcon } from '@/lib/fieldTypeIcons'; +import { cn } from '@/lib/utils'; + +export type MetaValidationSeverity = 'error' | 'warning' | 'info'; +export type MetaValidationCategory = 'schema' | 'reference'; + +export interface MetaValidationIssue { + fieldId: string; + fieldName: string; + fieldType: string; + category: MetaValidationCategory; + severity: MetaValidationSeverity; + message: string; + details?: { + path?: string; + expected?: string; + received?: string; + relatedTableId?: string; + relatedFieldId?: string; + }; +} + +interface MetaCheckSSEResult { + id: string; + type: 'connect' | 'issue' | 'complete' | 'error'; + issue?: MetaValidationIssue; + message?: string; + timestamp: number; +} + +type FieldMeta = { + id: string; + name: string; + type: string; + relationship?: string; + isOneWay?: boolean; +}; + +type MetaCheckPanelProps = { + tableId: string; + tableName: string; + fields?: ReadonlyArray; +}; + +const SeverityIcon = ({ severity }: { severity: MetaValidationSeverity }) => { + switch (severity) { + case 'error': + return ; + case 'warning': + return ; + case 'info': + default: + return ; + } +}; + +const SeverityBadge = ({ severity }: { severity: 
MetaValidationSeverity }) => { + const variants: Record< + MetaValidationSeverity, + 'default' | 'secondary' | 'destructive' | 'outline' + > = { + error: 'destructive', + warning: 'outline', + info: 'secondary', + }; + + const labels: Record = { + error: 'Error', + warning: 'Warning', + info: 'Info', + }; + + return ( + + {labels[severity]} + + ); +}; + +const CategoryIcon = ({ category }: { category: MetaValidationCategory }) => { + switch (category) { + case 'reference': + return ; + case 'schema': + default: + return ; + } +}; + +/** + * Renders a single issue result. + */ +const IssueResultItem = ({ issue }: { issue: MetaValidationIssue }) => { + return ( +
+ +
+
+ + {issue.category} + +
+
+ {issue.message} +
+ {issue.details && ( +
+ {issue.details.path && ( +
+ Path: {issue.details.path} +
+ )} + {issue.details.relatedTableId && ( +
+ Related Table:{' '} + {issue.details.relatedTableId} +
+ )} + {issue.details.relatedFieldId && ( +
+ Related Field:{' '} + {issue.details.relatedFieldId} +
+ )} + {issue.details.expected && ( +
+ Expected: {issue.details.expected} +
+ )} + {issue.details.received && ( +
+ Received: {issue.details.received} +
+ )} +
+ )} +
+
+ ); +}; + +export function MetaCheckPanel({ tableId, tableName, fields }: MetaCheckPanelProps) { + const [issues, setIssues] = useState([]); + const [isRunning, setIsRunning] = useState(false); + const [hasRun, setHasRun] = useState(false); + const eventSourceRef = useRef(null); + + const stopCheck = useCallback(() => { + if (eventSourceRef.current) { + eventSourceRef.current.close(); + eventSourceRef.current = null; + } + setIsRunning(false); + }, []); + + const startCheck = useCallback(() => { + stopCheck(); + setIssues([]); + setIsRunning(true); + setHasRun(true); + + const dbUrl = readPlaygroundDbUrl(); + const baseUrl = `/api/meta/${tableId}/check/stream`; + const eventSourceUrl = dbUrl + ? `${baseUrl}?${new URLSearchParams({ + [PLAYGROUND_DB_URL_QUERY_PARAM]: dbUrl, + }).toString()}` + : baseUrl; + const eventSource = new EventSource(eventSourceUrl); + eventSourceRef.current = eventSource; + + eventSource.onmessage = (event) => { + try { + const result = JSON.parse(event.data) as MetaCheckSSEResult; + + // Skip connection message + if (result.type === 'connect') { + return; + } + + if (result.type === 'complete') { + setIsRunning(false); + eventSource.close(); + return; + } + + if (result.type === 'error') { + // Add error as an issue + setIssues((prev) => [ + ...prev, + { + fieldId: '', + fieldName: '', + fieldType: '', + category: 'schema', + severity: 'error', + message: result.message || 'Unknown error', + }, + ]); + setIsRunning(false); + eventSource.close(); + return; + } + + if (result.type === 'issue' && result.issue) { + setIssues((prev) => [...prev, result.issue!]); + } + } catch (e) { + console.error('Failed to parse SSE message:', e); + } + }; + + eventSource.onerror = () => { + setIsRunning(false); + eventSource.close(); + }; + }, [tableId, stopCheck]); + + // Cleanup on unmount + useEffect(() => { + return () => { + if (eventSourceRef.current) { + eventSourceRef.current.close(); + } + }; + }, []); + + // Reset when tableId changes + const 
hasStartedRef = useRef(false); + useEffect(() => { + hasStartedRef.current = false; + setIssues([]); + setHasRun(false); + stopCheck(); + }, [tableId, stopCheck]); + + // Auto-start when entering the tab + useEffect(() => { + if (!hasStartedRef.current && !isRunning && !hasRun) { + hasStartedRef.current = true; + const timer = setTimeout(() => { + startCheck(); + }, 100); + return () => clearTimeout(timer); + } + }, [tableId, isRunning, hasRun, startCheck]); + + // Group issues by field + const groupedIssues = issues.reduce((acc, issue) => { + const key = issue.fieldId || 'system'; + if (!acc[key]) { + acc[key] = []; + } + acc[key].push(issue); + return acc; + }, {}); + + // Summary counts + const summary = { + total: issues.length, + error: issues.filter((i) => i.severity === 'error').length, + warning: issues.filter((i) => i.severity === 'warning').length, + info: issues.filter((i) => i.severity === 'info').length, + }; + + // Get field meta by ID + const getFieldMeta = (fieldId: string): FieldMeta | undefined => { + return fields?.find((f) => f.id === fieldId); + }; + + return ( +
+
+
+ + Meta Check + {hasRun && ( + <> + {summary.total === 0 ? ( + + ✓ All valid + + ) : ( + <> + + {summary.total} issues + + {summary.error > 0 && ( + + ✗ {summary.error} + + )} + {summary.warning > 0 && ( + + ⚠ {summary.warning} + + )} + + )} + + )} +
+
+ {isRunning ? ( + + ) : ( + + )} +
+
+ + {!hasRun ? ( +
+
+ Click "Start Check" to validate the metadata for table{' '} + {tableName}. +
+
+ This will check field configurations including link references, lookup dependencies, and + more. +
+
+ ) : ( +
+ {/* Show success message if no issues */} + {!isRunning && issues.length === 0 && hasRun && ( +
+ +
+ All metadata is valid
+
+ No issues found in field configurations. +
+
+ )} + + {/* Group issues by field */} + {Object.entries(groupedIssues).map(([fieldId, fieldIssues]) => { + const isSystemField = fieldId === 'system'; + const fieldMeta = getFieldMeta(fieldId); + const fieldName = + fieldMeta?.name || fieldIssues[0]?.fieldName || (isSystemField ? 'System' : fieldId); + const fieldType = fieldMeta?.type || fieldIssues[0]?.fieldType; + const fieldRelationship = fieldMeta?.relationship; + const FieldIcon = fieldType ? getFieldTypeIcon(fieldType) : null; + const hasError = fieldIssues.some((i) => i.severity === 'error'); + const hasWarning = fieldIssues.some((i) => i.severity === 'warning'); + + return ( +
+
+ {hasError ? ( + + ) : hasWarning ? ( + + ) : ( + + )} + {FieldIcon ? : null} + {fieldType === 'link' && fieldRelationship && fieldId !== 'system' ? ( + + ) : ( + {fieldName} + )} + {fieldType ? ( + + {fieldType} + + ) : null} + + {fieldId !== 'system' && fieldId ? `(${fieldId})` : ''} + +
+ +
+ {fieldIssues.map((issue, index) => ( + + ))} +
+
+ ); + })} + + {isRunning && issues.length === 0 && ( +
+ + Checking metadata... +
+ )} +
+ )} +
+ ); +} diff --git a/apps/playground/src/components/playground/PlaygroundRecordRoute.tsx b/apps/playground/src/components/playground/PlaygroundRecordRoute.tsx new file mode 100644 index 0000000000..9eb5eaae0f --- /dev/null +++ b/apps/playground/src/components/playground/PlaygroundRecordRoute.tsx @@ -0,0 +1,490 @@ +import { createTanstackQueryUtils } from '@orpc/tanstack-query'; +import { keepPreviousData, useQuery, useQueryClient } from '@tanstack/react-query'; +import { useNavigate } from '@tanstack/react-router'; +import { mapTableDtoToDomain, type ITableRecordDto } from '@teable/v2-contract-http'; +import type { + Field, + ITableRecordRealtimeDTO, + LinkField, + Table as TableAggregate, +} from '@teable/v2-core'; + +import { formatRecordValue } from '@/components/playground/recordValueVisitor'; +import { LinkFieldLabel } from '@/components/playground/LinkFieldLabel'; +import { getFieldTypeIcon } from '@/lib/fieldTypeIcons'; +import { ArrowLeft, Pencil, TriangleAlert, Radio } from 'lucide-react'; +import { useEffect, useMemo, useState, type ReactNode } from 'react'; + +import { RecordUpdateDialog } from '@/components/playground/RecordUpdateDialog'; +import { Badge } from '@/components/ui/badge'; +import { Button } from '@/components/ui/button'; +import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card'; +import { ScrollArea } from '@/components/ui/scroll-area'; +import { + Table as UITable, + TableBody, + TableCell, + TableHead, + TableHeader, + TableRow, +} from '@/components/ui/table'; +import { useBroadcastChannelDoc } from '@/lib/broadcastChannel'; +import { useOrpcClient } from '@/lib/orpc/OrpcClientContext'; +import { usePlaygroundEnvironment } from '@/lib/playground/environment'; +import { useShareDbDoc, type ShareDbDocStatus } from '@/lib/shareDb'; + +const getErrorMessage = (error: unknown, fallback: string): string => { + if (error instanceof Error) return error.message; + if (typeof error === 'string') return error; + return 
fallback; +}; + +const isEmptyRecordValue = (value: unknown): boolean => + value === undefined || + value === null || + value === '' || + (Array.isArray(value) && value.length === 0); + +type LinkValueItem = { + id: string; + label: string; +}; + +const resolveLinkLabel = (value: unknown): string | null => { + if (value === undefined || value === null) return null; + if (typeof value === 'string') return value; + if (typeof value === 'object') { + const candidate = value as { title?: unknown; name?: unknown; id?: unknown }; + if (typeof candidate.title === 'string') return candidate.title; + if (typeof candidate.name === 'string') return candidate.name; + if (typeof candidate.id === 'string') return candidate.id; + } + return null; +}; + +const resolveLinkId = (value: unknown): string | null => { + if (typeof value === 'string') return value; + if (typeof value === 'object' && value !== null) { + const candidate = value as { id?: unknown }; + if (typeof candidate.id === 'string') return candidate.id; + } + return null; +}; + +const extractLinkValues = (value: unknown): LinkValueItem[] => { + if (isEmptyRecordValue(value)) return []; + const values = Array.isArray(value) ? value : [value]; + return values + .map((entry) => { + const id = resolveLinkId(entry); + if (!id) return null; + const label = resolveLinkLabel(entry) ?? 
id; + return { id, label }; + }) + .filter((entry): entry is LinkValueItem => Boolean(entry)); +}; + +type PlaygroundRecordRouteProps = { + baseId: string; + tableId: string; + recordId: string; +}; + +export function PlaygroundRecordRoute({ baseId, tableId, recordId }: PlaygroundRecordRouteProps) { + const env = usePlaygroundEnvironment(); + const navigate = useNavigate(); + const orpcClient = useOrpcClient(); + const orpc = createTanstackQueryUtils(orpcClient); + const queryClient = useQueryClient(); + + const tableQuery = useQuery( + orpc.tables.getById.queryOptions({ + input: { baseId, tableId }, + placeholderData: keepPreviousData, + select: (response) => response.data.table, + }) + ); + + const recordQuery = useQuery( + orpc.tables.getRecord.queryOptions({ + input: { tableId, recordId }, + enabled: Boolean(recordId), + placeholderData: keepPreviousData, + select: (response) => response.data.record, + }) + ); + + // Realtime subscription for single record + const isSandbox = env.kind === 'sandbox'; + const realtimeRecordCollection = useMemo(() => `rec_${tableId}`, [tableId]); + + const shareDbRecord = useShareDbDoc({ + collection: realtimeRecordCollection, + docId: recordId, + enabled: !isSandbox && !!tableId && !!recordId, + }); + + const broadcastRecord = useBroadcastChannelDoc({ + collection: realtimeRecordCollection, + docId: recordId, + enabled: isSandbox && !!tableId && !!recordId, + }); + + const realtimeRecord = isSandbox ? 
broadcastRecord : shareDbRecord; + + // Sync realtime data to TanStack Query cache + useEffect(() => { + if (!realtimeRecord.data) return; + + const queryKey = orpc.tables.getRecord.queryOptions({ + input: { tableId, recordId }, + }).queryKey; + + type RecordQueryData = { ok: true; data: { record: ITableRecordDto } }; + + queryClient.setQueryData<RecordQueryData>(queryKey, (oldData) => { + if (!oldData?.data?.record) return oldData; + + // Merge realtime fields into cached record + return { + ...oldData, + data: { + ...oldData.data, + record: { + ...oldData.data.record, + fields: { + ...oldData.data.record.fields, + ...realtimeRecord.data!.fields, + }, + }, + }, + }; + }); + }, [realtimeRecord.data, queryClient, orpc, tableId, recordId]); + + console.log('[PlaygroundRecordRoute] render', { + realtimeRecordData: realtimeRecord.data, + realtimeRecordStatus: realtimeRecord.status, + }); + + const tableResult = useMemo( + () => (tableQuery.data ? mapTableDtoToDomain(tableQuery.data) : null), + [tableQuery.data] + ); + const table = tableResult?.isOk() ? tableResult.value : null; + const mappingError = tableResult?.isErr() ? tableResult.error.message : null; + const record = recordQuery.data ?? 
null; + const [isUpdateOpen, setIsUpdateOpen] = useState(false); + + const errorMessage = (() => { + if (mappingError) return mappingError; + if (tableQuery.error) return getErrorMessage(tableQuery.error, 'Failed to load table'); + if (recordQuery.error) return getErrorMessage(recordQuery.error, 'Failed to load record'); + return null; + })(); + + const isLoading = tableQuery.isLoading || recordQuery.isLoading; + + const sortedFields = useMemo(() => { + if (!table) return [] as Field[]; + const primaryFieldId = table.primaryFieldId().toString(); + return [...table.getFields()].sort((a, b) => { + const aIsPrimary = a.id().toString() === primaryFieldId; + const bIsPrimary = b.id().toString() === primaryFieldId; + if (aIsPrimary) return -1; + if (bIsPrimary) return 1; + return 0; + }); + }, [table]); + + const handleBack = () => { + void navigate({ + to: env.routes.table, + params: { baseId, tableId }, + search: (prev) => prev, + }); + }; + + const resolveRecordHref = (targetBaseId: string, targetTableId: string, linkedRecordId: string) => + env.routes.record + .replace('$baseId', targetBaseId) + .replace('$tableId', targetTableId) + .replace('$recordId', linkedRecordId); + + return ( +
+
+
+ +
+
+ Record detail + +
+
{recordId}
+
+
+
+ {table && record ? ( + + ) : null} + +
+
+ +
+ {errorMessage ? ( + + + + {errorMessage} + + + ) : isLoading ? ( + + + Loading record... + + + Fetching the latest data for this record. + + + ) : !table || !record ? ( + + + Record not found + + + We couldn't locate this record in the selected table. + + + ) : ( + <> + + + + )} + {table && record ? ( + void recordQuery.refetch()} + /> + ) : null} +
+
+
+ ); +} + +type RecordDetailCardProps = { + table: TableAggregate; + record: ITableRecordDto; + fields: Field[]; + baseId: string; + resolveRecordHref: (targetBaseId: string, targetTableId: string, recordId: string) => string; +}; + +function RecordDetailCard({ + table, + record, + fields, + baseId, + resolveRecordHref, +}: RecordDetailCardProps) { + return ( + + + {table.name().toString()} + + + + + + Field + Value + Type + + + + {fields.map((field) => { + const fieldId = field.id().toString(); + const value = record.fields[fieldId]; + const fieldType = field.type().toString(); + const fieldName = field.name().toString(); + const isLinkField = fieldType === 'link'; + const linkField = isLinkField ? (field as LinkField) : null; + const FieldIcon = getFieldTypeIcon(fieldType); + + let valueNode: ReactNode = null; + + if (fieldType === 'link') { + const linkItems = extractLinkValues(value); + const linkField = field as LinkField; + const targetBaseId = linkField.baseId()?.toString() ?? baseId; + const targetTableId = linkField.foreignTableId().toString(); + + valueNode = linkItems.length ? ( +
+ {linkItems.map((item) => ( + + {item.label} + + ))} +
+ ) : ( + - + ); + } else { + const formattedValue = formatRecordValue(field, value); + valueNode = formattedValue.node; + } + + return ( + + +
+ + {isLinkField && linkField ? ( + + ) : ( + {fieldName} + )} +
+
+ +
{valueNode}
+
+ + {field.type().toString()} + +
+ ); + })} +
+
+
+
+ ); +} + +type RealtimeStatusBadgeProps = { + status: ShareDbDocStatus; +}; + +function RealtimeStatusBadge({ status }: RealtimeStatusBadgeProps) { + const statusLabel = + status === 'ready' + ? 'Live' + : status === 'connecting' + ? 'Connecting' + : status === 'error' + ? 'Error' + : 'Idle'; + const variant = status === 'ready' ? 'secondary' : status === 'error' ? 'destructive' : 'outline'; + + return ( + + {status === 'ready' ? : null} + {statusLabel} + + ); +} + +type RealtimeRecordCardProps = { + realtimeRecord: ITableRecordRealtimeDTO | null; + status: ShareDbDocStatus; + error: string | null; +}; + +function RealtimeRecordCard({ realtimeRecord, status, error }: RealtimeRecordCardProps) { + console.log('[RealtimeRecordCard] render', { realtimeRecord, status, error }); + return ( + + +
+ Realtime Snapshot + +
+
+ + {error ? ( +
Realtime error: {error}
+ ) : !realtimeRecord ? ( +
+ {status === 'connecting' ? 'Connecting to ShareDB...' : 'Waiting for realtime data.'} +
+ ) : ( +
+
+ Record ID + {realtimeRecord.id} +
+
+ Table ID + {realtimeRecord.tableId} +
+
+
+ Fields ({Object.keys(realtimeRecord.fields).length}) +
+
+ {Object.entries(realtimeRecord.fields).map(([fieldId, value]) => ( +
+ + {fieldId} + +
+ {value === null || value === undefined ? ( + - + ) : typeof value === 'object' ? ( + {JSON.stringify(value)} + ) : ( + {String(value)} + )} +
+
+ ))} +
+
+
+ )} +
+
+ ); +} diff --git a/apps/playground/src/components/playground/PlaygroundShell.tsx b/apps/playground/src/components/playground/PlaygroundShell.tsx new file mode 100644 index 0000000000..5205bbda3f --- /dev/null +++ b/apps/playground/src/components/playground/PlaygroundShell.tsx @@ -0,0 +1,1153 @@ +import type { IBaseDto, ITableDto } from '@teable/v2-contract-http'; +import { Link, useNavigate } from '@tanstack/react-router'; +import { + ArrowRight, + Check, + ChevronDown, + ChevronsUpDown, + Cog, + Copy, + Database, + FlaskConical, + GalleryVerticalEnd, + Globe, + Pin, + Plus, + Search, + Table as TableIcon, + Trash2, + TriangleAlert, +} from 'lucide-react'; +import { useCallback, useEffect, useState, useRef, type FormEvent, type ReactNode } from 'react'; +import { useCopyToClipboard, useLocalStorage } from 'usehooks-ts'; +import { toast } from 'sonner'; +import { + Sidebar, + SidebarContent, + SidebarFooter, + SidebarGroup, + SidebarGroupContent, + SidebarGroupLabel, + SidebarHeader, + SidebarInput, + SidebarInset, + SidebarMenu, + SidebarMenuAction, + SidebarMenuBadge, + SidebarMenuButton, + SidebarMenuItem, + SidebarMenuSkeleton, + SidebarProvider, + SidebarRail, + SidebarSeparator, +} from '@/components/ui/sidebar'; +import { ScrollArea } from '@/components/ui/scroll-area'; +import { + DropdownMenu, + DropdownMenuContent, + DropdownMenuGroup, + DropdownMenuItem, + DropdownMenuLabel, + DropdownMenuSeparator, + DropdownMenuTrigger, +} from '@/components/ui/dropdown-menu'; +import { + AlertDialog, + AlertDialogAction, + AlertDialogCancel, + AlertDialogContent, + AlertDialogDescription, + AlertDialogFooter, + AlertDialogHeader, + AlertDialogTitle, +} from '@/components/ui/alert-dialog'; +import { Badge } from '@/components/ui/badge'; +import { Button } from '@/components/ui/button'; +import { + Dialog, + DialogContent, + DialogDescription, + DialogFooter, + DialogHeader, + DialogTitle, +} from '@/components/ui/dialog'; +import { Input } from 
'@/components/ui/input'; +import { Label } from '@/components/ui/label'; +import { Switch } from '@/components/ui/switch'; +import { Textarea } from '@/components/ui/textarea'; +import { + usePlaygroundEnvironment, + resolvePlaygroundEnvironment, +} from '@/lib/playground/environment'; +import { + PLAYGROUND_DB_CONNECTIONS_STORAGE_KEY, + PLAYGROUND_DB_URL_STORAGE_KEY, + createPlaygroundDbConnectionId, + findPlaygroundDbConnectionByUrl, + formatPlaygroundDbUrlLabel, + isValidPlaygroundDbUrl, + maskPlaygroundDbUrl, + normalizePlaygroundDbUrl, + resolvePlaygroundDbStorageKey, + sortPlaygroundDbConnections, + type PlaygroundDbConnection, +} from '@/lib/playground/databaseUrl'; +import { cn } from '@/lib/utils'; + +type PlaygroundShellProps = { + baseId: string; + bases: ReadonlyArray; + isLoadingBases: boolean; + onCreateBase: (name: string) => void; + isCreatingBase: boolean; + activeTableId: string | null; + tables: ReadonlyArray; + isInitialLoading: boolean; + errorMessage: string | null; + searchValue: string; + onSearchChange: (value: string) => void; + onDeleteTable: (table: ITableDto) => void; + isDeletingTable: boolean; + children: ReactNode; +}; + +type DbConnectionDraft = { + id: string | null; + name: string; + description: string; + url: string; + pinned: boolean; +}; + +type NavigationTarget = + | { to: string; params: { baseId: string } } + | { to: string; params: { baseId: string; tableId: string } } + | { to: string; params?: undefined }; + +export function PlaygroundShell({ + baseId, + bases, + isLoadingBases, + onCreateBase, + isCreatingBase, + activeTableId, + tables, + isInitialLoading, + errorMessage, + searchValue, + onSearchChange, + onDeleteTable, + isDeletingTable, + children, +}: PlaygroundShellProps) { + const env = usePlaygroundEnvironment(); + const isSandbox = env.kind === 'sandbox'; + + return ( +
+
+
+ {isSandbox ? ( +
+
+ SANDBOX +
+
+ ) : null} +
+ + + + {children} + + +
+
+ ); +} + +type PlaygroundSidebarProps = { + baseId: string; + bases: ReadonlyArray; + isLoadingBases: boolean; + onCreateBase: (name: string) => void; + isCreatingBase: boolean; + activeTableId: string | null; + tables: ReadonlyArray; + isInitialLoading: boolean; + errorMessage: string | null; + searchValue: string; + onSearchChange: (value: string) => void; + onDeleteTable: (table: ITableDto) => void; + isDeletingTable: boolean; +}; + +function PlaygroundSidebar({ + baseId, + bases, + isLoadingBases, + onCreateBase, + isCreatingBase, + activeTableId, + tables, + isInitialLoading, + errorMessage, + searchValue, + onSearchChange, + onDeleteTable, + isDeletingTable, +}: PlaygroundSidebarProps) { + const navigate = useNavigate(); + const env = usePlaygroundEnvironment(); + const isSandbox = env.kind === 'sandbox'; + const sandboxEnv = resolvePlaygroundEnvironment('/sandbox'); + const remoteEnv = resolvePlaygroundEnvironment('/'); + const activeEnv = isSandbox ? sandboxEnv : remoteEnv; + const [nextBaseId, setNextBaseId] = useState(baseId); + const [baseDropdownOpen, setBaseDropdownOpen] = useState(false); + const [newBaseName, setNewBaseName] = useState(''); + const [deleteTarget, setDeleteTarget] = useState(null); + const menuRef = useRef(null); + const [dbManagerOpen, setDbManagerOpen] = useState(false); + const [connectionDraft, setConnectionDraft] = useState({ + id: null, + name: '', + description: '', + url: '', + pinned: false, + }); + const [connectionError, setConnectionError] = useState(null); + const [connectionTestStatus, setConnectionTestStatus] = useState< + 'idle' | 'loading' | 'success' | 'error' + >('idle'); + const [connectionTestMessage, setConnectionTestMessage] = useState(null); + const [dbUrl, setDbUrl, removeDbUrl] = useLocalStorage( + PLAYGROUND_DB_URL_STORAGE_KEY, + null, + { initializeWithValue: false } + ); + const [dbConnections, setDbConnections] = useLocalStorage( + PLAYGROUND_DB_CONNECTIONS_STORAGE_KEY, + [], + { initializeWithValue: 
false } + ); + const [, copyToClipboard] = useCopyToClipboard(); + + const resetConnectionDraft = useCallback((connection?: PlaygroundDbConnection) => { + if (connection) { + setConnectionDraft({ + id: connection.id, + name: connection.name, + description: connection.description ?? '', + url: connection.url, + pinned: Boolean(connection.pinned), + }); + } else { + setConnectionDraft({ + id: null, + name: '', + description: '', + url: '', + pinned: false, + }); + } + setConnectionError(null); + setConnectionTestStatus('idle'); + setConnectionTestMessage(null); + }, []); + + const updateConnectionDraft = (updates: Partial) => { + setConnectionDraft((prev) => ({ ...prev, ...updates })); + setConnectionError(null); + setConnectionTestStatus('idle'); + setConnectionTestMessage(null); + }; + + useEffect(() => { + setNextBaseId(baseId); + }, [baseId]); + + useEffect(() => { + if (activeTableId && menuRef.current) { + const activeElement = menuRef.current.querySelector('[data-active="true"]'); + activeElement?.scrollIntoView({ + behavior: 'smooth', + block: 'nearest', + }); + } + }, [activeTableId]); + + useEffect(() => { + if (!dbManagerOpen) return; + resetConnectionDraft(); + }, [dbManagerOpen, resetConnectionDraft]); + + const trimmedBaseId = nextBaseId.trim(); + const canSwitchBase = trimmedBaseId.length > 0 && trimmedBaseId !== baseId; + const tableSkeletonKeys = ['table-skeleton-0', 'table-skeleton-1', 'table-skeleton-2']; + + const handleBaseSubmit = (event: FormEvent) => { + event.preventDefault(); + if (!canSwitchBase) return; + void navigate({ + to: env.routes.base, + params: { baseId: trimmedBaseId }, + search: {}, + }); + }; + + const handleDeleteConfirm = () => { + if (!deleteTarget) return; + onDeleteTable(deleteTarget); + setDeleteTarget(null); + }; + + const reloadPlayground = () => { + if (typeof window !== 'undefined') { + window.location.reload(); + } + }; + + const handleConnectionSave = (event: FormEvent) => { + event.preventDefault(); + const 
trimmedName = connectionDraft.name.trim(); + const trimmedUrl = normalizePlaygroundDbUrl(connectionDraft.url); + const trimmedDescription = connectionDraft.description.trim(); + + if (!trimmedName) { + setConnectionError('Connection name is required.'); + return; + } + if (!trimmedUrl) { + setConnectionError('Enter a database URL first.'); + return; + } + if (!isValidPlaygroundDbUrl(trimmedUrl)) { + setConnectionError('Use a postgres:// or postgresql:// URL.'); + return; + } + + const now = Date.now(); + const existing = connectionDraft.id + ? dbConnections.find((item) => item.id === connectionDraft.id) + : undefined; + const nextConnection: PlaygroundDbConnection = { + id: connectionDraft.id ?? createPlaygroundDbConnectionId(), + name: trimmedName, + description: trimmedDescription ? trimmedDescription : undefined, + url: trimmedUrl, + pinned: connectionDraft.pinned, + createdAt: existing?.createdAt ?? now, + lastUsedAt: existing?.lastUsedAt, + }; + + const nextConnections = connectionDraft.id + ? dbConnections.map((item) => (item.id === connectionDraft.id ? 
nextConnection : item)) + : [...dbConnections, nextConnection]; + + setDbConnections(nextConnections); + + if ( + existing && + dbUrl && + normalizePlaygroundDbUrl(existing.url) === normalizePlaygroundDbUrl(dbUrl) + ) { + if (normalizePlaygroundDbUrl(existing.url) !== normalizePlaygroundDbUrl(nextConnection.url)) { + setDbUrl(nextConnection.url); + setDbManagerOpen(false); + reloadPlayground(); + return; + } + } + + resetConnectionDraft(); + }; + + const handleConnectionTest = async () => { + const trimmedUrl = normalizePlaygroundDbUrl(connectionDraft.url); + if (!trimmedUrl) { + setConnectionTestStatus('error'); + setConnectionTestMessage('Enter a database URL first.'); + return; + } + if (!isValidPlaygroundDbUrl(trimmedUrl)) { + setConnectionTestStatus('error'); + setConnectionTestMessage('Use a postgres:// or postgresql:// URL.'); + return; + } + setConnectionTestStatus('loading'); + setConnectionTestMessage('Testing connection...'); + try { + const response = await fetch('/api/db/check', { + method: 'POST', + headers: { + 'Content-Type': 'application/json', + }, + body: JSON.stringify({ connectionString: trimmedUrl }), + }); + const payload = (await response.json().catch(() => null)) as { + ok?: boolean; + error?: string; + } | null; + if (!response.ok || payload?.ok === false) { + const message = payload?.error ?? 'Connection failed.'; + setConnectionTestStatus('error'); + setConnectionTestMessage(message); + return; + } + setConnectionTestStatus('success'); + setConnectionTestMessage('Connection OK.'); + } catch (error) { + const message = error instanceof Error ? 
error.message : 'Connection failed.'; + setConnectionTestStatus('error'); + setConnectionTestMessage(message); + } + }; + + const handleConnectionCopy = useCallback( + async (connection: PlaygroundDbConnection) => { + const didCopy = await copyToClipboard(connection.url); + if (didCopy) { + toast.success('Database URL copied to clipboard'); + } else { + toast.error('Failed to copy database URL'); + } + }, + [copyToClipboard] + ); + + const handleConnectionEdit = (connection: PlaygroundDbConnection) => { + setDbManagerOpen(true); + resetConnectionDraft(connection); + }; + + const handleConnectionDelete = (connection: PlaygroundDbConnection) => { + setDbConnections(dbConnections.filter((item) => item.id !== connection.id)); + if (dbUrl && normalizePlaygroundDbUrl(connection.url) === normalizePlaygroundDbUrl(dbUrl)) { + void handleUseDefault(); + } + }; + + const handleConnectionSwitch = async (connection: PlaygroundDbConnection) => { + const now = Date.now(); + setDbConnections( + dbConnections.map((item) => (item.id === connection.id ? { ...item, lastUsedAt: now } : item)) + ); + setDbUrl(connection.url); + setDbManagerOpen(false); + const next = resolveTargetPath(activeEnv, { + connectionId: connection.id, + dbUrl: connection.url, + }); + await navigateToTarget(next); + reloadPlayground(); + }; + + const handleUseDefault = async () => { + removeDbUrl(); + setDbManagerOpen(false); + const next = resolveTargetPath(activeEnv, { connectionId: null, dbUrl: null }); + await navigateToTarget(next); + reloadPlayground(); + }; + + const sortedConnections = sortPlaygroundDbConnections(dbConnections); + const activeConnection = findPlaygroundDbConnectionByUrl(dbConnections, dbUrl); + const dbLabel = dbUrl + ? activeConnection?.name ?? 
formatPlaygroundDbUrlLabel(dbUrl) + : 'Default (.env)'; + + const readStoredValue = (key: string): string | null => { + if (typeof window === 'undefined') return null; + const raw = window.localStorage.getItem(key); + if (!raw) return null; + try { + const parsed = JSON.parse(raw); + if (typeof parsed === 'string') return parsed.trim() || null; + if (parsed === null || parsed === undefined) return null; + } catch { + return raw.trim() || null; + } + return null; + }; + + const resolveStorageKeys = ( + target: typeof activeEnv, + options: { connectionId?: string | null; dbUrl?: string | null } + ) => { + if (target.kind === 'sandbox') { + return target.storageKeys; + } + return { + baseId: resolvePlaygroundDbStorageKey(target.storageKeys.baseId, options), + tableId: resolvePlaygroundDbStorageKey(target.storageKeys.tableId, options), + }; + }; + + const resolveTargetPath = ( + target: typeof activeEnv, + options: { connectionId?: string | null; dbUrl?: string | null } + ): NavigationTarget => { + if (typeof window === 'undefined') { + return { to: target.routes.base, params: { baseId: target.defaults.baseId } }; + } + const storageKeys = resolveStorageKeys(target, options); + const storedBaseId = readStoredValue(storageKeys.baseId); + const storedTableId = readStoredValue(storageKeys.tableId); + const baseId = storedBaseId || (target.kind === 'sandbox' ? 
target.defaults.baseId : null); + if (!baseId) { + return { to: target.routes.index }; + } + if (storedTableId) { + return { to: target.routes.table, params: { baseId, tableId: storedTableId } }; + } + return { to: target.routes.base, params: { baseId } }; + }; + + const navigateToTarget = async (target: NavigationTarget) => { + if (target.params) { + await navigate({ to: target.to, params: target.params, search: {} }); + return; + } + await navigate({ to: target.to, search: {} }); + }; + + const handleEnvSwitch = (target: typeof activeEnv) => { + const next = resolveTargetPath(target, { + connectionId: activeConnection?.id ?? null, + dbUrl, + }); + void navigateToTarget(next); + }; + + return ( + <> + + + + + +
+
+ +
+
+ + Teable v2 + + + Playground + +
+
+
+
+
+ + + + Base + + +
+ + + + + + Switch Base + +
+ + {isLoadingBases ? ( + Loading bases... + ) : bases.length ? ( + bases.map((base) => ( + { + setBaseDropdownOpen(false); + void navigate({ + to: env.routes.base, + params: { baseId: base.id }, + search: {}, + }); + }} + > + + {base.name} + + )) + ) : ( + No bases found + )} + +
+ + +
+
{ + e.preventDefault(); + const name = newBaseName.trim(); + if (name) { + onCreateBase(name); + setNewBaseName(''); + setBaseDropdownOpen(false); + } + }} + className="flex items-center gap-1" + > + setNewBaseName(e.target.value)} + className="h-7 text-xs" + disabled={isCreatingBase} + /> + +
+
+
+
+
+
+ setNextBaseId(event.target.value)} + aria-label="Base ID" + spellCheck={false} + className="h-8 text-xs bg-background/70 border-border/60 focus:border-primary/40" + /> + + +
+
+
+ + + + Tables + + +
+
+ + onSearchChange(event.target.value)} + maxLength={255} + aria-label="Search tables" + className="pl-8 bg-background/70 border-border/60 focus:border-primary/40" + /> +
+
+
+
+ +
+ + + + + +
+ {isInitialLoading ? ( + + {tableSkeletonKeys.map((key) => ( + + + + ))} + + ) : errorMessage ? ( +
+ + {errorMessage} +
+ ) : tables.length ? ( + + {tables.map((table) => { + const isActive = table.id === activeTableId; + return ( + + + ({ + ...prev, + ...(searchValue ? { q: searchValue } : {}), + })} + > + + {table.name} + + + setDeleteTarget(table)} + aria-label={`Delete ${table.name}`} + disabled={isDeletingTable} + > + + + + {table.fields.length} + + + ); + })} + + ) : ( +
+
+ +
+

No tables found

+

+ Create a table to get started +

+
+ )} +
+
+
+
+
+ + + + + +
+ +
+
+
+ System + Computed Tasks +
+
+ +
+
+ + + + +
+ {isSandbox ? ( + + ) : ( + + )} +
+
+
+ + Environment + + + {isSandbox ? 'Sandbox' : 'Remote'} + +
+ +
+
+
+ + Switch environment + + + handleEnvSwitch(remoteEnv)} + disabled={activeEnv.kind === 'remote'} + > + + Remote + + handleEnvSwitch(sandboxEnv)} + disabled={activeEnv.kind === 'sandbox'} + > + + Sandbox + + + + Database + + setDbManagerOpen(true)} + > + + Manage connections + + {dbUrl ? ( + + + Use default (.env) + + ) : null} + + + Saved connections + + {sortedConnections.length ? ( + sortedConnections.map((connection) => { + const isActive = activeConnection?.id === connection.id; + return ( + handleConnectionSwitch(connection)} + disabled={isActive} + > + + {connection.name} + {connection.pinned ? ( + + ) : null} + + ); + }) + ) : ( + + No saved connections yet + + )} + {`Active: ${dbLabel}`} + + +
+
+
+
+ +
+ + + + Database connections + + Store multiple database URLs locally, switch quickly, and reload the playground when + needed. + + +
+
+
+
+

+ {connectionDraft.id ? 'Edit connection' : 'New connection'} +

+

+ {connectionDraft.id + ? 'Update name, description, or URL.' + : 'Add a connection saved in this browser.'} +

+
+ {connectionDraft.id ? ( + + ) : null} +
+
+
+ + updateConnectionDraft({ name: event.target.value })} + spellCheck={false} + /> +
+
+ +