
Conversation

@ymc9
Member

@ymc9 ymc9 commented Jul 2, 2025

Summary by CodeRabbit

  • New Features

    • Enhanced CLI with options to suppress output (--silent) and to save the generated Prisma schema to a custom or default location.
    • Expanded support for array and complex default values in schema generation.
    • Improved CLI commands for database and migration operations with better temporary file handling and option flexibility.
    • Added new utility functions and helpers for CLI and test environments.
  • Bug Fixes

    • Improved error handling and resource cleanup for CLI commands involving temporary schema files.
  • Tests

    • Introduced comprehensive test suites for CLI commands (generate, db, migrate, init) and end-to-end schema validation for real-world projects (Cal.com, Formbricks, Trigger.dev).
    • Added utilities to streamline test project setup and CLI execution in tests.
  • Chores

    • Updated package versions to 3.0.0-alpha.7 across all packages.
    • Included additional directories in workspace configuration for better test coverage.
  • Documentation

    • No user-facing documentation changes included in this release.

ymc9 and others added 4 commits June 30, 2025 09:28
* fix: several issues with ts schema generation

* addressing review comments
* fix: schema generation for unsupported field type

* fix build

* update

* more tests

* update
Copilot AI review requested due to automatic review settings July 2, 2025 08:10
@coderabbitai

coderabbitai bot commented Jul 2, 2025

Walkthrough

This update introduces temporary Prisma schema file generation and cleanup for CLI database and migration commands, enhances test coverage with new CLI and end-to-end tests, and adds support for complex default values in schema generation. Several packages are version-bumped, and workspace/test infrastructure is improved for better modularity and extensibility.

Changes

File(s) / Group Change Summary
.github/workflows/publish-release.yml Removed prerelease flag from GitHub release workflow.
package.json, packages/*/package.json, samples/blog/package.json Version bumped from 3.0.0-alpha.6 to 3.0.0-alpha.7.
packages/cli/package.json Version bump, reordered dependencies, added tmp and @types/tmp as devDependencies.
packages/cli/src/actions/action-utils.ts Added generateTempPrismaSchema for temp schema file generation.
packages/cli/src/actions/db.ts Refactored to use temp schema files, extended options, improved error/resource handling.
packages/cli/src/actions/generate.ts Added savePrismaSchema option for conditional Prisma schema output.
packages/cli/src/actions/migrate.ts Refactored to use temp schema files, introduced typed options, improved cleanup and argument handling.
packages/cli/src/index.ts Added --silent and --save-prisma-schema options to generate command.
packages/cli/test/db.test.ts, packages/cli/test/generate.test.ts, packages/cli/test/init.test.ts, packages/cli/test/migrate.test.ts Added new Vitest-based CLI command tests.
packages/cli/test/utils.ts Added helpers for project setup and CLI invocation in tests.
packages/common-helpers/package.json, packages/create-zenstack/package.json, packages/eslint-config/package.json, packages/language/package.json, packages/runtime/package.json, packages/sdk/package.json, packages/tanstack-query/package.json, packages/typescript-config/package.json, packages/zod/package.json Version bumps only.
packages/ide/vscode/package.json Changed version from stable to pre-release alpha.
packages/sdk/src/schema/schema.ts Allowed default in FieldDef to accept arrays.
packages/sdk/src/ts-schema-generator.ts Enhanced default value handling, added array and complex expression support, unified type literal generation, deduplicated unique/id keys.
packages/testtools/src/index.ts Re-exported all from ./project.
packages/testtools/src/project.ts Added createTestProject for temp project env setup.
packages/testtools/src/schema.ts Refactored to use createTestProject for test env setup.
pnpm-workspace.yaml Included tests/** in workspace packages.
tests/e2e/package.json, tests/e2e/vitest.config.ts Added E2E test package and Vitest config.
tests/e2e/cal.com/cal-com.test.ts, tests/e2e/formbricks/formbricks.test.ts, tests/e2e/trigger.dev/trigger-dev.test.ts Added E2E test files for real-world schemas.
tests/e2e/cal.com/schema.zmodel, tests/e2e/formbricks/schema.zmodel, tests/e2e/trigger.dev/schema.zmodel Added comprehensive real-world Prisma schemas for E2E testing.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant CLI
    participant ActionUtils
    participant FileSystem
    participant PrismaCLI

    User->>CLI: Run db/migrate command
    CLI->>ActionUtils: generateTempPrismaSchema(zmodelPath)
    ActionUtils->>FileSystem: Write temp schema file
    ActionUtils-->>CLI: Return temp schema path
    CLI->>PrismaCLI: Execute command with temp schema
    PrismaCLI-->>CLI: Command completes
    CLI->>FileSystem: Delete temp schema file
    CLI-->>User: Output result
sequenceDiagram
    participant User
    participant CLI
    participant GenerateAction
    participant FileSystem

    User->>CLI: Run generate command [--save-prisma-schema [path]]
    CLI->>GenerateAction: run(options)
    alt savePrismaSchema specified
        GenerateAction->>FileSystem: Write Prisma schema to path
    end
    GenerateAction->>FileSystem: Write TypeScript schema
    CLI-->>User: Output result
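In code terms, the flow in the first diagram amounts to roughly the following (a minimal sketch only — generateTempPrismaSchema and execPackage appear in the PR's diffs, but the wrapper below, its name, and the exact option handling in db.ts/migrate.ts are illustrative assumptions):

import fs from 'node:fs';
import { generateTempPrismaSchema } from './action-utils';
import { execPackage } from '../utils/exec-utils'; // assumed import path for the exec helper

// Hypothetical wrapper: write a temp Prisma schema next to the zmodel, run the
// Prisma CLI against it, and always remove the temp file afterwards.
async function runWithTempSchema(zmodelPath: string, prismaArgs: string) {
    const prismaSchemaFile = await generateTempPrismaSchema(zmodelPath);
    try {
        await execPackage(`prisma ${prismaArgs} --schema "${prismaSchemaFile}"`, { stdio: 'inherit' });
    } finally {
        // best-effort cleanup of the temporary schema file
        if (fs.existsSync(prismaSchemaFile)) {
            fs.rmSync(prismaSchemaFile, { force: true });
        }
    }
}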

Poem

A rabbit hops with schema delight,
Temp files appear, then vanish from sight.
CLI commands now tested with care,
Real-world models everywhere!
With each new version, we leap ahead—
Arrays for defaults, more tests to embed.
🐇✨ The warren grows strong, onward we tread!

Warning

There were issues while running some tools. Please review the errors and either fix the tool's configuration or disable the tool if it's a critical failure.

🔧 ESLint

If the error stems from missing dependencies, add them to the package.json file. For unrecoverable errors (e.g., due to private dependencies), disable the tool in the CodeRabbit configuration.

packages/cli/src/actions/action-utils.ts

Oops! Something went wrong! :(

ESLint: 9.29.0

ESLint couldn't find an eslint.config.(js|mjs|cjs) file.

From ESLint v9.0.0, the default configuration file is now eslint.config.js.
If you are using a .eslintrc.* file, please follow the migration guide
to update your configuration file to the new format:

https://eslint.org/docs/latest/use/configure/migration-guide

If you still have problems after following the migration guide, please stop by
https://eslint.org/chat/help to chat with the team.

The same ESLint configuration error was reported for:

packages/cli/src/actions/migrate.ts
packages/cli/src/actions/db.ts

  • 16 others


Contributor

Copilot AI left a comment


Pull Request Overview

This PR merges development branch updates into main and introduces end-to-end schema generation tests for Trigger.dev and Formbricks.

  • Added a Vitest config in tests/e2e/vitest.config.ts to extend the base test setup.
  • Created e2e tests for both Trigger.dev and Formbricks schemas with their corresponding Prisma schema fixtures.
  • Introduced a dedicated tests/e2e/package.json to manage e2e dependencies and run scripts.

Reviewed Changes

Copilot reviewed 39 out of 40 changed files in this pull request and generated 2 comments.

Show a summary per file
File Description
tests/e2e/vitest.config.ts Merge base Vitest config for e2e tests
tests/e2e/trigger.dev/trigger-dev.test.ts Add e2e test for Trigger.dev schema generation
tests/e2e/trigger.dev/schema.zmodel Add Prisma schema fixture for Trigger.dev
tests/e2e/formbricks/formbricks.test.ts Add e2e test for Formbricks schema generation
tests/e2e/formbricks/schema.zmodel Add Prisma schema fixture for Formbricks
tests/e2e/package.json Define e2e scripts and dependencies
Files not reviewed (1)
  • pnpm-lock.yaml: Language not supported
Comments suppressed due to low confidence (2)

tests/e2e/trigger.dev/trigger-dev.test.ts:6

  • [nitpick] Directory name trigger.dev contains a dot while the test filename uses hyphens (trigger-dev.test.ts); consider renaming the directory to trigger-dev or aligning naming conventions for clarity and cross-platform consistency.
describe('Trigger.dev e2e tests', () => {

tests/e2e/trigger.dev/trigger-dev.test.ts:8

  • This test only asserts that generateTsSchema resolves truthily. To catch regressions in schema output, add snapshot tests or specific assertions on the generated TypeScript definitions.
        await expect(

@ymc9 ymc9 added this pull request to the merge queue Jul 2, 2025
Merged via the queue into main with commit f638291 Jul 2, 2025
5 of 6 checks passed

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 10

🔭 Outside diff range comments (1)
packages/cli/src/actions/migrate.ts (1)

97-103: Consolidate handleSubProcessError into a single implementation

The helper is already exported from packages/cli/src/actions/action-utils.ts; migrate.ts should import and reuse it rather than define a duplicate.

• In packages/cli/src/actions/migrate.ts, update the import:

-import { generateTempPrismaSchema, getSchemaFile } from './action-utils';
+import { generateTempPrismaSchema, getSchemaFile, handleSubProcessError } from './action-utils';

• Remove the local definition at the bottom of the file (lines 97–103):

- function handleSubProcessError(err: unknown) {
-     if (err instanceof Error && 'status' in err && typeof err.status === 'number') {
-         process.exit(err.status);
-     } else {
-         process.exit(1);
-     }
- }
🧹 Nitpick comments (9)
package.json (1)

3-3: Version bump looks good – mind the supporting release assets

The jump to 3.0.0-alpha.7 is consistent with the rest of the workspace.
Just verify that the changelog and Git tags are generated in the same commit so automation (e.g., turbo release, npm publish) can detect the correct version.

packages/tanstack-query/package.json (1)

3-3: Version bump acknowledged – double-check peer range

@zenstackhq/runtime is still referenced via workspace:*, which resolves correctly, but make sure the published tarball ends up with the final semver range (usually replaced by the publish script).
No blocking issue, just a reminder.

tests/e2e/cal.com/schema.zmodel (2)

34-34: Fix typo in comment.

-    // weightAdjustment is deprecated. We not calculate the calibratino value on the spot. Plan to drop this column.
+    // weightAdjustment is deprecated. We now calculate the calibration value on the spot. Plan to drop this column.

26-2366: Well-structured schema with comprehensive modeling.

The schema demonstrates good practices:

  • Appropriate indexes on foreign keys and frequently queried fields
  • Proper cascade delete configurations for referential integrity
  • Clear deprecation comments for fields planned for removal
  • JSON field validation using Zod schemas

Consider documenting the migration plan for the deprecated fields to ensure smooth transitions.

packages/cli/src/actions/migrate.ts (1)

54-65: Consider using template literals for better readability

The command string construction is getting complex with multiple ternary operators.

-await execPackage(
-    `prisma migrate dev --schema "${prismaSchemaFile}" --skip-generate${options.name ? ` --name ${options.name}` : ''}${options.createOnly ? ' --create-only' : ''}`,
-    {
-        stdio: 'inherit',
-    },
-);
+const args = [
+    'prisma migrate dev',
+    `--schema "${prismaSchemaFile}"`,
+    '--skip-generate',
+    options.name && `--name ${options.name}`,
+    options.createOnly && '--create-only'
+].filter(Boolean).join(' ');
+
+await execPackage(args, {
+    stdio: 'inherit',
+});
tests/e2e/trigger.dev/schema.zmodel (1)

92-93: Consider using String or Bytes for encrypted data.

Storing encryptedToken as Json type is unusual. Encrypted data is typically binary or base64-encoded strings. Using Json might introduce unnecessary overhead or potential parsing issues.

Consider changing the type to String (for base64-encoded data) or Bytes (for raw binary data):

-  encryptedToken     Json
+  encryptedToken     String  // for base64-encoded encrypted data
tests/e2e/formbricks/schema.zmodel (3)

50-50: Consider using a junction table for webhook-survey relationships.

The surveyIds array field may have performance implications when querying webhooks by survey ID. For better query performance and flexibility, consider using a junction table pattern similar to TagsOnResponses.


119-119: Consider adding an index on the userId field.

If you frequently query contacts by their external userId, adding an index would improve query performance.

Add this index to the Contact model:

 @@index([environmentId])
+@@index([userId])

1-1102: Consider monitoring query performance for complex relationships.

This schema has many interconnected models with complex relationships. While the indexing strategy is good, consider:

  1. Monitoring slow queries in production
  2. Adding composite indexes for common query patterns
  3. Implementing query result caching for frequently accessed data
  4. Using database views for complex aggregations
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c225e62 and 26f050a.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (39)
  • .github/workflows/publish-release.yml (0 hunks)
  • package.json (1 hunks)
  • packages/cli/package.json (3 hunks)
  • packages/cli/src/actions/action-utils.ts (2 hunks)
  • packages/cli/src/actions/db.ts (1 hunks)
  • packages/cli/src/actions/generate.ts (2 hunks)
  • packages/cli/src/actions/migrate.ts (3 hunks)
  • packages/cli/src/index.ts (1 hunks)
  • packages/cli/test/db.test.ts (1 hunks)
  • packages/cli/test/generate.test.ts (1 hunks)
  • packages/cli/test/init.test.ts (1 hunks)
  • packages/cli/test/migrate.test.ts (1 hunks)
  • packages/cli/test/utils.ts (1 hunks)
  • packages/common-helpers/package.json (1 hunks)
  • packages/create-zenstack/package.json (1 hunks)
  • packages/eslint-config/package.json (1 hunks)
  • packages/ide/vscode/package.json (1 hunks)
  • packages/language/package.json (1 hunks)
  • packages/runtime/package.json (1 hunks)
  • packages/sdk/package.json (1 hunks)
  • packages/sdk/src/schema/schema.ts (1 hunks)
  • packages/sdk/src/ts-schema-generator.ts (8 hunks)
  • packages/tanstack-query/package.json (1 hunks)
  • packages/testtools/package.json (1 hunks)
  • packages/testtools/src/index.ts (1 hunks)
  • packages/testtools/src/project.ts (1 hunks)
  • packages/testtools/src/schema.ts (2 hunks)
  • packages/typescript-config/package.json (1 hunks)
  • packages/zod/package.json (1 hunks)
  • pnpm-workspace.yaml (1 hunks)
  • samples/blog/package.json (2 hunks)
  • tests/e2e/cal.com/cal-com.test.ts (1 hunks)
  • tests/e2e/cal.com/schema.zmodel (1 hunks)
  • tests/e2e/formbricks/formbricks.test.ts (1 hunks)
  • tests/e2e/formbricks/schema.zmodel (1 hunks)
  • tests/e2e/package.json (1 hunks)
  • tests/e2e/trigger.dev/schema.zmodel (1 hunks)
  • tests/e2e/trigger.dev/trigger-dev.test.ts (1 hunks)
  • tests/e2e/vitest.config.ts (1 hunks)
💤 Files with no reviewable changes (1)
  • .github/workflows/publish-release.yml
⏰ Context from checks skipped due to timeout of 90000ms (1)
  • GitHub Check: build-test (20.x)
🔇 Additional comments (44)
packages/eslint-config/package.json (1)

3-3: LGTM

Nothing else changed besides the version; no further action required.

packages/typescript-config/package.json (1)

3-3: LGTM

Version bump only; dependencies remain workspace-pinned. ✅

packages/runtime/package.json (1)

3-3: Core package version updated successfully

No problems spotted. Ensure downstream packages depending on @zenstackhq/runtime pick up this exact version during publish.

packages/zod/package.json (1)

3-3: Version bump is consistent with the coordinated α-7 release – LGTM
No further action required.

packages/language/package.json (1)

4-4: Aligned version increment – looks good
Matches the mono-repo release cadence.

packages/create-zenstack/package.json (1)

3-3: Release version updated correctly
Change is straightforward and consistent.

packages/ide/vscode/package.json (1)

4-4: Bump VS Code extension version above current Marketplace release

The version in packages/ide/vscode/package.json was changed from 3.0.3 → 3.0.0-alpha.7, which is a SemVer downgrade. vsce publish (and the Marketplace) will reject any version that is not higher than the last published one.

Please confirm the current release version for zenstack.zenstack in the VS Code Marketplace and update the version field to something greater (for example, bump the patch):

-    "version": "3.0.0-alpha.7",
+    "version": "3.0.4-alpha.7",

(or e.g. 3.1.0-alpha.7, etc.)

Ensure this is done before cutting the release to avoid Marketplace publication errors.

pnpm-workspace.yaml (1)

5-5: Ensure test projects are private to avoid accidental publishing

Including tests/** in the workspace is useful, but any package.json under tests/ should have "private": true to prevent pnpm publish/npm publish mishaps.

Double-check the test packages or add "publishConfig": { "access": "private" } if needed.

samples/blog/package.json (2)

3-3: LGTM! Version bump is consistent across the monorepo.

The version update from 3.0.0-alpha.6 to 3.0.0-alpha.7 aligns with the coordinated release across all ZenStack packages.


21-21: Catalog configuration for Prisma is correctly set up.

pnpm-workspace.yaml defines prisma: ^6.0.0 in the default catalog, and all package.json files (including samples/blog/package.json) reference "prisma": "catalog:" as intended.

  • Verified pnpm-workspace.yaml contains:
    catalog:
        prisma: ^6.0.0
    
  • Checked across the repo that "prisma": "catalog:" is used consistently.

No further action required.

tests/e2e/cal.com/schema.zmodel (1)

1-5: LGTM! PostgreSQL datasource properly configured.

The datasource configuration correctly sets up PostgreSQL with both standard and direct URLs, which is appropriate for connection pooling scenarios.

packages/sdk/package.json (1)

3-3: LGTM! Version bump aligned with monorepo release.

packages/common-helpers/package.json (1)

3-3: LGTM! Version bump aligned with monorepo release.

packages/testtools/package.json (1)

3-3: LGTM! Version bump aligned with monorepo release.

packages/testtools/src/index.ts (1)

1-1: LGTM! Clean API expansion.

The new export follows the established pattern and properly exposes the project module functionality to consumers of the testtools package.

packages/cli/package.json (3)

6-6: Version bump looks good.

The alpha version increment is appropriate for this feature release.


31-31: Dependency reordering is fine.

Moving @zenstackhq/common-helpers to the first position doesn't affect functionality.


46-52: New temporary file dependencies support CLI enhancements.

The addition of tmp and @types/tmp aligns with the temporary Prisma schema generation functionality mentioned in the PR objectives. The version ranges are appropriate.
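For context, typical tmp usage looks roughly like this (an illustrative sketch, not the CLI's actual code):

import tmp from 'tmp';
import fs from 'node:fs';
import path from 'node:path';

// Temporary directory that can be removed even when it still contains files.
const dir = tmp.dirSync({ unsafeCleanup: true });
fs.writeFileSync(path.join(dir.name, 'schema.prisma'), '// generated content');
// ... do work inside dir.name ...
dir.removeCallback(); // explicit cleanup when done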

packages/sdk/src/schema/schema.ts (1)

62-62: Array default values support looks good.

The addition of unknown[] to the default property type enables array default values as mentioned in the schema generation enhancements. While unknown[] is quite permissive, it provides the flexibility needed for varied default value scenarios.
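As a rough illustration, the widened field definition can be pictured like this (a simplified, hypothetical shape — the real FieldDef in the SDK has additional members):

type FieldDef = {
    type: string;
    array?: boolean;
    optional?: boolean;
    // literal defaults, array defaults, or structured defaults (e.g. { call: 'now', args: [] })
    default?: string | number | boolean | unknown[] | object;
};

// An array default is now representable:
const tags: FieldDef = { type: 'String', array: true, default: [] };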

tests/e2e/package.json (1)

1-11: Well-structured e2e test package configuration.

The package.json follows monorepo best practices with:

  • Appropriate private flag for test packages
  • Version consistency with the main project
  • Proper workspace dependency on testtools
  • Clean test script setup using vitest
tests/e2e/vitest.config.ts (1)

1-4: Clean Vitest configuration setup.

The configuration properly extends the base Vitest config using the mergeConfig pattern. The empty override configuration allows for future e2e-specific customizations while currently using the base setup.
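Such a config is typically just a merge of the shared base config with an empty override, roughly as follows (the base config path below is an assumption):

import { defineConfig, mergeConfig } from 'vitest/config';
import baseConfig from '../../vitest.base.config'; // assumed location of the shared base config

// Empty override for now; e2e-specific settings can be added here later.
export default mergeConfig(baseConfig, defineConfig({}));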

packages/cli/test/generate.test.ts (6)

6-10: Well-structured test model.

The simple User model with cuid() default is appropriate for testing purposes and provides sufficient complexity to validate schema generation functionality.


13-18: Solid baseline test case.

The test correctly verifies that the default behavior generates a TypeScript schema file while not generating a Prisma schema file unless explicitly requested.


20-24: Good coverage of custom output directory option.

The test properly validates that the --output option is respected and files are generated in the specified directory.


33-37: Correct validation of save prisma schema flag.

The test appropriately verifies that the --save-prisma-schema flag creates the expected Prisma schema file.


26-31: Confirm temp‐workspace cleanup in createProject

The createProject helper uses createTestProject (from @zenstackhq/testtools), but we couldn’t find any local cleanup logic. To prevent leftover temp directories after tests:

  • Verify that createTestProject automatically removes or reuses its workspace (e.g., via teardown hooks or on process exit).
  • If it doesn’t, consider adding explicit cleanup in your tests or enhancing the utility to delete the workspace after each run.

39-43: No workspace isolation issue with custom schema path

The --save-prisma-schema "../prisma/schema.prisma" value is resolved against the output directory (workDir/zenstack), producing workDir/prisma/schema.prisma. This path stays inside the test workspace, so there is no need for additional isolation safeguards or changes.

Likely an incorrect or invalid review comment.

packages/cli/test/db.test.ts (1)

6-18: Clean and focused test implementation.

The test effectively validates the core functionality of the db push command by checking for the expected database file creation. The structure is consistent with other CLI tests and uses the same reliable test utilities.

tests/e2e/cal.com/cal-com.test.ts (1)

6-12: Proper e2e test implementation.

The test correctly uses the 'cal-com' project identifier and follows the established pattern for e2e schema generation testing.

packages/cli/src/index.ts (1)

52-58: Well-implemented CLI option additions.

The new --silent and --save-prisma-schema options are properly structured (a rough sketch follows the list below):

  • The silent flag follows standard CLI conventions
  • The save-prisma-schema option correctly uses optional parameter syntax [path] to allow both flag and value usage
  • Both options have clear, descriptive help text
  • Implementation follows existing Commander.js patterns in the codebase
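A rough sketch of what such registrations look like with Commander.js (names and descriptions paraphrased, not copied from the PR):

import { Command } from 'commander';

const program = new Command();
program
    .command('generate')
    .option('--silent', 'suppress all output')
    // "[path]" makes the value optional: passing the flag alone falls back to a default location
    .option('--save-prisma-schema [path]', 'save the generated Prisma schema, optionally to a custom path')
    .action((options) => {
        // options.silent: true | undefined
        // options.savePrismaSchema: true | string | undefined
    });

program.parse();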
packages/cli/src/actions/action-utils.ts (1)

2-2: LGTM! Import reorganization looks clean.

The import statements are well-organized and the new imports are appropriately added.

Also applies to: 4-6

packages/cli/test/init.test.ts (1)

1-13: LGTM! Well-structured CLI test.

The test is well-written with proper setup using temporary directories and clear verification of expected behavior. The use of unsafeCleanup: true ensures proper cleanup after test execution.

packages/cli/test/migrate.test.ts (1)

1-41: LGTM! Comprehensive migrate command test coverage.

The test suite provides excellent coverage of the migrate command workflows:

  • migrate dev with database and migration directory creation
  • migrate reset with database persistence verification
  • migrate deploy with database recreation
  • migrate status command execution

The tests are well-structured with clear setup, execution, and verification phases.

packages/cli/test/utils.ts (1)

12-18: LGTM!

The createProject function implementation is clean and follows best practices with proper directory creation and flexible schema content handling.

packages/cli/src/actions/generate.ts (1)

32-40: Well-implemented Prisma schema saving logic!

The implementation correctly handles both boolean and string types for the savePrismaSchema option, with proper path resolution and directory creation.
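The boolean-or-string handling described above could look roughly like this (a sketch under assumed names, not the PR's actual code):

import fs from 'node:fs';
import path from 'node:path';

// `savePrismaSchema` is `true` when the flag is given without a value,
// or a string when an explicit path is provided.
function resolvePrismaSchemaOutput(savePrismaSchema: true | string, outputDir: string): string {
    const target =
        typeof savePrismaSchema === 'string'
            ? path.resolve(outputDir, savePrismaSchema)
            : path.join(outputDir, 'schema.prisma'); // assumed default location
    fs.mkdirSync(path.dirname(target), { recursive: true });
    return target;
}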

packages/testtools/src/schema.ts (1)

37-38: Good refactoring to centralize test project setup!

Replacing manual temp directory setup with createTestProject() improves maintainability and reduces code duplication.

packages/cli/src/actions/db.ts (1)

22-44: Excellent implementation of temporary schema handling!

The refactored runPush function properly handles temporary Prisma schema generation with appropriate error handling and cleanup in the finally block.

packages/sdk/src/ts-schema-generator.ts (3)

287-287: Good refactoring for consistency.

Using the dedicated generateFieldTypeLiteral helper method improves code maintainability and ensures consistent field type literal generation across the codebase.


633-661: Well-implemented deduplication logic.

The addition of seenKeys Set effectively prevents duplicate entries when the same field combination appears in both @@id and @@unique attributes. This is a good defensive programming practice.
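The deduplication pattern being praised is essentially the following (an illustrative sketch, not the generator's exact code):

// Skip field combinations that were already emitted, e.g. when the same
// fields appear in both @@id and @@unique.
function dedupeKeyCombinations(combinations: string[][]): string[][] {
    const seenKeys = new Set<string>();
    const result: string[][] = [];
    for (const fields of combinations) {
        const key = fields.join(',');
        if (seenKeys.has(key)) {
            continue;
        }
        seenKeys.add(key);
        result.push(fields);
    }
    return result;
}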


688-700: Clean implementation of field type literal generation.

The generateFieldTypeLiteral method effectively centralizes the logic for generating field type literals with proper validation and fallback handling.

tests/e2e/formbricks/schema.zmodel (4)

711-735: Good security implementation for API keys.

The API key implementation follows security best practices with hashed keys and granular permission management through the ApiKeyEnvironment junction table.


886-886: Verify if unique constraint on url aligns with requirements.

The unique constraint on url prevents multiple short URLs from pointing to the same destination. This might limit use cases where different short URLs are needed for the same destination (e.g., for campaign tracking).


978-978: Appropriate use of pgvector for embeddings.

The implementation correctly uses Prisma's Unsupported type for pgvector integration. The 512-dimension vectors are suitable for most embedding models.

Also applies to: 1026-1026


1033-1101: Well-designed team-based access control.

The team permission structure follows RBAC best practices with clear separation of concerns between team membership (TeamUser) and project access (ProjectTeam). The composite primary keys and cascading deletes are properly configured.

Comment on lines +6 to +12
describe('Formbricks e2e tests', () => {
    it('has a working schema', async () => {
        await expect(
            generateTsSchema(fs.readFileSync(path.join(__dirname, 'schema.zmodel'), 'utf8'), 'postgresql', 'cal-com'),
        ).resolves.toBeTruthy();
    });
});

💡 Verification agent

🧩 Analysis chain

Verify project identifier consistency.

The test is for Formbricks but uses 'cal-com' as the project identifier in the generateTsSchema call. This seems inconsistent and might be a copy-paste error from the Cal.com test.


🏁 Script executed:

#!/bin/bash
# Description: Check if 'cal-com' project identifier is intentional for Formbricks test
# Expected: Verify if this is correct or should be 'formbricks'

rg -A 5 -B 5 "generateTsSchema.*cal-com" tests/e2e/

Length of output: 2027


Replace project identifier in Formbricks e2e test

The Formbricks schema test currently passes 'cal-com' to generateTsSchema, which is a copy-paste from the Cal.com test. It should use the Formbricks identifier to match its own schema.

• tests/e2e/formbricks/formbricks.test.ts, around line 6
Change the third argument of generateTsSchema from 'cal-com' to 'formbricks'.

Suggested diff:

- await expect(
-   generateTsSchema(
-     fs.readFileSync(path.join(__dirname, 'schema.zmodel'), 'utf8'),
-     'postgresql',
-     'cal-com',
-   ),
- ).resolves.toBeTruthy();
+ await expect(
+   generateTsSchema(
+     fs.readFileSync(path.join(__dirname, 'schema.zmodel'), 'utf8'),
+     'postgresql',
+     'formbricks',
+   ),
+ ).resolves.toBeTruthy();

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In tests/e2e/formbricks/formbricks.test.ts around lines 6 to 12, the third
argument passed to generateTsSchema is incorrectly set to 'cal-com', which is
from a different project. Change this argument to 'formbricks' to correctly
match the Formbricks schema and ensure the test validates the right project
identifier.

Comment on lines +47 to +53
export async function generateTempPrismaSchema(zmodelPath: string) {
    const model = await loadSchemaDocument(zmodelPath);
    const prismaSchema = await new PrismaSchemaGenerator(model).generate();
    const prismaSchemaFile = path.resolve(path.dirname(zmodelPath), '~schema.prisma');
    fs.writeFileSync(prismaSchemaFile, prismaSchema);
    return prismaSchemaFile;
}

⚠️ Potential issue

Address potential race conditions and improve error handling.

The current implementation has several concerns:

  1. Race condition risk: Using a fixed filename ~schema.prisma could cause conflicts if multiple processes run simultaneously in the same directory.
  2. Synchronous file operations: Using fs.writeFileSync in an async function context is inconsistent and could block the event loop.
  3. Missing error handling: No error handling for file write operations.

Consider this improved implementation:

 export async function generateTempPrismaSchema(zmodelPath: string) {
     const model = await loadSchemaDocument(zmodelPath);
     const prismaSchema = await new PrismaSchemaGenerator(model).generate();
-    const prismaSchemaFile = path.resolve(path.dirname(zmodelPath), '~schema.prisma');
-    fs.writeFileSync(prismaSchemaFile, prismaSchema);
+    const prismaSchemaFile = path.resolve(path.dirname(zmodelPath), `~schema-${Date.now()}-${Math.random().toString(36).substr(2, 9)}.prisma`);
+    await fs.promises.writeFile(prismaSchemaFile, prismaSchema);
     return prismaSchemaFile;
 }

This addresses the race condition with a unique filename and uses async file operations for consistency.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
export async function generateTempPrismaSchema(zmodelPath: string) {
    const model = await loadSchemaDocument(zmodelPath);
    const prismaSchema = await new PrismaSchemaGenerator(model).generate();
    const prismaSchemaFile = path.resolve(path.dirname(zmodelPath), '~schema.prisma');
    fs.writeFileSync(prismaSchemaFile, prismaSchema);
    return prismaSchemaFile;
}
export async function generateTempPrismaSchema(zmodelPath: string) {
    const model = await loadSchemaDocument(zmodelPath);
    const prismaSchema = await new PrismaSchemaGenerator(model).generate();
    const prismaSchemaFile = path.resolve(
        path.dirname(zmodelPath),
        `~schema-${Date.now()}-${Math.random().toString(36).substr(2, 9)}.prisma`
    );
    await fs.promises.writeFile(prismaSchemaFile, prismaSchema);
    return prismaSchemaFile;
}
🤖 Prompt for AI Agents
In packages/cli/src/actions/action-utils.ts around lines 47 to 53, the function
generateTempPrismaSchema uses a fixed filename '~schema.prisma' which can cause
race conditions if multiple processes run concurrently in the same directory. It
also uses synchronous file writing which blocks the event loop and lacks error
handling for file operations. To fix this, generate a unique temporary filename
for each call to avoid conflicts, replace fs.writeFileSync with the asynchronous
fs.promises.writeFile to maintain async consistency, and wrap the file writing
in a try-catch block to handle and propagate any errors properly.

Comment on lines +6 to +12
describe('Trigger.dev e2e tests', () => {
    it('has a working schema', async () => {
        await expect(
            generateTsSchema(fs.readFileSync(path.join(__dirname, 'schema.zmodel'), 'utf8'), 'postgresql', 'cal-com'),
        ).resolves.toBeTruthy();
    });
});

⚠️ Potential issue

Fix inconsistent project identifier.

The test is for Trigger.dev but uses 'cal-com' as the project identifier in the generateTsSchema call. This inconsistency could cause confusion about what's actually being tested.

Apply this fix to use the correct project identifier:

 describe('Trigger.dev e2e tests', () => {
     it('has a working schema', async () => {
         await expect(
-            generateTsSchema(fs.readFileSync(path.join(__dirname, 'schema.zmodel'), 'utf8'), 'postgresql', 'cal-com'),
+            generateTsSchema(fs.readFileSync(path.join(__dirname, 'schema.zmodel'), 'utf8'), 'postgresql', 'trigger-dev'),
         ).resolves.toBeTruthy();
     });
 });
📝 Committable suggestion


Suggested change
describe('Trigger.dev e2e tests', () => {
    it('has a working schema', async () => {
        await expect(
            generateTsSchema(fs.readFileSync(path.join(__dirname, 'schema.zmodel'), 'utf8'), 'postgresql', 'cal-com'),
        ).resolves.toBeTruthy();
    });
});
describe('Trigger.dev e2e tests', () => {
    it('has a working schema', async () => {
        await expect(
            generateTsSchema(
                fs.readFileSync(path.join(__dirname, 'schema.zmodel'), 'utf8'),
                'postgresql',
                'trigger-dev'
            ),
        ).resolves.toBeTruthy();
    });
});
🤖 Prompt for AI Agents
In tests/e2e/trigger.dev/trigger-dev.test.ts around lines 6 to 12, the project
identifier used in the generateTsSchema call is 'cal-com', which is inconsistent
with the Trigger.dev test context. Replace 'cal-com' with the correct project
identifier that matches Trigger.dev to maintain consistency and clarity in the
test.

Comment on lines +5 to +67
export function createTestProject() {
    const { name: workDir } = tmp.dirSync({ unsafeCleanup: true });

    fs.mkdirSync(path.join(workDir, 'node_modules'));

    // symlink all entries from "node_modules"
    const nodeModules = fs.readdirSync(path.join(__dirname, '../node_modules'));
    for (const entry of nodeModules) {
        if (entry.startsWith('@zenstackhq')) {
            continue;
        }
        fs.symlinkSync(
            path.join(__dirname, '../node_modules', entry),
            path.join(workDir, 'node_modules', entry),
            'dir',
        );
    }

    // in addition, symlink zenstack packages
    const zenstackPackages = ['language', 'sdk', 'runtime', 'cli'];
    fs.mkdirSync(path.join(workDir, 'node_modules/@zenstackhq'));
    for (const pkg of zenstackPackages) {
        fs.symlinkSync(
            path.join(__dirname, `../../${pkg}`),
            path.join(workDir, `node_modules/@zenstackhq/${pkg}`),
            'dir',
        );
    }

    fs.writeFileSync(
        path.join(workDir, 'package.json'),
        JSON.stringify(
            {
                name: 'test',
                version: '1.0.0',
                type: 'module',
            },
            null,
            4,
        ),
    );

    fs.writeFileSync(
        path.join(workDir, 'tsconfig.json'),
        JSON.stringify(
            {
                compilerOptions: {
                    module: 'ESNext',
                    target: 'ESNext',
                    moduleResolution: 'Bundler',
                    esModuleInterop: true,
                    skipLibCheck: true,
                    strict: true,
                },
                include: ['**/*.ts'],
            },
            null,
            4,
        ),
    );

    return workDir;
}

🛠️ Refactor suggestion

Improve error handling and path robustness.

The function is well-structured but could be more robust:

  1. Missing error handling: Symlink operations can fail and should be handled gracefully.
  2. Brittle relative paths: The hardcoded relative paths (../../${pkg}) assume a specific project structure that could change.

Consider adding error handling and more robust path resolution:

+import { fileURLToPath } from 'node:url';

 export function createTestProject() {
     const { name: workDir } = tmp.dirSync({ unsafeCleanup: true });

     fs.mkdirSync(path.join(workDir, 'node_modules'));

+    const currentDir = path.dirname(fileURLToPath(import.meta.url));
+    const nodeModulesPath = path.join(currentDir, '../node_modules');
     
     // symlink all entries from "node_modules"
-    const nodeModules = fs.readdirSync(path.join(__dirname, '../node_modules'));
+    if (!fs.existsSync(nodeModulesPath)) {
+        throw new Error(`node_modules not found at ${nodeModulesPath}`);
+    }
+    const nodeModules = fs.readdirSync(nodeModulesPath);
     
     for (const entry of nodeModules) {
         if (entry.startsWith('@zenstackhq')) {
             continue;
         }
-        fs.symlinkSync(
-            path.join(__dirname, '../node_modules', entry),
-            path.join(workDir, 'node_modules', entry),
-            'dir',
-        );
+        try {
+            fs.symlinkSync(
+                path.join(nodeModulesPath, entry),
+                path.join(workDir, 'node_modules', entry),
+                'dir',
+            );
+        } catch (error) {
+            console.warn(`Failed to symlink ${entry}:`, error);
+        }
     }

This makes the code more resilient to project structure changes and handles symlink failures gracefully.

🤖 Prompt for AI Agents
In packages/testtools/src/project.ts around lines 5 to 67, improve robustness by
adding try-catch blocks around all fs.symlinkSync calls to handle potential
errors gracefully without crashing. Replace hardcoded relative paths like
../../${pkg} with path.resolve or path.join using __dirname to dynamically
compute absolute paths, ensuring the code adapts to project structure changes.
This will make symlink creation safer and path resolution more reliable.

Comment on lines +20 to +23
export function runCli(command: string, cwd: string) {
    const cli = path.join(__dirname, '../dist/index.js');
    execSync(`node ${cli} ${command}`, { cwd });
}

⚠️ Potential issue

Potential command injection vulnerability

The command parameter is directly interpolated into the shell command without escaping, which could lead to command injection if untrusted input is passed.

Consider using execFileSync with arguments array instead:

-export function runCli(command: string, cwd: string) {
-    const cli = path.join(__dirname, '../dist/index.js');
-    execSync(`node ${cli} ${command}`, { cwd });
-}
+export function runCli(command: string, cwd: string) {
+    const cli = path.join(__dirname, '../dist/index.js');
+    const args = command.split(' ');
+    execFileSync('node', [cli, ...args], { cwd });
+}

Or use a more robust command parsing approach if complex arguments are needed.

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In packages/cli/test/utils.ts around lines 20 to 23, the runCli function uses
execSync with a command string that directly interpolates the command parameter,
risking command injection. To fix this, replace execSync with execFileSync and
pass the CLI path as the first argument and the command split into an arguments
array as the second argument, avoiding shell interpolation. This change ensures
safer execution by properly escaping arguments and preventing injection
vulnerabilities.

Comment on lines +369 to 383
                if (Array.isArray(defaultValue)) {
                    objectFields.push(
                        ts.factory.createPropertyAssignment(
                            'default',
                            ts.factory.createArrayLiteralExpression(
                                defaultValue.map((item) => this.createLiteralNode(item as any)),
                            ),
                        ),
                    );
                } else {
                    objectFields.push(
                        ts.factory.createPropertyAssignment('default', this.createLiteralNode(defaultValue)),
                    );
                }
            }

🛠️ Refactor suggestion

⚠️ Potential issue

Avoid casting to any - preserve type safety.

The cast to any on line 374 undermines TypeScript's type safety. Since defaultValue can be unknown[], consider preserving the type information.

                if (Array.isArray(defaultValue)) {
                    objectFields.push(
                        ts.factory.createPropertyAssignment(
                            'default',
                            ts.factory.createArrayLiteralExpression(
-                               defaultValue.map((item) => this.createLiteralNode(item as any)),
+                               defaultValue.map((item) => this.createLiteralNode(item)),
                            ),
                        ),
                    );
📝 Committable suggestion


Suggested change
                if (Array.isArray(defaultValue)) {
                    objectFields.push(
                        ts.factory.createPropertyAssignment(
                            'default',
                            ts.factory.createArrayLiteralExpression(
                                defaultValue.map((item) => this.createLiteralNode(item as any)),
                            ),
                        ),
                    );
                } else {
                    objectFields.push(
                        ts.factory.createPropertyAssignment('default', this.createLiteralNode(defaultValue)),
                    );
                }
            }
                if (Array.isArray(defaultValue)) {
                    objectFields.push(
                        ts.factory.createPropertyAssignment(
                            'default',
                            ts.factory.createArrayLiteralExpression(
                                defaultValue.map((item) => this.createLiteralNode(item)),
                            ),
                        ),
                    );
                } else {
                    objectFields.push(
                        ts.factory.createPropertyAssignment('default', this.createLiteralNode(defaultValue)),
                    );
                }
🤖 Prompt for AI Agents
In packages/sdk/src/ts-schema-generator.ts around lines 369 to 383, the code
casts items in defaultValue to any, which breaks type safety. To fix this,
remove the any cast and ensure the createLiteralNode method accepts the correct
type by properly typing defaultValue as unknown[] and passing items without
casting. Adjust the method signature or add type guards if necessary to maintain
type safety while handling unknown array elements.

Comment on lines +450 to 477
    private getMappedValue(
        expr: Expression,
        fieldType: DataModelFieldType,
    ): string | number | boolean | unknown[] | { call: string; args: any[] } | { authMember: string[] } | undefined {
        if (isLiteralExpr(expr)) {
            const lit = (expr as LiteralExpr).value;
            return fieldType.type === 'Boolean'
                ? (lit as boolean)
-               : ['Int', 'Float', 'Decimal', 'BigInt'].includes(field.type.type!)
+               : ['Int', 'Float', 'Decimal', 'BigInt'].includes(fieldType.type!)
                  ? Number(lit)
                  : lit;
-       } else if (isReferenceExpr(defaultValue) && isEnumField(defaultValue.target.ref)) {
-           return defaultValue.target.ref.name;
-       } else if (isInvocationExpr(defaultValue)) {
+       } else if (isArrayExpr(expr)) {
+           return expr.items.map((item) => this.getMappedValue(item, fieldType));
+       } else if (isReferenceExpr(expr) && isEnumField(expr.target.ref)) {
+           return expr.target.ref.name;
+       } else if (isInvocationExpr(expr)) {
            return {
-               call: defaultValue.function.$refText,
-               args: defaultValue.args.map((arg) => this.getLiteral(arg.value)),
+               call: expr.function.$refText,
+               args: expr.args.map((arg) => this.getLiteral(arg.value)),
            };
-       } else if (this.isAuthMemberAccess(defaultValue)) {
+       } else if (this.isAuthMemberAccess(expr)) {
            return {
-               authMember: this.getMemberAccessChain(defaultValue),
+               authMember: this.getMemberAccessChain(expr),
            };
        } else {
-           throw new Error(`Unsupported default value type for field ${field.name}`);
+           throw new Error(`Unsupported default value type for ${expr.$type}`);
        }
    }

🛠️ Refactor suggestion

Enhance error handling and validation.

Two improvements to consider:

  1. Validate that expr.function.ref exists before using $refText
  2. Make the error message more descriptive
        } else if (isInvocationExpr(expr)) {
+           if (!expr.function.ref) {
+               throw new Error(`Invalid invocation expression: missing function reference`);
+           }
            return {
                call: expr.function.$refText,
                args: expr.args.map((arg) => this.getLiteral(arg.value)),
            };
        } else if (this.isAuthMemberAccess(expr)) {
            return {
                authMember: this.getMemberAccessChain(expr),
            };
        } else {
-           throw new Error(`Unsupported default value type for ${expr.$type}`);
+           throw new Error(`Unsupported default value expression type: ${expr.$type} at field ${fieldType.type}`);
        }
📝 Committable suggestion


Suggested change
    private getMappedValue(
        expr: Expression,
        fieldType: DataModelFieldType,
    ): string | number | boolean | unknown[] | { call: string; args: any[] } | { authMember: string[] } | undefined {
        if (isLiteralExpr(expr)) {
            const lit = (expr as LiteralExpr).value;
            return fieldType.type === 'Boolean'
                ? (lit as boolean)
                : ['Int', 'Float', 'Decimal', 'BigInt'].includes(fieldType.type!)
                  ? Number(lit)
                  : lit;
        } else if (isArrayExpr(expr)) {
            return expr.items.map((item) => this.getMappedValue(item, fieldType));
        } else if (isReferenceExpr(expr) && isEnumField(expr.target.ref)) {
            return expr.target.ref.name;
        } else if (isInvocationExpr(expr)) {
            return {
                call: expr.function.$refText,
                args: expr.args.map((arg) => this.getLiteral(arg.value)),
            };
        } else if (this.isAuthMemberAccess(expr)) {
            return {
                authMember: this.getMemberAccessChain(expr),
            };
        } else {
            throw new Error(`Unsupported default value type for ${expr.$type}`);
        }
    }
private getMappedValue(
expr: Expression,
fieldType: DataModelFieldType,
): string | number | boolean | unknown[] | { call: string; args: any[] } | { authMember: string[] } | undefined {
if (isLiteralExpr(expr)) {
const lit = (expr as LiteralExpr).value;
return fieldType.type === 'Boolean'
? (lit as boolean)
: ['Int', 'Float', 'Decimal', 'BigInt'].includes(fieldType.type!)
? Number(lit)
: lit;
} else if (isArrayExpr(expr)) {
return expr.items.map((item) => this.getMappedValue(item, fieldType));
} else if (isReferenceExpr(expr) && isEnumField(expr.target.ref)) {
return expr.target.ref.name;
} else if (isInvocationExpr(expr)) {
if (!expr.function.ref) {
throw new Error(`Invalid invocation expression: missing function reference`);
}
return {
call: expr.function.$refText,
args: expr.args.map((arg) => this.getLiteral(arg.value)),
};
} else if (this.isAuthMemberAccess(expr)) {
return {
authMember: this.getMemberAccessChain(expr),
};
} else {
throw new Error(
`Unsupported default value expression type: ${expr.$type} at field ${fieldType.type}`
);
}
}
🤖 Prompt for AI Agents
In packages/sdk/src/ts-schema-generator.ts between lines 450 and 477, enhance
error handling in the getMappedValue method by first checking if
expr.function.ref exists before accessing expr.function.$refText to avoid
potential runtime errors. Additionally, improve the thrown error message in the
else block to be more descriptive by including details about the unsupported
expression type and context to aid debugging.

Comment on lines +7 to +11
generator client {
  provider        = "prisma-client-js"
  binaryTargets   = ["native", "debian-openssl-1.1.x"]
  previewFeatures = ["tracing", "metrics"]
}

💡 Verification agent

🧩 Analysis chain

Verify the OpenSSL version compatibility.

The binary target debian-openssl-1.1.x refers to an older OpenSSL version. Modern Debian systems typically use OpenSSL 3.x. This could cause compatibility issues or security vulnerabilities.


🌐 Web query:

What is the latest recommended Prisma binary target for Debian systems with current OpenSSL versions?

💡 Result:

To ensure compatibility between Prisma and your Debian system's OpenSSL version, it's essential to specify the correct binary target in your schema.prisma file. Prisma provides binary targets tailored for different OpenSSL versions on Debian systems.

Identifying Your OpenSSL Version:

First, determine the OpenSSL version installed on your Debian system by running:

openssl version

This command will output the OpenSSL version, such as OpenSSL 3.0.10 1 Aug 2023.

Configuring Prisma with the Appropriate Binary Target:

Based on your OpenSSL version, update the binaryTargets in your schema.prisma file as follows:

  • For OpenSSL 3.0.x:
  generator client {
    provider      = "prisma-client-js"
    binaryTargets = ["native", "debian-openssl-3.0.x"]
  }
  • For OpenSSL 3.2.x:
  generator client {
    provider      = "prisma-client-js"
    binaryTargets = ["native", "debian-openssl-3.2.x"]
  }

After updating the schema.prisma file, regenerate the Prisma Client by running:

npx prisma generate

This configuration ensures that Prisma uses the correct query engine binary compatible with your system's OpenSSL version.

Additional Considerations:

  • Prisma Version: Ensure you're using a Prisma version that supports your OpenSSL version. For instance, Prisma added support for OpenSSL 3.0.x in version 3.13.0. (stackoverflow.com)

  • System Compatibility: If you encounter issues related to OpenSSL versions, consider updating your Prisma and OpenSSL packages to the latest stable releases.

By configuring Prisma with the appropriate binary target matching your Debian system's OpenSSL version, you can achieve optimal compatibility and performance.


Update Prisma binaryTargets to match your Debian system’s OpenSSL 3.x

Your schema currently pins an older OpenSSL 1.1 binary target, which can lead to compatibility or security issues on modern Debian releases. Please:

  • Check your host’s OpenSSL version:

    openssl version
  • In tests/e2e/trigger.dev/schema.zmodel, replace debian-openssl-1.1.x with the matching 3.x target:

    generator client {
        provider        = "prisma-client-js"
    -   binaryTargets   = ["native", "debian-openssl-1.1.x"]
    +   binaryTargets   = [
    +     "native",
    +     "debian-openssl-3.0.x"    // use "debian-openssl-3.2.x" if your system reports OpenSSL 3.2.x
    +   ]
        previewFeatures = ["tracing", "metrics"]
    }
  • Run npx prisma generate to regenerate the client.

  • Ensure you’re on Prisma ≥ 3.13.0 to support OpenSSL 3.x.

🤖 Prompt for AI Agents
In tests/e2e/trigger.dev/schema.zmodel around lines 7 to 11, the Prisma
generator's binaryTargets includes "debian-openssl-1.1.x", which is outdated for
modern Debian systems using OpenSSL 3.x. Check your system's OpenSSL version
with "openssl version" and update the binaryTargets to the appropriate OpenSSL
3.x target (e.g., "debian-openssl-3.0.x"). After updating, run "npx prisma
generate" to regenerate the client and ensure your Prisma version is at least
3.13.0 to support OpenSSL 3.x.

Comment on lines +365 to +416
welcomeCard Json @default("{\"enabled\": false}")
/// [SurveyQuestions]
questions Json @default("[]")
/// [SurveyEnding]
endings Json[] @default([])
thankYouCard Json? //deprecated
/// [SurveyHiddenFields]
hiddenFields Json @default("{\"enabled\": false}")
/// [SurveyVariables]
variables Json @default("[]")
responses Response[]
displayOption displayOptions @default(displayOnce)
recontactDays Int?
displayLimit Int?
triggers SurveyTrigger[]
/// [SurveyInlineTriggers]
inlineTriggers Json?
attributeFilters SurveyAttributeFilter[]
displays Display[]
autoClose Int?
autoComplete Int?
delay Int @default(0)
runOnDate DateTime?
closeOnDate DateTime?
/// [SurveyClosedMessage]
surveyClosedMessage Json?
segmentId String?
segment Segment? @relation(fields: [segmentId], references: [id])

/// [SurveyProjectOverwrites]
projectOverwrites Json?

/// [SurveyStyling]
styling Json?

/// [SurveySingleUse]
singleUse Json? @default("{\"enabled\": false, \"isEncrypted\": true}")

/// [SurveyVerifyEmail]
verifyEmail Json? // deprecated
isVerifyEmailEnabled Boolean @default(false)
isSingleResponsePerEmailEnabled Boolean @default(false)
isBackButtonHidden Boolean @default(false)
pin String?
resultShareKey String? @unique
displayPercentage Decimal?
languages SurveyLanguage[]
showLanguageSwitch Boolean?
documents Document[]
followUps SurveyFollowUp[]
/// [SurveyRecaptcha]
recaptcha Json? @default("{\"enabled\": false, \"threshold\":0.1}")

🛠️ Refactor suggestion

Consider implementing a deprecation strategy for legacy fields.

The schema contains deprecated fields (thankYouCard, verifyEmail) that should be removed. Additionally, the extensive use of Json fields bypasses Prisma's type safety. Consider:

  1. Creating a migration plan to remove deprecated fields
  2. Using Prisma's type-safe approach with proper relations instead of JSON where feasible
  3. If JSON is necessary, ensure strict validation at the application layer
🤖 Prompt for AI Agents
In tests/e2e/formbricks/schema.zmodel between lines 365 and 416, deprecated
fields thankYouCard and verifyEmail should be removed following a clear
migration plan to avoid breaking changes. Replace Json fields with proper Prisma
relations or typed models wherever possible to leverage type safety. For Json
fields that must remain, implement strict validation in the application code to
ensure data integrity. This approach improves maintainability and reduces
runtime errors.


generator client {
provider = "prisma-client-js"
previewFeatures = ["postgresqlExtensions"]

💡 Verification agent

🧩 Analysis chain

Consider the stability implications of using preview features.

The postgresqlExtensions preview feature may have breaking changes in future Prisma versions. Ensure your team is prepared to handle potential migration issues when upgrading Prisma.


🌐 Web query:

What is the current stability status of the postgresqlExtensions preview feature in Prisma?

💡 Result:

As of June 2025, the postgresqlExtensions preview feature in Prisma has been discontinued. This decision was made due to its complexity and limited value. Instead of this feature, Prisma plans to support specific PostgreSQL extensions, such as PostGIS, through focused, independent efforts. (github.com)

If you're currently using the postgresqlExtensions feature, it's recommended to transition to alternative methods for managing PostgreSQL extensions in your Prisma schema. Prisma's documentation provides guidance on how to represent PostgreSQL extensions in your schema and manage them using Prisma Migrate. (prismagraphql.com)

For more details on the discontinuation and Prisma's roadmap for PostgreSQL extensions, you can refer to the official announcement. (github.com)



Replace discontinued postgresqlExtensions preview feature

The postgresqlExtensions preview feature was officially discontinued in June 2025 and will no longer be supported. Please remove this setting and configure any required PostgreSQL extensions (e.g. pgvector, PostGIS) using Prisma Migrate and the documented approach:

• Location: tests/e2e/formbricks/schema.zmodel, line 9
• Change:

generator client {
  provider        = "prisma-client-js"
- previewFeatures = ["postgresqlExtensions"]
+ // removed deprecated preview feature; configure extensions per https://www.prismagraphql.com/docs/orm/prisma-schema/postgresql-extensions
}
📝 Committable suggestion


Suggested change
previewFeatures = ["postgresqlExtensions"]
generator client {
provider = "prisma-client-js"
- previewFeatures = ["postgresqlExtensions"]
+ // removed deprecated preview feature; configure extensions per https://www.prismagraphql.com/docs/orm/prisma-schema/postgresql-extensions
}
🤖 Prompt for AI Agents
In tests/e2e/formbricks/schema.zmodel at line 9, remove the deprecated
previewFeatures setting "postgresqlExtensions" entirely. Instead, configure any
needed PostgreSQL extensions like pgvector or PostGIS through Prisma Migrate
following the official documentation. This means deleting the line
previewFeatures = ["postgresqlExtensions"] and ensuring extension setup is
handled outside this previewFeatures array.
