---
## Databricks Apps Development

### Validation
⚠️ Always validate before deploying:
  invoke_databricks_cli 'experimental aitools tools validate ./'

This is battle-tested to catch common issues before deployment. Prefer it over manual checks (e.g. `npm run lint`), as it covers more ground specific to Databricks Apps.

### Deployment
⚠️ USER CONSENT REQUIRED: Only deploy with explicit user permission.
  invoke_databricks_cli 'experimental aitools tools deploy'

### View and Manage
  invoke_databricks_cli 'bundle summary'

### View App Logs
To troubleshoot deployed apps, view their logs:
  invoke_databricks_cli 'apps logs <app-name> --tail-lines 100'

### Local Development vs Deployed Apps

During development:
- Start the template-specific dev server (see the project's CLAUDE.md for the command and port)
- Use the localhost URL shown when the dev server starts

After deployment:
- Get the URL from: invoke_databricks_cli 'bundle summary'

Decision tree:
- "open the app" + not deployed → localhost
- "open the app" + deployed → ask which environment
- "localhost"/"local" → always localhost

## Skills

You have access to modular Skills for domain-specific expertise.

### Skill Selection & Loading
* When a user request matches a skill's scope description, select that Skill
* Load skills using the MCP tool: `read_skill_file(file_path: "category/skill-name/SKILL.md")`
* Example: `read_skill_file(file_path: "pipelines/materialized-view/SKILL.md")`
* Skills may contain links to sub-sections (e.g., "category/skill-name/file.md")
* If no Skill is suitable, continue with your base capabilities
* Never mention or reference skills to the user; only use them internally

### Skill Registry (names + brief descriptors)

**Note**: The following skills are for other resource types and may not be directly relevant to this project.

* **pipelines/auto-cdc/SKILL.md**: Apply Change Data Capture (CDC) with the apply_changes API in Spark Declarative Pipelines. Use when the user needs to process CDC feeds from databases, handle upserts/deletes, maintain slowly changing dimensions (SCD Type 1 and Type 2), synchronize data from operational databases, or process merge operations.

=== CLAUDE.md ===
TypeScript full-stack template powered by **Databricks AppKit** with tRPC for additional custom API endpoints.

- server/: Node.js backend with App Kit and tRPC
- client/: React frontend with App Kit hooks and tRPC client
- config/queries/: SQL query files for analytics
- shared/: Shared TypeScript types
- docs/: Detailed documentation on using App Kit features

## Quick Start: Your First Query & Chart

Follow these 3 steps to add data visualization to your app:

**Step 1: Create a SQL query file**

```sql
-- config/queries/my_data.sql
SELECT category, COUNT(*) AS count, AVG(value) AS avg_value
FROM my_table
GROUP BY category
```

**Step 2: Define the schema**

```typescript
// config/queries/schema.ts
import { z } from 'zod';

export const querySchemas = {
  my_data: z.array(
    z.object({
      category: z.string(),
      count: z.number(),
      avg_value: z.number(),
    })
  ),
};
```

**Step 3: Add visualization to your app**

```typescript
// client/src/App.tsx
import { BarChart } from '@databricks/appkit-ui/react';

export default function App() {
  // Render the chart anywhere in your component tree
  return <BarChart queryKey="my_data" parameters={{}} />;
}
```

**That's it!** The component handles data fetching, loading states, and rendering automatically.
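
If you need the raw rows instead of a prebuilt chart, the App Kit hooks covered in docs/appkit-sdk.md (e.g. `useAnalyticsQuery`) can run the same query. The sketch below is illustrative only: the import path, the `(queryKey, parameters)` call shape, and the `{ data, isLoading }` return shape are assumptions, so check that doc for the real signature.

```typescript
// client/src/components/CategoryList.tsx (illustrative sketch; the hook's
// import path and signature are assumptions -- see docs/appkit-sdk.md)
import { useAnalyticsQuery } from '@databricks/appkit-ui/react';

export function CategoryList() {
  // Assumed shape: the hook returns rows for the "my_data" query plus a loading flag
  const { data, isLoading } = useAnalyticsQuery('my_data', {});

  if (isLoading) return <p>Loading...</p>;

  return (
    <ul>
      {data?.map((row) => (
        <li key={row.category}>
          {row.category}: {row.count} rows (avg {row.avg_value.toFixed(2)})
        </li>
      ))}
    </ul>
  );
}
```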

**To refresh TypeScript types after adding queries:**
- Run `npm run typegen` or `npm run dev`; both auto-generate type definitions in `client/src/appKitTypes.d.ts`
- DO NOT manually edit `appKitTypes.d.ts` (if you need row types in your own code, see the zod-based sketch below)
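
Separately from the generated file, you can derive row types straight from the zod schemas defined in Step 2 when you need them in your own code. This is plain zod inference, not an App Kit API; the relative import path below is an assumption and depends on where the snippet lives.

```typescript
// Plain zod type inference, independent of the generated appKitTypes.d.ts.
// The import path is an assumption; adjust it to your file's location.
import { z } from 'zod';
import { querySchemas } from '../../config/queries/schema';

// Inferred as: { category: string; count: number; avg_value: number }
type MyDataRow = z.infer<typeof querySchemas.my_data>[number];
```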

## Installation

**IMPORTANT**: When running `npm install`, always use `required_permissions: ['all']` to avoid sandbox permission errors.

## NPM Scripts

### Development
- `npm run dev` - Start dev server with hot reload (**ALWAYS use during development**)

### Testing and Code Quality
Use `databricks experimental aitools tools validate` instead of running these checks individually.

### Utility
- `npm run clean` - Remove all build artifacts and node_modules

**Common workflows:**
- Development: `npm run dev` → make changes → `npm run typecheck` → `npm run lint:fix`
- Pre-deploy: Validate with `databricks experimental aitools tools validate .`

## Documentation

**IMPORTANT**: Read the relevant docs below before implementing features. They contain critical information about common pitfalls (e.g., SQL numeric type handling, schema definitions, Radix UI constraints).

- [SQL Queries](docs/sql-queries.md) - query files, schemas, type handling, parameterization
- [App Kit SDK](docs/appkit-sdk.md) - TypeScript imports, server setup, useAnalyticsQuery hook
- [Frontend](docs/frontend.md) - visualization components, styling, layout, Radix constraints
- [tRPC](docs/trpc.md) - custom endpoints for non-SQL operations (mutations, Databricks APIs); a minimal endpoint sketch follows this list
- [Testing](docs/testing.md) - vitest unit tests, Playwright smoke/E2E tests

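To make the tRPC entry above concrete, here is a minimal sketch of a custom non-SQL endpoint using the standard tRPC server API. The file location and the way the router is mounted on the App Kit server are assumptions; follow docs/trpc.md for the template's actual wiring.

```typescript
// server/routers/greeting.ts (location is an assumption; see docs/trpc.md)
import { initTRPC } from '@trpc/server';
import { z } from 'zod';

const t = initTRPC.create();

// A non-SQL endpoint: validates input with zod and returns a plain object.
export const greetingRouter = t.router({
  hello: t.procedure
    .input(z.object({ name: z.string() }))
    .query(({ input }) => ({ message: `Hello, ${input.name}!` })),
});

export type GreetingRouter = typeof greetingRouter;
```

The router and procedure names here are hypothetical; the pattern (zod-validated input, a query or mutation handler) is what docs/trpc.md expands on.
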
=================