databricks-apps
npx skills add https://github.com/databricks/databricks-agent-skills --skill databricks-apps
Databricks Apps Development
FIRST: Use the parent databricks skill for CLI basics, authentication, and profile selection.
Build apps that deploy to Databricks Apps platform.
Required Reading by Phase
| Phase | READ BEFORE proceeding |
|---|---|
| Scaffolding | Parent databricks skill (auth, warehouse discovery); run `databricks apps manifest` and use its plugins/resources to build `databricks apps init` with `--features` and `--set` (see AppKit section below) |
| Writing SQL queries | SQL Queries Guide |
| Writing UI components | Frontend Guide, Using useAnalyticsQuery, AppKit SDK |
| Adding API endpoints | tRPC Guide |
Generic Guidelines
These apply regardless of framework:
- Deployment: `databricks apps deploy --profile <PROFILE>` (⚠️ USER CONSENT REQUIRED)
- Validation: `databricks apps validate --profile <PROFILE>` before deploying
- App name: must be ≤26 characters, lowercase letters/numbers/hyphens only (no underscores). The `dev-` prefix adds 4 chars, max 30 total.
- Smoke tests: ALWAYS update `tests/smoke.spec.ts` selectors BEFORE running validation. The default template checks for a "Minimal Databricks App" heading and "hello world" text; these WILL fail in your custom app. See the testing guide.
- Authentication: covered by the parent `databricks` skill
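The naming rules above are easy to get wrong, so a pre-flight check helps. A minimal sketch (hypothetical helper, not part of the CLI or AppKit) that encodes the ≤26-char, lowercase/numbers/hyphens, and `dev-` prefix constraints:

```typescript
// Hypothetical pre-flight check for the app-name rules above:
// <=26 chars, lowercase letters/numbers/hyphens only (no underscores);
// the "dev-" prefix adds 4 chars for a 30-char deployed maximum.
function validateAppName(name: string, devPrefix = false): string[] {
  const errors: string[] = [];
  if (name.length > 26) {
    errors.push(`name is ${name.length} chars, max is 26`);
  }
  if (!/^[a-z0-9-]+$/.test(name)) {
    errors.push("only lowercase letters, numbers, and hyphens are allowed");
  }
  const deployed = devPrefix ? `dev-${name}` : name;
  if (deployed.length > 30) {
    errors.push(`deployed name "${deployed}" exceeds 30 chars`);
  }
  return errors;
}

console.log(validateAppName("sales-dashboard", true)); // → []
console.log(validateAppName("My_Sales_Dashboard_App_2024")); // rule violations
```

Run a check like this before `databricks apps init`; a rejected name costs a full scaffold round-trip.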
Project Structure (after databricks apps init --features analytics)
- `client/src/App.tsx` → main React component (start here)
- `config/queries/*.sql` → SQL query files (queryKey = filename without .sql)
- `server/server.ts` → backend entry (tRPC routers)
- `tests/smoke.spec.ts` → smoke test (⚠️ MUST UPDATE selectors for your app)
- `client/src/appKitTypes.d.ts` → auto-generated types (`npm run typegen`)
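The queryKey convention (filename without `.sql`) can be sketched as a one-liner. This is a hypothetical helper for illustration only; AppKit's typegen derives the keys for you:

```typescript
// queryKey = SQL filename without its .sql extension (per the convention above).
// Hypothetical helper; AppKit's `npm run typegen` does this automatically.
function queryKeyFor(path: string): string {
  const file = path.split("/").pop() ?? path;
  return file.replace(/\.sql$/, "");
}

console.log(queryKeyFor("config/queries/daily_revenue.sql")); // → "daily_revenue"
```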
Data Discovery
Before writing any SQL, use the parent databricks skill for data exploration: search information_schema by keyword, then batch discover-schema for the tables you need. Do NOT skip this step.
Development Workflow (FOLLOW THIS ORDER)
1. Create SQL files in `config/queries/`
2. Run `npm run typegen` and verify all queries show ✓
3. Read `client/src/appKitTypes.d.ts` to see the generated types
4. THEN write `App.tsx` using the generated types
5. Update `tests/smoke.spec.ts` selectors
6. Run `databricks apps validate --profile <PROFILE>`
DO NOT write UI code before running typegen: the types won't exist yet and you'll waste time on compilation errors.
When to Use What
- Read data → display in chart/table: use visualization components with the `queryKey` prop
- Read data → custom display (KPIs, cards): use the `useAnalyticsQuery` hook
- Read data → computation needed before display: still use `useAnalyticsQuery`, transform client-side
- Call an ML model endpoint: use tRPC
- Write/update data (INSERT/UPDATE/DELETE): use tRPC
- ⚠️ NEVER use tRPC to run SELECT queries; always use SQL files in `config/queries/`
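For the "computation before display" case, the SQL file does the reading and the transform runs in the component. A sketch of the client-side step only, with an assumed row shape (the rows would come from `useAnalyticsQuery`; its exact signature is in the official docs, not here):

```typescript
// Assumed row shape for a query in config/queries/ (hypothetical fields).
interface RevenueRow {
  region: string;
  revenue: number;
}

// Client-side computation before display (KPI-style aggregate):
// the SQL file reads the data; this pure transform runs in the component.
function totalByRegion(rows: RevenueRow[]): Record<string, number> {
  return rows.reduce<Record<string, number>>((acc, row) => {
    acc[row.region] = (acc[row.region] ?? 0) + row.revenue;
    return acc;
  }, {});
}

console.log(totalByRegion([
  { region: "EMEA", revenue: 100 },
  { region: "AMER", revenue: 250 },
  { region: "EMEA", revenue: 50 },
])); // → { EMEA: 150, AMER: 250 }
```

Keeping the transform pure like this also makes it trivially unit-testable, independent of the hook.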
Frameworks
AppKit (Recommended)
TypeScript/React framework with type-safe SQL queries and built-in components.
Official Documentation, the source of truth for all API details:

```sh
npx @databricks/appkit docs          # ALWAYS start here to see available pages
npx @databricks/appkit docs <path>   # then use paths from the index
```

DO NOT guess doc paths. Run without args first, pick from the index. Docs are the authority on component props, hook signatures, and server APIs; skill files only cover anti-patterns and gotchas.
App Manifest and Scaffolding
Agent workflow for scaffolding: get the manifest first, then build the init command.

1. Get the manifest (JSON schema describing plugins and their resources):

   ```sh
   databricks apps manifest --profile <PROFILE>
   # Custom template:
   databricks apps manifest --template <GIT_URL> --profile <PROFILE>
   ```

   The output defines:
   - Plugins: each has a key (the plugin ID for `--features`), plus `requiredByTemplate` and `resources`.
   - requiredByTemplate: if true, the plugin is mandatory for this template: do not add it to `--features` (it is included automatically), but you must still supply all of its required resources via `--set`. If false or absent, the plugin is optional: add it to `--features` only when the user's prompt indicates they want that capability (e.g. analytics/SQL), and then supply its required resources via `--set`.
   - Resources: each plugin has `resources.required` and `resources.optional` (arrays). Each item has a `resourceKey` and `fields` (object: field name → description/env). Use `--set <plugin>.<resourceKey>.<field>=<value>` for each required resource field of every plugin you include.

2. Scaffold (DO NOT use `npx`; use the CLI only):

   ```sh
   databricks apps init --name <NAME> --features <plugin1>,<plugin2> \
     --set <plugin1>.<resourceKey>.<field>=<value> \
     --set <plugin2>.<resourceKey>.<field>=<value> \
     --description "<DESC>" --run none --profile <PROFILE>
   # --run none: skip auto-run after scaffolding (review code first)
   # With custom template:
   databricks apps init --template <GIT_URL> --name <NAME> --features ... --set ... --profile <PROFILE>
   ```

   - Required: `--name`, `--profile`. Name: ≤26 chars, lowercase letters/numbers/hyphens only. Use `--features` only for optional plugins the user wants (plugins with `requiredByTemplate: false` or absent); mandatory plugins must not be listed in `--features`.
   - Resources: pass `--set` for every required resource (each field in `resources.required`) for (1) all plugins with `requiredByTemplate: true`, and (2) any optional plugins you added to `--features`. Add `--set` for `resources.optional` only when the user requests them.
   - Discovery: use the parent `databricks` skill to resolve IDs (e.g. warehouse: `databricks warehouses list --profile <PROFILE>` or `databricks experimental aitools tools get-default-warehouse --profile <PROFILE>`).

DO NOT guess plugin names, resource keys, or property names; always derive them from `databricks apps manifest` output. Example: if the manifest shows plugin `analytics` with a required resource `resourceKey: "sql-warehouse"` and `fields: { "id": ... }`, include `--set analytics.sql-warehouse.id=<ID>`.
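The manifest-to-flags rules above can be sketched as a small helper that derives `--features` and `--set` arguments from a parsed manifest. The `ManifestPlugin` shape below is an assumption reconstructed from the field names described above, not the actual schema; always confirm against real `databricks apps manifest` output:

```typescript
// Assumed manifest shape, per the fields described above; verify against
// real `databricks apps manifest` output before relying on it.
interface ManifestPlugin {
  key: string;
  requiredByTemplate?: boolean;
  resources: {
    required: { resourceKey: string; fields: Record<string, string> }[];
    optional?: { resourceKey: string; fields: Record<string, string> }[];
  };
}

// Build --features/--set args: mandatory plugins are never listed in
// --features but still need --set for every required resource field;
// optional plugins are included only when the user asked for them.
function buildInitArgs(
  plugins: ManifestPlugin[],
  wanted: string[],               // optional plugin keys the user wants
  values: Record<string, string>, // "<plugin>.<resourceKey>.<field>" -> value
): string[] {
  const args: string[] = [];
  const features = wanted.filter((key) =>
    plugins.some((p) => p.key === key && !p.requiredByTemplate),
  );
  if (features.length > 0) args.push("--features", features.join(","));
  for (const plugin of plugins) {
    if (!plugin.requiredByTemplate && !features.includes(plugin.key)) continue;
    for (const res of plugin.resources.required) {
      for (const field of Object.keys(res.fields)) {
        const path = `${plugin.key}.${res.resourceKey}.${field}`;
        args.push("--set", `${path}=${values[path] ?? "<MISSING>"}`);
      }
    }
  }
  return args;
}

// With the analytics example from the text (warehouse ID is made up):
const args = buildInitArgs(
  [{
    key: "analytics",
    resources: {
      required: [{ resourceKey: "sql-warehouse", fields: { id: "warehouse id" } }],
    },
  }],
  ["analytics"],
  { "analytics.sql-warehouse.id": "abc123" },
);
console.log(args);
// → ["--features", "analytics", "--set", "analytics.sql-warehouse.id=abc123"]
```

A `<MISSING>` sentinel in the output tells you which resource values still need discovery before running `databricks apps init`.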
READ AppKit Overview for project structure, workflow, and pre-implementation checklist.
Common Scaffolding Mistakes
```sh
# ❌ WRONG: name is NOT a positional argument
databricks apps init --features analytics my-app-name
# → "unknown command" error

# ✅ CORRECT: use --name flag
databricks apps init --name my-app-name --features analytics --set "..." --profile <PROFILE>
```
Directory Naming
`databricks apps init` creates directories in kebab-case matching the app name.
App names must be lowercase with hyphens only (≤26 chars).
Other Frameworks
Databricks Apps supports any framework that can run as a web server (Flask, FastAPI, Streamlit, Gradio, etc.). Use standard framework documentation – this skill focuses on AppKit.