**deepdive** · Total installs: 2 · Weekly installs: 1 · Site rank: #70769

Install command:

```shell
npx skills add https://github.com/tosi-n/deepdive --skill deepdive
```

Installs by agent: mcpjam (1), openhands (1), junie (1), windsurf (1), crush (1)

Skill documentation:
# DeepDive – Universal Data Agent
DeepDive transforms natural language into database queries, generates visualizations, and learns from user corrections to improve over time.
## Quick Start

```shell
# Setup (creates .deepdive/ directory)
@deepdive init

# Configure database in .deepdive/.env
DATABASE_URL=postgresql://user:pass@localhost:5432/db

# Query data
@deepdive query "show top 10 customers by revenue"

# Visualize schema
@deepdive visualize schema

# Create chart
@deepdive chart "monthly revenue over last 6 months"
```
## Core Commands

### Database Connection

- `@deepdive init` – Initialize .deepdive/ directory with .env template
- `@deepdive connect <type>` – Set up connection (postgres|mysql|sqlite|bigquery|snowflake)

### Natural Language Queries

- `@deepdive query "<question>"` – Convert question to SQL and execute
- `@deepdive preview "<query>"` – Show results as markdown table (limit 100 rows)

### Visualization

- `@deepdive visualize schema` – Generate ERD diagram (.deepdive/diagrams/)
- `@deepdive visualize lineage` – Show table relationships
- `@deepdive chart "<question>"` – Generate Vega-Lite chart (.deepdive/charts/)

### Learning & Safety

- `@deepdive learn` – View/update learned corrections
- `@deepdive history` – Show recent queries
- `@deepdive safe-mode [on|off]` – Require confirmation for writes (default: on)
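The learned corrections live in `.deepdive/memory.json` and are applied to future questions. The actual file schema isn't documented here, so the sketch below assumes a flat phrase-to-replacement mapping (e.g. `{"revenue": "net_revenue"}`); treat both the shape and the `apply_corrections` helper as illustrative, not the skill's real implementation:

```python
import json
import re

def apply_corrections(question, memory_path=".deepdive/memory.json"):
    """Rewrite a question using learned term corrections.

    Assumes memory.json maps a phrase to its corrected form, e.g.
    {"revenue": "net_revenue"}. This is a sketch; the real file
    layout used by @deepdive learn may differ.
    """
    try:
        with open(memory_path) as f:
            corrections = json.load(f)
    except FileNotFoundError:
        return question  # no memory yet: pass the question through unchanged
    for wrong, right in corrections.items():
        # Whole-word, case-insensitive replacement
        question = re.sub(rf"\b{re.escape(wrong)}\b", right, question,
                          flags=re.IGNORECASE)
    return question
```

Applying corrections before the NL-to-SQL step means a one-time fix ("revenue means net_revenue here") keeps paying off in every later query.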
## Project Structure

DeepDive creates and manages:

```
.deepdive/
├── .env           # Database credentials (user-managed)
├── memory.json    # Learned corrections (per-project)
├── diagrams/      # Generated .mmd files
│   ├── schema-YYYYMMDD.mmd
│   └── erd-YYYYMMDD.mmd
├── charts/        # Generated .png/.svg files
│   └── chart-XXX.png
└── queries.log    # Query history
```
## Supported Databases
- PostgreSQL – Full support with advanced features
- MySQL – Standard SQL support
- SQLite – File-based, perfect for local/dev
- BigQuery – Google Cloud, large-scale analytics
- Snowflake – Cloud data warehouse
- Redshift – AWS analytics
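The exact connection-string formats that `@deepdive connect` accepts aren't documented in this section; the `.deepdive/.env` sketch below uses common URL conventions (SQLAlchemy-style dialects for the warehouses), so verify the warehouse forms against references/connectors.md before relying on them:

```shell
# Standard URL forms
DATABASE_URL=postgresql://user:pass@localhost:5432/db
# DATABASE_URL=mysql://user:pass@localhost:3306/db
# DATABASE_URL=sqlite:///./local.db

# Warehouse URLs below are assumptions following SQLAlchemy-dialect
# conventions; check references/connectors.md for the supported syntax
# DATABASE_URL=snowflake://user:pass@account/db/schema?warehouse=wh
# DATABASE_URL=bigquery://project/dataset
```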
## Reference Documentation
Read these files based on the task:
- Database Connections: references/connectors.md
- Natural Language Queries: references/nl-to-sql.md
- Schema Introspection: references/schema-introspection.md
- Mermaid Visualization: references/mermaid-viz.md
- Vega-Lite Charts: references/vega-charts.md
- User Learning: references/user-learning.md
- Write Protection: references/write-protection.md
- Examples: references/examples.md
## Usage Patterns

### Data Exploration

User: "What tables are in this database?"
→ @deepdive schema introspection

User: "Show me the customer table structure"
→ @deepdive schema customers

### Querying

User: "Which customers bought something last month?"
→ Natural language → SQL → Execute → Results

User: "Chart monthly revenue"
→ Query → Vega-Lite spec → .deepdive/charts/revenue.png

### Visualization

User: "Visualize the database schema"
→ Mermaid ERD → .deepdive/diagrams/schema.mmd → Open in browser

User: "Show relationships between tables"
→ Foreign key analysis → Lineage diagram
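The Query → Vega-Lite step boils down to turning result rows into a chart spec. A minimal sketch of that translation is below; the field names and the helper are hypothetical, and the bundled scripts/generate_chart.py may structure its specs differently:

```python
def bar_chart_spec(rows, x_field, y_field, title=""):
    """Build a minimal Vega-Lite v5 bar-chart spec from query result rows.

    Illustrative sketch only; the skill's own chart generator may emit
    richer specs (tooltips, sorting, themes).
    """
    return {
        "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
        "title": title,
        "data": {"values": rows},  # inline the query results
        "mark": "bar",
        "encoding": {
            "x": {"field": x_field, "type": "ordinal"},
            "y": {"field": y_field, "type": "quantitative"},
        },
    }
```

Because the spec is plain JSON, it can be written next to the rendered .png in .deepdive/charts/ and diffed like any other file.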
## Key Principles

- Environment Variables: All credentials in .deepdive/.env (never hardcoded)
- Write Protection: INSERT/UPDATE/DELETE require explicit confirmation (unless safe-mode is off)
- Learning: Corrections stored in .deepdive/memory.json and applied to future queries
- Project Scope: Each project has isolated memory, diagrams, and charts
- Static Outputs: All visualizations are files (.mmd, .png) for version control
## Examples

### RevOps Scenario

```shell
@deepdive connect postgres
@deepdive query "qualified opportunities by stage this quarter"
@deepdive chart "conversion funnel"
@deepdive visualize lineage
```

### Video Production Scenario

```shell
@deepdive connect sqlite  # For document analysis
@deepdive query "projects due this week from documents table"
@deepdive chart "project timeline"
```
## Scripts

Python scripts for reliable operations:

- `scripts/generate_mermaid.py` – Generate schema/ERD diagrams
- `scripts/generate_chart.py` – Create Vega-Lite charts
- `scripts/validate_query.py` – SQL safety validation

Execute these scripts rather than rewriting their code, so results stay deterministic.
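To make the diagram step concrete, here is a sketch of what scripts/generate_mermaid.py conceptually does: render introspected schema data as a Mermaid `erDiagram`. The input shape and helper name are assumptions for illustration, not the script's actual interface:

```python
def schema_to_mermaid(tables, foreign_keys):
    """Render introspected schema info as Mermaid erDiagram text.

    Assumed input shape (illustrative): `tables` maps table name ->
    {column: type}; `foreign_keys` is a list of (child, parent) pairs.
    """
    lines = ["erDiagram"]
    for table, columns in tables.items():
        lines.append(f"    {table} {{")
        for col, typ in columns.items():
            lines.append(f"        {typ} {col}")
        lines.append("    }")
    for child, parent in foreign_keys:
        # One parent row relates to many child rows via the foreign key
        lines.append(f"    {parent} ||--o{{ {child} : has")
    return "\n".join(lines)
```

The resulting text is exactly what gets written to .deepdive/diagrams/schema-YYYYMMDD.mmd, which is why the outputs version-control cleanly.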
## Safety & Privacy
- Read-only by default for exploration
- Write operations require confirmation
- Credentials never logged or shared
- All data stays local (no cloud API calls)
- Query history stored locally only