# Prepare

Install: `npx skills add https://github.com/deepread-tech/skills --skill prepare`
You are DeepRead’s tech lead. The developer is about to start a task. Your job is to analyze the scope, build a checklist of everything they must not miss, and tell them which skills to run and when.
## Input
The task description: $ARGUMENTS
If no arguments are provided, ask what the developer is working on.
## Step 1: Classify the Task
Determine which categories apply (can be multiple):
| Category | Signal |
|---|---|
| Pipeline | New/modified node, tool, graph, state change |
| API | New/modified endpoint, request/response model |
| Database | New table, column, migration, model change |
| Service | New/modified service (auth, storage, billing, AI models) |
| Feature | User-facing capability spanning multiple layers |
| Bug Fix | Fix to existing behavior |
| Refactor | Structural change, no new behavior |
| Config | CI/CD, dependencies, environment, Makefile |
## Step 2: Map the Blast Radius
Based on the category, identify every file and layer that will be touched.
Start with the docs – they describe the architecture and file paths:

- `docs/architecture/overview.md` – directory structure, layer descriptions
- `docs/architecture/pipelines.md` – pipeline node/tool/graph patterns
- `docs/architecture/process-flow.md` – pipeline execution flow
- `AGENTS.md` – code patterns, key file paths, service descriptions
- `docs/api/reference.md` – API endpoints
- `docs/development/migrations.md` – migration process
- `docs/development/testing.md` – test structure
Only read source code when the docs don't answer something specific (e.g., checking current state keys in `src/pipelines/state.py`, or seeing what models exist in `src/api/models.py`).
### Pipeline Work

- `src/pipelines/state.py` – new state keys?
- `src/pipelines/nodes/` – new or modified node
- `src/pipelines/tools/` – new utility needed?
- `src/pipelines/graphs/` – wire the node into a graph

### API Work

- `src/api/models.py` – request/response models (source of truth)
- `src/api/v1/routes.py` – user-facing routes
- `src/api/dashboard/v1/` – dashboard routes
- `src/services/` – business logic behind the endpoint

### Database Work

- `src/core/models.py` – SQLAlchemy model
- `supabase/migrations/` – migration SQL file
- `src/api/models.py` – if the field is API-exposed

### Service Work

- `src/services/` – service implementation
- `src/core/config.py` – new env vars?
- `src/core/exceptions.py` – new exception types?

### Feature (spans layers)

Map each layer it touches using the categories above. Features typically hit API + Service + possibly Database + possibly Pipeline.
## Step 3: Build the Checklist

Create a checklist specific to this task. Include items from ALL relevant categories.
### Always Include

- Read existing code in the area before writing anything
- Follow absolute imports (`from src.module import thing`)
- Full type annotations on all functions
- `logger = logging.getLogger(__name__)` in new files
- No bare `except:` – specify exception types
- No `print()` in `src/`
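A minimal sketch of these conventions in one place; the module path, the `DocumentService` import, and the `fetch_document` helper are hypothetical and only illustrate the pattern:

```python
# Hypothetical module (e.g. src/services/document_lookup.py), for illustration only.
import logging

from src.services.document_service import DocumentService  # absolute import (hypothetical module)

logger = logging.getLogger(__name__)  # module-level logger in every new file


async def fetch_document(service: DocumentService, document_id: str) -> dict[str, str]:
    """Fully type-annotated; no print(), no bare except."""
    try:
        return await service.get(document_id)
    except (TimeoutError, ValueError) as exc:  # specific exception types, never a bare `except:`
        logger.warning("fetch_document failed for %s: %s", document_id, exc)
        return {}
```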
### If Pipeline

- Node is `async` with `@traceable(name="...")` decorator
- Node takes `PipelineState`, returns partial `dict`
- `step_timings` tracked (start/elapsed/update pattern)
- New state keys added to `PipelineState` in `state.py`
- Tools are pure – no LLM calls, no service imports
- `asyncio.Semaphore` if processing pages in parallel
- Cost tracking via `cost_tracking.py` for LLM calls
- Node wired into graph with correct edges
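A sketch of a node that satisfies the items above, assuming `@traceable` comes from LangSmith and that `PipelineState` in `src/pipelines/state.py` is a dict-like mapping; the node name and the `pages`/`sections` state keys are hypothetical:

```python
import time

from langsmith import traceable

from src.pipelines.state import PipelineState  # project-specific; assumed to be dict-like


@traceable(name="extract_sections")  # hypothetical node name
async def extract_sections_node(state: PipelineState) -> dict:
    """Takes the full PipelineState, returns only the keys it updates (a partial dict)."""
    start = time.monotonic()

    # Hypothetical state keys: "pages" in, "sections" out.
    sections = [page.strip() for page in state.get("pages", [])]

    step_timings = dict(state.get("step_timings", {}))
    step_timings["extract_sections"] = time.monotonic() - start
    return {"sections": sections, "step_timings": step_timings}
```

If the node fans out over pages, cap concurrency with an `asyncio.Semaphore` around the per-page coroutines.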
### If API

- Request/response models in `src/api/models.py`
- Route follows existing patterns (auth, error handling)
- Rate limiting applied if user-facing
- Proper HTTP status codes
- Response model matches what the frontend expects
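A sketch of the route/model split, assuming a FastAPI app (the docs reference routes and response models, but the framework is not confirmed here); the model, route, and service names are hypothetical:

```python
from fastapi import APIRouter, Depends, HTTPException, status
from pydantic import BaseModel

from src.services.summary_service import SummaryService, get_summary_service  # hypothetical

router = APIRouter()


class DocumentSummaryResponse(BaseModel):
    """Response models belong in src/api/models.py; defined inline here for brevity."""

    document_id: str
    summary: str


@router.get("/documents/{document_id}/summary", response_model=DocumentSummaryResponse)
async def get_document_summary(
    document_id: str,
    service: SummaryService = Depends(get_summary_service),  # business logic stays in src/services/
) -> DocumentSummaryResponse:
    summary = await service.get_summary(document_id)
    if summary is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Document not found")
    return DocumentSummaryResponse(document_id=document_id, summary=summary)
```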
### If Database

- SQLAlchemy model updated in `src/core/models.py`
- Migration file: `supabase/migrations/YYYYMMDDHHMMSS_name.sql`
- `IF NOT EXISTS` / `IF EXISTS` for idempotency
- RLS enabled on tables with `user_id`
- Indexes on foreign keys and query columns
- `TIMESTAMPTZ`, not `TIMESTAMP`
- `JSONB`, not `JSON`
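On the SQLAlchemy side, the `TIMESTAMPTZ` and `JSONB` items map to PostgreSQL-specific column types; the table below is hypothetical, and the matching `supabase/migrations/` SQL file (with `IF NOT EXISTS`, RLS, and indexes) still has to be written separately:

```python
from datetime import datetime
from uuid import UUID, uuid4

from sqlalchemy import DateTime, ForeignKey
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):  # the real Base already lives in src/core/models.py
    pass


class DocumentNote(Base):  # hypothetical table, for illustration only
    __tablename__ = "document_notes"

    id: Mapped[UUID] = mapped_column(primary_key=True, default=uuid4)
    user_id: Mapped[UUID] = mapped_column(ForeignKey("users.id"), index=True)  # indexed FK; RLS goes in the migration
    payload: Mapped[dict] = mapped_column(JSONB, default=dict)                 # JSONB, not JSON
    created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True))      # TIMESTAMPTZ, not TIMESTAMP
```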
### If Service

- Service handles its own errors (try/except with logging)
- External calls have timeouts
- New env vars added to `src/core/config.py`
- Service is injectable/mockable for testing
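A sketch of the shape such a service tends to take, assuming `httpx` for outbound calls and a hypothetical `settings.EXAMPLE_API_URL` value from `src/core/config.py`:

```python
import logging

import httpx

from src.core.config import settings  # hypothetical settings attribute: EXAMPLE_API_URL

logger = logging.getLogger(__name__)


class ExampleService:
    """Owns its errors, sets explicit timeouts, and accepts an injected client for tests."""

    def __init__(self, client: httpx.AsyncClient | None = None) -> None:
        self._client = client or httpx.AsyncClient(timeout=httpx.Timeout(10.0))

    async def fetch_status(self, resource_id: str) -> dict | None:
        try:
            response = await self._client.get(f"{settings.EXAMPLE_API_URL}/status/{resource_id}")
            response.raise_for_status()
            return response.json()
        except (httpx.TimeoutException, httpx.HTTPStatusError) as exc:
            logger.error("ExampleService.fetch_status failed for %s: %s", resource_id, exc)
            return None
```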
### If Config/CI

- `make quick-check` still passes
- CI workflow updated if new test markers or steps needed
### Testing (Always)

- Unit tests for new functions (mocked dependencies)
- Integration tests if there is multi-component interaction
- Use existing fixtures from `tests/conftest.py`
- `@pytest.mark.unit` or `@pytest.mark.integration` on every test
- `@pytest.mark.asyncio` for async tests
- Tests pass: `uv run pytest <test_file> -v`
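A sketch of the marker and mocking pattern, reusing the hypothetical `ExampleService` from the service sketch above:

```python
from unittest.mock import AsyncMock

import httpx
import pytest

from src.services.example_service import ExampleService  # hypothetical module from the sketch above


@pytest.mark.unit
@pytest.mark.asyncio
async def test_fetch_status_returns_none_on_timeout() -> None:
    # Dependencies are mocked; shared fixtures from tests/conftest.py can replace this setup.
    client = AsyncMock()
    client.get.side_effect = httpx.TimeoutException("simulated timeout")

    service = ExampleService(client=client)
    assert await service.fetch_status("doc-123") is None
```

Run it with `uv run pytest <test_file> -v`, as listed above.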
### Documentation (Always)

- Update relevant docs if behavior/architecture changed
- Update `AGENTS.md` if new patterns are introduced
## Step 4: Recommend Skills
Based on the task, recommend which skills to run and when:
| When | Skill | Reason |
|---|---|---|
| After coding | `/test-gen <file>` | Generate tests for new code |
| After pipeline work | `/pipeline-check` | Validate node contracts and tool purity |
| After coding | `/enforce` | Catch pattern violations |
| If DB changed | `/migrate` | Create migration properly |
| If API changed | `/sync-repos` | Check cross-repo impact |
| Before commit | `/pre-commit` | Final go/no-go |
Only recommend skills that are relevant to this specific task.
## Output Format
## Prepare: [short task title]
### Scope
[1-2 sentence summary of what this task involves]
### Category
[Pipeline / API / Database / Service / Feature / Bug Fix / Refactor]
### Files to Touch
- `path/to/file.py` – what to do here
- `path/to/other.py` – what to do here
### Checklist
- [ ] item 1
- [ ] item 2
- [ ] ...
### Skills to Run
1. After coding – `/test-gen path/to/new_file.py`
2. After coding – `/pipeline-check` (if pipeline)
3. Before commit – `/pre-commit`
### Watch Out For
[Anything tricky or easy to miss for this specific task]
## Rules
- Read docs first, code second. The `docs/` directory and `AGENTS.md` describe architecture, file paths, and patterns. Only read source code when the docs don't answer something specific. This saves tokens and is faster.
- Be specific – reference real file names, real function names, real state keys.
- Don't over-scope – only include checklist items relevant to this task.
- Surface gotchas – warn about things that are easy to miss for this specific task.