quality-verify-integration

Total installs: 4
Weekly installs: 4
Site rank: #48393

Install command:
npx skills add https://github.com/dawiddutoit/custom-claude --skill quality-verify-integration

Installs by agent: mcpjam (4), neovate (4), gemini-cli (4), antigravity (4), windsurf (4), zencoder (4)

Skill Documentation
Verify Integration
Purpose
Prevent “done but not integrated” failures where:
- Code exists ✅
- Unit tests pass ✅
- Quality gates pass ✅
- But code is never called at runtime ❌
When to Use
MANDATORY before:
- Moving ADR to 3_completed/
- Marking todo tasks complete for integration work
- Claiming a feature is “done” or “working”
- Creating a PR for integration changes
The CCV Principle
COMPLETE = CREATION + CONNECTION + VERIFICATION
| Phase | What It Proves | Required Evidence |
|---|---|---|
| CREATION | Artifact exists | File, tests, types |
| CONNECTION | Wired into system | Import, registration |
| VERIFICATION | Works at runtime | Logs, output |
Missing any phase = NOT complete
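As a toy illustration of the three phases (all names here are hypothetical, not from the real codebase), note that CREATION, CONNECTION, and VERIFICATION are three distinct acts in code: defining a handler, registering it, and observing it run.

```python
# Toy sketch of CREATION vs CONNECTION vs VERIFICATION.
# HANDLERS, register, and export_report are illustrative names only.

HANDLERS = {}  # the system's registry: the CONNECTION target

def register(name):
    """Wire a handler into the registry so the system can reach it."""
    def wrap(fn):
        HANDLERS[name] = fn  # CONNECTION happens here
        return fn
    return wrap

# CREATION: the artifact exists; its unit tests could pass in isolation...
def export_report(data):
    return f"report({len(data)} rows)"

# ...but it is orphaned until this registration line actually runs:
register("export")(export_report)

# VERIFICATION: prove it executes at runtime through the entry point.
result = HANDLERS["export"]([1, 2, 3])
assert result == "report(3 rows)"
```

Skipping the registration line leaves a module that exists and tests green, yet is unreachable at runtime, which is exactly the failure this skill guards against.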
The Four Questions Test
Before “done”, answer ALL FOUR:
- How do I trigger this? (entry point)
- What connects it to the system? (import/registration)
- What proves it runs? (logs/traces)
- What shows it works? (outcome)
Cannot answer all four? → NOT COMPLETE
Quick Verification
Step 1: Check for Orphaned Modules
# Run orphan detection script
./scripts/verify_integration.sh
# Or manual check for specific module
grep -r "from.*module_name import\|import.*module_name" src/ --include="*.py" | grep -v test
If no matches → Module NOT integrated
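The same orphan check can be scripted in pure Python. This is a minimal sketch of what the grep above does, not the contents of the real scripts/verify_integration.sh, and it inherits the same loose substring matching as the grep:

```python
# Sketch: find modules under src/ that no non-test file imports.
# Mirrors the manual grep check; the real verify_integration.sh may differ.
import re
from pathlib import Path

def orphaned_modules(src: Path) -> list[str]:
    """Return module names that appear in no other file's import lines."""
    files = [p for p in src.rglob("*.py") if "test" not in p.name]
    sources = {p: p.read_text() for p in files}
    orphans = []
    for p in files:
        name = p.stem
        if name == "__init__":
            continue
        # Loose pattern, same spirit as the grep above.
        pattern = re.compile(rf"(from\s+\S*{name}\s+import|import\s+\S*{name})")
        if not any(pattern.search(text)
                   for q, text in sources.items() if q != p):
            orphans.append(name)
    return orphans
```

Entry-point modules will legitimately show up as "orphans" here, so results still need a human pass before declaring a module unintegrated.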
Step 2: Verify Call-Sites
# Check if function/class is actually called
grep -r "function_name\|ClassName" src/ --include="*.py" | grep -v "^def \|^class " | grep -v test
If no matches → Code never called
Step 3: Check Context-Specific Integration
LangGraph Nodes:
grep -n "from.*nodes import\|add_node.*name" src/temet_run/coordination/graph/builder.py
DI Services:
grep -n "providers.Singleton\|providers.Factory" src/temet_run/container.py
CLI Commands:
grep -n "add_command\|@app.command" src/temet_run/cli/app.py
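The three context-specific greps share one shape: scan a known wiring file for a registration pattern and fail loudly if nothing matches. A small stdlib helper can wrap that (the file paths above come from this repo; the helper itself is only a sketch):

```python
# Sketch of a reusable registration check, mirroring `grep -n`.
import re
from pathlib import Path

def check_registration(path: str, patterns: list[str]) -> list[str]:
    """Return 'lineno: line' for every line matching any pattern."""
    regexes = [re.compile(p) for p in patterns]
    hits = []
    for n, line in enumerate(Path(path).read_text().splitlines(), start=1):
        if any(r.search(line) for r in regexes):
            hits.append(f"{n}: {line.strip()}")
    return hits

# Example use for the DI check (raises if the service is never wired):
# hits = check_registration("src/temet_run/container.py",
#                           [r"providers\.(Singleton|Factory)"])
# if not hits:
#     raise SystemExit("service NOT registered in the container")
```

An empty result from any of the three checks means the artifact exists but is not connected, i.e. the CONNECTION phase is unproven.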
Step 4: Demand Runtime Proof
Require one of:
- Integration test output showing component exists
- Logs from execution showing code ran
- State inspection showing fields populated
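One lightweight way to capture the "logs from execution" evidence is stdlib logging with a handler you can assert on. The logger name and pipeline function below are illustrative stand-ins, not real project code:

```python
# Sketch: collect runtime log evidence with a list-backed handler.
import logging

class ListHandler(logging.Handler):
    """Collects messages so a test or report can assert on them."""
    def __init__(self):
        super().__init__()
        self.messages = []
    def emit(self, record):
        self.messages.append(record.getMessage())

logger = logging.getLogger("temet_run.demo")  # hypothetical logger name

def run_pipeline() -> dict:
    """Stand-in for the integrated code path under verification."""
    logger.info("pipeline started")
    state = {"status": "complete"}
    logger.info("pipeline finished")
    return state

handler = ListHandler()
logger.addHandler(handler)
logger.setLevel(logging.INFO)

state = run_pipeline()
assert "pipeline started" in handler.messages  # execution proof
assert state["status"] == "complete"           # outcome proof
```

The two assertions map directly onto questions 3 and 4 of the Four Questions Test: the log proves the code ran, the state proves it worked.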
Connection Patterns
See references/connection-patterns.md for:
- LangGraph node integration
- Dependency Injection patterns
- CLI command registration
- API endpoint wiring
- Configuration loading
Verification Report Template
## Integration Verification: [Feature Name]
### CCV Status
**CREATION:** ✅ / ❌
- Files: [list]
- Tests: [count] passing
- Types: mypy passes
**CONNECTION:** ✅ / ❌
- Import location: [file:line]
- Registration: [how wired]
- Entry point: [how triggered]
**VERIFICATION:** ✅ / ❌
- Integration test: [pass/fail]
- Runtime logs: [attached/missing]
- Expected outcome: [observed/not observed]
### Four Questions
1. Trigger: [answer or UNANSWERED]
2. Connection: [answer or UNANSWERED]
3. Execution proof: [answer or UNANSWERED]
4. Outcome proof: [answer or UNANSWERED]
### Verdict
**APPROVED ✅** – All phases complete, evidence attached
OR
**BLOCKED ❌** – Missing: [list what's missing]
Supporting Files
- references/connection-patterns.md – Integration patterns by artifact type
Success Criteria
- No orphaned modules (verify_integration.sh passes)
- All imports verified with grep
- All call-sites verified
- Context-specific checks passed
- Four Questions answered
- Runtime proof attached