api-docs
```shell
npx skills add https://github.com/leechael/fastapi-api-docs-skills --skill api-docs
```
API Documentation Completion Skill
Analyze FastAPI endpoints and complete their documentation for OpenAPI generation.
IMPORTANT: Before writing any documentation text, read and internalize the anti-slop guidelines in references/anti-slop.md. All copywriting must follow those rules.
Invocation
Single File
```
/api-docs @path/to/routes/file.py
```
Batch Mode
```
/api-docs @teehouse/api/routes/          # All files in directory
/api-docs all api routes                 # Natural language (finds all route files)
/api-docs @file1.py @file2.py @file3.py  # Multiple specific files
```
When batch mode is detected:
- Use Glob to find all matching `*.py` files
- Filter to files containing `APIRouter` or `@router`
- Process each file sequentially, showing progress
- Summarize changes at the end
Analysis Workflow
Phase 1: Endpoint Discovery
- Read the target file completely
- Identify all route handlers decorated with `@router.{method}(...)`:
  - Extract HTTP method (GET, POST, PUT, PATCH, DELETE)
  - Extract path pattern
  - Extract existing response_model if present
  - Extract existing status_code if present
  - Extract existing summary/description if present
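As an illustration, the discovery step in Phase 1 could be sketched with the standard-library `ast` module. This is a minimal sketch, not the skill's actual implementation; the sample route and the returned fields are illustrative.

```python
import ast

HTTP_METHODS = {"get", "post", "put", "patch", "delete"}

def discover_endpoints(source: str) -> list[dict]:
    """Find @router.{method}(...) handlers and extract their metadata."""
    endpoints = []
    for node in ast.walk(ast.parse(source)):
        if not isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            continue
        for dec in node.decorator_list:
            # Match decorators of the form @router.get("/path", ...)
            if (isinstance(dec, ast.Call)
                    and isinstance(dec.func, ast.Attribute)
                    and dec.func.attr in HTTP_METHODS
                    and isinstance(dec.func.value, ast.Name)
                    and dec.func.value.id == "router"):
                kwargs = {kw.arg for kw in dec.keywords}
                endpoints.append({
                    "handler": node.name,
                    "method": dec.func.attr.upper(),
                    "path": dec.args[0].value if dec.args else None,
                    "has_response_model": "response_model" in kwargs,
                    "has_summary": "summary" in kwargs,
                })
    return endpoints

SAMPLE = '''
@router.get("/kms/{kms_id}", response_model=KmsOut)
async def get_kms(kms_id: str):
    ...
'''
print(discover_endpoints(SAMPLE))
```

Because this parses rather than imports the module, undefined names like `KmsOut` in the sample are harmless.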
Phase 2: Deep Call Stack Analysis
For EACH endpoint, trace the complete execution flow:
2.1 Dependency Analysis (`Depends(...)`)
For each `Depends(...)` in the handler signature:
1. Locate the dependency function. It may be:
   - A local function in the same file
   - Imported from `...auth`, `...core.database`, etc.
   - A repository/service class method
2. Trace what it provides:
   - Database sessions
   - Authentication
   - Request data validation (Pydantic models)
   - Domain objects
3. Document authentication requirements:
   - Does it require `get_current_user`? → User auth required
4. Track possible HTTP errors from dependencies:
   - Auth failures: 401, 403
   - Resource not found: 404
   - Validation errors: 422
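A typical chain the analysis has to follow looks like the sketch below. The names `get_db` and `get_current_user` follow common FastAPI conventions but are illustrative here, and a stand-in `HTTPException` class replaces FastAPI's so the sketch is self-contained.

```python
# Stand-in for fastapi.HTTPException so this runs without FastAPI installed.
class HTTPException(Exception):
    def __init__(self, status_code: int, detail: str):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail

def get_db():
    # Provides: a database session. Raises: nothing directly.
    return {"session": "fake-db"}

def get_current_user(token):
    # Provides: the authenticated user. Raises: 401 on missing/invalid token.
    if token is None:
        raise HTTPException(status_code=401, detail="Not authenticated")
    return {"id": 1, "token": token}

def handler(token):
    # Tracing this chain yields: DB session provided, auth required,
    # 401 possible from the dependency before the body ever runs.
    db = get_db()
    user = get_current_user(token)
    return {"user_id": user["id"], "db": db["session"]}
```

Documenting the 401 on the endpoint therefore requires reading `get_current_user`, not just the handler body.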
2.2 Handler Body Analysis
Trace the handler's main logic:
1. Follow function calls. For each called function:
   - Read its implementation
   - Track what errors it raises (`raise HTTPException`, `raise ValueError`, etc.)
   - Track what Result types it returns (`is_err(...)`)
   - Recurse into its dependencies
2. Database operations:
   - What queries are executed?
   - What constraints might fail? (unique violations, foreign keys)
   - What models are created/updated/deleted?
3. External service calls:
   - Redis operations
   - Celery task dispatching
   - Third-party API calls
4. Error propagation:
   - Explicit `HTTPException` raises with status codes
   - ValueError/Exception catches that convert to HTTP errors
   - Result unwrapping that may surface errors
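The explicit-raise case can be collected mechanically. A minimal sketch using the standard-library `ast` module; note it only catches literal `status_code=` keyword values, not `status.HTTP_*` constants or positional arguments.

```python
import ast

def collect_status_codes(source: str) -> set[int]:
    """Collect status codes from explicit `raise HTTPException(...)` calls."""
    codes = set()
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Raise)
                and isinstance(node.exc, ast.Call)
                and isinstance(node.exc.func, ast.Name)
                and node.exc.func.id == "HTTPException"):
            for kw in node.exc.keywords:
                if kw.arg == "status_code" and isinstance(kw.value, ast.Constant):
                    codes.add(kw.value.value)
    return codes

SAMPLE = '''
def update(item):
    if item is None:
        raise HTTPException(status_code=404, detail="Not found")
    try:
        validate(item)
    except ValueError as err:
        raise HTTPException(status_code=422, detail=str(err))
'''
print(sorted(collect_status_codes(SAMPLE)))  # [404, 422]
```

The collected codes map directly into the `responses` dict and the docstring's `Raises:` section.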
Phase 3: Documentation Generation
Read references/anti-slop.md before writing any text.
For each endpoint, generate/complete:
3.1 Function Docstring
```python
async def handler(...):
    """One-line summary. No "This endpoint..." prefix.

    What it does and when to use it. State facts directly.
    No hedging ("may", "might", "potentially").

    Args:
        param_name: What it is. Include format/constraints.

    Returns:
        What the response contains.

    Raises:
        HTTPException(400): When X happens.
        HTTPException(401): Invalid or missing credentials.
        HTTPException(404): Resource not found.
    """
```
3.2 OpenAPI Metadata
```python
@router.post(
    "/path",
    response_model=ResponseModel,
    status_code=status.HTTP_201_CREATED,
    summary="Verb object",  # ≤6 words, imperative
    description="What it does. When to use it.",  # Direct, no filler
    responses={
        400: {"description": "Invalid input"},
        401: {"description": "Not authenticated"},
        404: {"description": "Resource not found"},
    },
    tags=["Category"],
)
```
3.3 Parameter Documentation
```python
# Path/Query parameters
app_id: str = Path(
    ...,
    description="40-character hex string",  # What it is, not what it "represents"
)

# Body parameters
payload: SomeModel = Body(..., description="Configuration to apply")

# Pydantic fields
class RequestModel(BaseModel):
    field_name: str = Field(..., description="The X value", min_length=1)
```
Writing Rules (from references/anti-slop.md)
Summary: ≤6 words, imperative verb
| Bad | Good |
|---|---|
| “This endpoint retrieves…” | “List KMS instances” |
| “Allows users to get…” | “Get encryption key” |
| “Provides functionality for…” | “Update compose file” |
Description: Direct statements
| Bad | Good |
|---|---|
| “This powerful endpoint enables seamless…” | “Encrypt environment variables for deployment.” |
| “It’s important to note that…” | (delete, state the fact) |
| “The identifier serves as…” | “Identifies the KMS by ID or slug.” |
Banned patterns
- "serves as", "acts as", "stands as" → use "is"
- "in order to" → "to"
- "It's worth noting" → (delete)
- "comprehensive", "robust", "seamless" → (delete or use plain words)
- Rule of three → pick two or one
- Em-dash reveals → rewrite as a normal sentence
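Most of these patterns can be caught mechanically before a human review pass. A hypothetical linter sketch; the pattern table mirrors the rules above but is not part of the skill itself.

```python
import re

# Each entry: banned pattern -> suggested replacement (None = delete the phrase).
BANNED = {
    r"\bserves as\b": "is",
    r"\bacts as\b": "is",
    r"\bstands as\b": "is",
    r"\bin order to\b": "to",
    r"\bit'?s worth noting\b": None,
    r"\bit'?s important to note\b": None,
    r"\bcomprehensive\b": None,
    r"\brobust\b": None,
    r"\bseamless\b": None,
}

def lint_description(text: str) -> list[str]:
    """Return the banned patterns found in a summary/description string."""
    hits = []
    for pattern, fix in BANNED.items():
        if re.search(pattern, text, flags=re.IGNORECASE):
            suggestion = f' (use "{fix}")' if fix else " (delete)"
            hits.append(pattern + suggestion)
    return hits

print(lint_description("This serves as a robust way in order to encrypt."))
```

A clean description such as "Encrypt environment variables for deployment." produces no hits.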
Output Format
Single File
- Summary table of endpoints found
- Per-endpoint analysis with call trace
- Apply edits directly
Batch Mode
```text
Processing 5 files...

[1/5] project/api/routes/kms.py
  - 4 endpoints analyzed
  - 12 fields documented

[2/5] project/api/routes/cvms/tools.py
  - 1 endpoint analyzed
  - 8 fields documented

...

Summary:
  Files processed: 5
  Endpoints documented: 23
  Fields added: 67
```
Tracing Depth Limits
- Maximum call depth: 5 levels
- Stop tracing at:
- SQLAlchemy core methods
- Redis primitive operations
- Celery task submission (note as async side effect)
- Standard library functions
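The depth limit and stop set combine into a bounded traversal. A sketch over a hypothetical call graph (all function names here are invented for illustration):

```python
# Hypothetical call graph: function name -> functions it calls.
CALL_GRAPH = {
    "handler": ["get_current_user", "create_item"],
    "create_item": ["db_insert", "dispatch_task"],
    "db_insert": ["sqlalchemy_execute"],    # stops: SQLAlchemy core method
    "dispatch_task": ["celery_send_task"],  # stops: Celery task submission
}
STOP_AT = {"sqlalchemy_execute", "celery_send_task", "redis_get"}
MAX_DEPTH = 5

def trace(fn: str, depth: int = 0, seen=None) -> list[str]:
    """Depth-limited DFS over the call graph, honoring the stop set."""
    seen = seen if seen is not None else set()
    if depth > MAX_DEPTH or fn in STOP_AT or fn in seen:
        return []
    seen.add(fn)
    visited = [fn]
    for callee in CALL_GRAPH.get(fn, []):
        visited += trace(callee, depth + 1, seen)
    return visited

print(trace("handler"))
```

The `seen` set also guards against cycles, which matters when repositories call back into shared helpers.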
Key Patterns to Recognize
Error Patterns
```python
# Direct raise
raise HTTPException(status_code=400, detail="message")

# Result type
if is_err(result):
    raise HTTPException(...)

# Exception conversion
except ValueError as err:
    raise HTTPException(status_code=422, detail=str(err))
```
Auth Patterns
```python
current_user: User = Depends(get_current_user)  # → 401
```
Quality Checklist
Before completing each file:
- All endpoints have summary (≤6 words)
- All endpoints have responses dict
- All Depends traced for auth/errors
- No “serves as” / “acts as” (use “is”)
- No “in order to” (use “to”)
- No “It’s important/worth noting”
- No rule-of-three lists
- Each description adds information beyond the name
- Error descriptions state what happened, not what “may occur”