solo-humanize
npx skills add https://github.com/fortunto2/solo-factory --skill solo-humanize
/humanize
Strip AI writing patterns from user-facing text. Takes a file or pasted text and rewrites it to read like a human wrote it, without losing meaning or structure.
Why this exists
LLM output has recognizable tells: em dashes, stock phrases, promotional inflation, performed authenticity. Readers (and Google) notice. This skill catches those patterns and rewrites them.
When to use
- After `/content-gen`, `/landing-gen`, or `/video-promo`, to polish the output
- Before publishing any user-facing prose (blog posts, landing pages, emails)
- When editing CLAUDE.md or docs that will be read by humans
- Standalone: `/humanize path/to/file.md`
Input
- File path from `$ARGUMENTS`: reads and rewrites the file in place
- No argument: asks you to paste text, outputs the cleaned version
- Works on `.md`, `.txt`, and text content in `.tsx`/`.html` (string literals only)
Pattern Catalog
1. Em Dash Overuse (—)
The most obvious AI tell. Replace with commas, periods, colons, or restructure the sentence.
| Before | After |
|---|---|
| “The tool — which is free — works great” | “The tool (which is free) works great” |
| “Three features — speed, security, simplicity” | “Three features: speed, security, simplicity” |
| “We built this — and it changed everything” | “We built this. It changed everything.” |
Rule: Max 1 em dash per 500 words. Zero is better.
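The rule above can be sketched as a lint check. This is a minimal sketch, assuming Python; the one-dash allowance for texts under 500 words is my reading of "max 1 per 500 words", not something the rule states explicitly:

```python
def em_dash_ok(text: str) -> bool:
    """Check the rule: at most one em dash per 500 words (zero is better)."""
    dashes = text.count("\u2014")  # U+2014 EM DASH
    # Allow one dash per full 500-word chunk, with one dash of grace
    # for short texts (interpretation, see lead-in).
    allowed = max(1, len(text.split()) // 500)
    return dashes <= allowed
```

A stricter version could simply return `dashes == 0`, which the rule says is better anyway.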
2. Stock Phrases
Phrases that signal “AI wrote this.” Remove or replace with specific language.
Filler phrases (delete entirely):
- “it’s worth noting that” → (just state the thing)
- “at the end of the day” → (cut)
- “in today’s world” / “in the modern landscape” → (cut)
- “without further ado” → (cut)
- “let’s dive in” / “let’s explore” → (cut)
Promotional inflation (replace with specifics):
- “game-changer” → what specifically changed?
- “revolutionary” → what’s actually new?
- “cutting-edge” → describe the technology
- “seamless” → “works without configuration” (or whatever it actually does)
- “leverage” → “use”
- “robust” → “handles X edge cases” (specific)
- “streamline” → “cut steps from N to M”
- “empower” → what can the user now do?
- “unlock” → what’s the actual capability?
Performed authenticity (rewrite):
- “to be honest” → (if you need to say this, the rest wasn’t honest?)
- “let me be frank” → (just be frank)
- “I have to say” → (just say it)
- “honestly” → (cut)
- “the truth is” → (cut, state the truth directly)
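The delete-entirely class lends itself to a mechanical pass. A minimal sketch, assuming Python; the phrase list is a subset of the catalog above, and a real rewrite also needs to fix the capitalization and stray punctuation this leaves behind:

```python
import re

# Subset of the filler-phrase catalog above (illustrative, not exhaustive).
FILLERS = [
    "it's worth noting that",
    "at the end of the day",
    "in today's world",
    "without further ado",
    "let's dive in",
]

def strip_fillers(text: str) -> str:
    """Delete filler phrases case-insensitively; collapse leftover spaces."""
    for phrase in FILLERS:
        text = re.sub(re.escape(phrase), "", text, flags=re.IGNORECASE)
    return re.sub(r"  +", " ", text).strip()
```

Note the limitation: `strip_fillers("It's worth noting that the build passed.")` yields `"the build passed."` with a lowercase start, which a human pass would recapitalize.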
3. Rule of Three
AI loves triplets: “fast, secure, and scalable.” Real writing varies list length.
| Before | After |
|---|---|
| “Fast, secure, and scalable” | “Fast and secure” (if scalable isn’t proven) |
| “Build, deploy, and iterate” | “Build and ship” (if that’s what you mean) |
| Three bullet points that all say the same thing | One clear bullet |
Rule: If you find 3+ triplet lists in one document, break at least half of them.
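Triplets of the "X, Y, and Z" shape can be counted with a rough regex to see whether the 3+ threshold is hit. A sketch, assuming Python; it only catches single-word items, so multi-word triplets slip through:

```python
import re

# "X, Y, and Z" or "X, Y and Z", single-word items only.
TRIPLET = re.compile(r"\b(\w+), (\w+),? and (\w+)\b")

def count_triplets(text: str) -> int:
    """Count 'X, Y, and Z' style lists in the text."""
    return len(TRIPLET.findall(text))
```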
4. Structural Patterns
Every section has the same shape: AI tends to write heading → one-sentence intro → 3 bullets → transition sentence. Real writing varies section length and structure.
Hedging sandwich: “While X has limitations, it offers Y, making it Z.” → Pick a side. State it.
False balance: “On one hand X, on the other hand Y.” → If one side is clearly better, say so.
5. Sycophantic Openers
- “Great question!” → (cut)
- “That’s a fantastic idea!” → (cut, or say what’s specifically good about it)
- “Absolutely!” → (cut if not genuine agreement)
- “I’d be happy to help!” → (just help)
6. Passive Voice / Weak Verbs
- “It should be noted that” → (cut, just note it)
- “There are several factors that” → name the factors
- “It is important to” → say why
- “This can be achieved by” → “Do X”
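These weak openers can be flagged the same way as stock phrases. A sketch, assuming Python; the pattern list is illustrative and covers only the examples above:

```python
import re

# Weak-verb openers from the catalog above (illustrative subset).
WEAK_OPENERS = re.compile(
    r"\b(it should be noted that|there are several \w+ that|"
    r"it is important to|this can be achieved by)\b",
    re.IGNORECASE,
)

def count_weak_openers(text: str) -> int:
    """Count weak-verb/passive openers in the text."""
    return len(WEAK_OPENERS.findall(text))
```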
Process
1. Read the input: file path or pasted text.
2. Scan for patterns: check each category above and count violations per category.
3. Rewrite: fix each violation while preserving:
   - Technical accuracy (don’t change code, commands, or technical terms)
   - Structure (headings, lists, code blocks stay)
   - Tone intent (if the original was casual, keep it casual)
   - Length (aim for same or shorter, never longer)
4. Report what changed:

   ```
   Humanized: {file or "pasted text"}
   Changes:
     Em dashes: {N} removed
     Stock phrases: {N} replaced
     Inflation: {N} deflated
     Triplets: {N} broken
     Sycophancy: {N} cut
   Total: {N} patterns fixed
   Before: {word count}
   After: {word count}
   ```

5. If file path: write the cleaned version back and show a diff summary. If pasted text: output the cleaned version directly.
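The "Report what changed" step can be assembled from per-category counts. A minimal sketch, assuming Python; the function name and `counts` dict shape are mine, while the field labels mirror the report template above:

```python
def format_report(source: str, counts: dict, before: int, after: int) -> str:
    """Render the change report from per-category violation counts."""
    lines = [f"Humanized: {source}", "Changes:"]
    for category, n in counts.items():
        lines.append(f"  {category}: {n}")
    lines.append(f"Total: {sum(counts.values())} patterns fixed")
    lines.append(f"Before: {before} words, After: {after} words")
    return "\n".join(lines)
```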
What NOT to change
- Code blocks and inline code
- Technical terms, library names, CLI commands
- Quotes from other people (attributed quotes stay verbatim)
- Numbers, dates, URLs
- Heading structure (don’t merge or split sections)
- Content meaning: only rephrase, never add or remove ideas
Edge Cases
- Short text (<50 words): apply the stock phrase filter only, skip structural analysis
- Already clean: report “No AI patterns found. Text looks human.”
- Code-heavy docs: skip code blocks entirely, only process prose sections
- Non-English text: apply em dash and structural rules (they’re universal), skip English stock phrases
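The code-heavy-docs rule (skip code blocks, process only prose) can be sketched by splitting on fences. Assuming Python and triple-backtick fences only; indented code blocks and `~~~` fences would need extra handling:

```python
import re

# Match a fenced code block: three backticks ... three backticks.
FENCE = re.compile(r"`{3}.*?`{3}", re.DOTALL)

def split_prose_and_code(markdown: str):
    """Return (prose_segments, code_blocks); only prose gets humanized."""
    code_blocks = FENCE.findall(markdown)
    prose = [seg for seg in FENCE.split(markdown) if seg.strip()]
    return prose, code_blocks
```

After humanizing the prose segments, the document is reassembled with the code blocks untouched.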