notion-clipper-skill
npx skills add https://github.com/ewingyangs/ewing-skills --skill notion-clipper-skill
Notion Clipper
Clip any web page to Notion. Uses Chrome CDP for full JavaScript rendering, converts to Markdown, then to Notion blocks.
Prerequisites
- Notion API Key: Create an integration at https://notion.so/my-integrations
- Store the key:
mkdir -p ~/.config/notion
echo "ntn_your_key_here" > ~/.config/notion/api_key
- Share the target database/page with your integration (click “…” → “Connect to” → your integration name)
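For reference, a minimal TypeScript sketch of how a script could load the key stored above (a hypothetical helper; the actual main.ts may resolve the key differently, e.g. via an environment variable):

```typescript
import { readFileSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

// Read the API key from ~/.config/notion/api_key and strip trailing whitespace.
function loadNotionApiKey(): string {
  const keyPath = join(homedir(), ".config", "notion", "api_key");
  return readFileSync(keyPath, "utf8").trim();
}
```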
First Time Setup
Dependencies are auto-installed when the script runs. No manual setup needed.
Agent Execution Instructions
CRITICAL: Always use the command pattern below. It auto-installs dependencies on first run.
- Determine this SKILL.md file’s directory path as `SKILL_DIR`
- Command pattern (package.json lives in `scripts/`; always run the lazy install first):
  (cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && npx -y tsx main.ts <args>)
- Replace `${SKILL_DIR}` with the actual path (e.g. /Users/xxx/.claude/skills/notion-clipper-skill)
Usage
IMPORTANT – Use this command pattern for best results:
# Recommended: Clear proxy env vars and use tsx runtime
(cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && unset http_proxy https_proxy all_proxy && npx -y tsx main.ts <url> --database-name "Resources")
Why this pattern?
- `unset http_proxy https_proxy all_proxy` – avoids ECONNREFUSED errors from proxy conflicts
- `tsx` runtime – a Node.js runtime that handles direct connections properly (bun has proxy issues)
- `(test -d node_modules || npm install)` – auto-installs dependencies if missing
If you encounter network issues:
- Close any VPN/proxy software
- Switch to a stable network (mobile hotspot often works)
- Use the recommended command pattern above
# Clip to a Notion database by NAME (recommended - searches for database)
(cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && npx -y tsx main.ts <url> --database-name "Resource")
# Clip to a Notion database by ID
(cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && npx -y tsx main.ts <url> --database <database_id>)
# Clip to an existing page (appends blocks)
(cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && npx -y tsx main.ts <url> --page <page_id>)
# List all accessible databases
(cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && npx -y tsx main.ts --list-databases)
# For pages requiring login (wait mode)
(cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && npx -y tsx main.ts <url> --database-name "Resource" --wait)
Options
| Option | Description |
|---|---|
| `<url>` | URL to clip |
| `--database-name, -n <name>` | Target database by name (searches for a match) |
| `--database, -d <id>` | Target Notion database by ID |
| `--page, -p <id>` | Target Notion page ID (appends blocks) |
| `--list-databases, -l` | List all accessible databases and exit |
| `--wait, -w` | Wait for a user signal before capturing |
| `--timeout, -t <ms>` | Page load timeout in milliseconds (default: 30000) |
| `--no-bookmark` | Don’t include a bookmark block at the top |
Capture Modes
| Mode | Behavior | Use When |
|---|---|---|
| Auto (default) | Capture on network idle | Public pages, static content |
| Wait (`--wait`) | User signals when ready | Login-required pages, lazy loading, paywalls |
Wait mode workflow:
- Run with `--wait` → Chrome opens and the script prints “Press Enter when ready”
- Log in or navigate as needed in the browser
- Press Enter in the terminal to trigger the capture
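Conceptually, the wait step is just a pause on stdin; a minimal sketch assuming a Node.js runtime (not necessarily how main.ts implements it):

```typescript
import * as readline from "node:readline/promises";
import { stdin, stdout } from "node:process";

// Block until the user presses Enter in the terminal, then continue with capture.
async function waitForEnter(prompt = "Press Enter when ready"): Promise<void> {
  const rl = readline.createInterface({ input: stdin, output: stdout });
  await rl.question(`${prompt} `);
  rl.close();
}
```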
Output Structure
When saving to a database, creates a new page with:
- Name: Page title
- URL: Source URL (if database has URL property)
- Content: Bookmark block + converted content blocks
When appending to a page, adds:
- Bookmark block (link to source)
- Divider
- Converted content blocks
Database Setup
For best results, create a Notion database with these properties:
- Name (Title) – Required, will contain page title
- URL (URL) – Optional, will contain source URL
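To make the mapping concrete, here is a hedged sketch of the page-creation call implied by the output structure above, using the official @notionhq/client SDK; the Name and URL property names match the database setup described here, block conversion is elided, and the exact payload in main.ts may differ:

```typescript
import { Client } from "@notionhq/client";

const notion = new Client({ auth: process.env.NOTION_API_KEY });

// Create a new database page: title + optional URL property,
// with a bookmark block followed by the converted content blocks.
async function createClippedPage(
  databaseId: string,
  title: string,
  sourceUrl: string,
  contentBlocks: any[]
) {
  return notion.pages.create({
    parent: { database_id: databaseId },
    properties: {
      Name: { title: [{ text: { content: title } }] },
      URL: { url: sourceUrl },
    },
    children: [
      { object: "block", type: "bookmark", bookmark: { url: sourceUrl } },
      ...contentBlocks,
    ],
  });
}
```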
Examples
Clip a tweet to “Resource” database (by name):
(cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && unset http_proxy https_proxy all_proxy && npx -y tsx main.ts "https://x.com/dotey/status/123456" -n "Resource")
List all databases first:
(cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && unset http_proxy https_proxy all_proxy && npx -y tsx main.ts --list-databases)
Clip article requiring login:
(cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && unset http_proxy https_proxy all_proxy && npx -y tsx main.ts "https://medium.com/article" -n "Reading" --wait)
Append to reading notes page:
(cd "${SKILL_DIR}/scripts" && (test -d node_modules || npm install) && unset http_proxy https_proxy all_proxy && npx -y tsx main.ts "https://blog.example.com/post" -p xyz789)
Quick alias (add to your ~/.bashrc or ~/.zshrc):
alias notion-clip='(cd "${SKILL_DIR}/scripts" && unset http_proxy https_proxy all_proxy && npx -y tsx main.ts)'
# Usage: notion-clip <url> -n "Resources"
How It Works
- Fetch: Launch Chrome via CDP, navigate to URL
- Render: Wait for JavaScript to execute, scroll to trigger lazy loading
- Extract: Run cleanup script to remove ads/nav, extract main content
- Convert: HTML → Markdown → Notion blocks
- Save: Call Notion API to create page or append blocks
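The pipeline can be sketched roughly as follows; the library choices (puppeteer-core for CDP, turndown for HTML → Markdown, @tryfabric/martian for Markdown → Notion blocks) are illustrative assumptions, not necessarily what main.ts actually uses:

```typescript
import puppeteer from "puppeteer-core";
import TurndownService from "turndown";
import { markdownToBlocks } from "@tryfabric/martian";

async function clip(url: string, chromePath: string) {
  // 1. Fetch: launch Chrome over CDP and navigate to the URL
  const browser = await puppeteer.launch({ executablePath: chromePath });
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle2", timeout: 30000 });

  // 2. Render: scroll to the bottom to trigger lazy-loaded content
  await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));

  // 3. Extract: grab the main content, falling back to <body>
  const html = await page.evaluate(
    () => (document.querySelector("article, main") ?? document.body).innerHTML
  );
  await browser.close();

  // 4. Convert: HTML → Markdown → Notion blocks
  const markdown = new TurndownService().turndown(html);
  return markdownToBlocks(markdown);
}
```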
Dependencies
- Chrome/Chromium browser (installed locally)
- Node.js (the script runs with `tsx`; Bun may route through a proxy and return an empty body, so use Node)
- Notion API key configured
(Other dependencies auto-install on first run.)
Environment Variables
| Variable | Description |
|---|---|
| `NOTION_CLIPPER_CHROME_PATH` | Custom Chrome executable path |
| `NOTION_CLIPPER_CHROME_PROFILE_DIR` | Custom Chrome profile directory |
| `https_proxy` / `HTTPS_PROXY` | Proxy for the Notion API (e.g. http://127.0.0.1:7890) |
| `http_proxy` / `HTTP_PROXY` | Same as above |
| `all_proxy` | Optional, e.g. socks5://127.0.0.1:7890 |
Example (proxy on port 7890):
export https_proxy=http://127.0.0.1:7890 http_proxy=http://127.0.0.1:7890 all_proxy=socks5://127.0.0.1:7890
Troubleshooting
Network Issues
| Error | Cause | Solution |
|---|---|---|
| `ECONNREFUSED 208.103.161.1:443` | DNS returns a blocked IP; proxy conflict | 1. Close VPN/proxy software 2. Run `unset http_proxy https_proxy all_proxy` 3. Switch networks (e.g., mobile hotspot) |
| `Notion API returned empty body (status 200)` | Using bun, which routes through the proxy incorrectly | Run with tsx: `npx -y tsx main.ts ...` (NOT bun) |
| `fetch failed` or `ECONNREFUSED` | Proxy env vars are set but Node.js https does not honor them | Either use a network without a proxy (unset the env vars) or ensure the proxy allows HTTPS traffic |
| CloudFlare 403 | Direct IP access triggers security protection | Use the hostname instead of the IP; ensure a proper Authorization header |
| Mixed: sometimes works, sometimes fails | Unstable network or DNS returning different IPs | The script retries up to 6 times with exponential backoff (1s, 2s, 4s, 4s, …) |
Best Practice: For reliable Notion API access, use a stable network (mobile hotspot often works better than corporate VPN).
Content Issues
| Error | Cause | Solution |
|---|---|---|
| `Invalid URL for link` | Notion API rejects non-http(s) URLs | The script removes all markdown links by default to avoid validation errors; content is preserved, only the links are stripped |
Note: The script automatically removes these invalid URL types:
- `javascript:`, `data:`, `file:`, `about:` protocols
- WeChat internal links (`weixin:`, `wx://`)
- Relative paths (`/path`, `./path`)
- Hash-only links (`#anchor`)
- Empty links
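As an illustration of that filtering rule, a hypothetical helper (not the exact code in main.ts) might look like:

```typescript
// Return true only for absolute http(s) URLs that Notion will accept as links.
function isNotionSafeUrl(url: string): boolean {
  const trimmed = url.trim();
  if (trimmed === "" || trimmed.startsWith("#")) return false;                  // empty or hash-only
  if (/^(javascript|data|file|about|weixin|wx):/i.test(trimmed)) return false;  // rejected protocols
  if (trimmed.startsWith("/") || trimmed.startsWith("./")) return false;        // relative paths
  try {
    const protocol = new URL(trimmed).protocol;
    return protocol === "http:" || protocol === "https:";
  } catch {
    return false;
  }
}
```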
General Issues
- Chrome not found: set the `NOTION_CLIPPER_CHROME_PATH` environment variable
- Timeout errors: increase the `--timeout` value or use `--wait` mode
- Content missing: try `--wait` mode for dynamic/lazy-loaded pages
- Notion API error (401/403): check API key validity and integration permissions
- Notion API error: ensure the integration has access to the target database/page
Code Optimizations Applied
The following optimizations have been implemented to handle unstable networks and invalid URLs:
- Auto-retry mechanism: up to 6 retries with exponential backoff (1s → 2s → 4s → 4s…)
- Increased timeout: 30s for Notion API requests (was 25s)
- URL cleaning: Removes invalid URLs before Notion API submission
- Using tsx: Node.js runtime that properly handles direct connections (unlike Bun)
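A minimal sketch of that retry policy (backoff capped at 4 seconds); the actual implementation in main.ts may differ in detail:

```typescript
// Retry an async operation up to maxAttempts times with exponential backoff:
// 1s, 2s, 4s, then 4s for every further wait.
async function withRetry<T>(fn: () => Promise<T>, maxAttempts = 6): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === maxAttempts - 1) break;              // no delay after the final attempt
      const delayMs = Math.min(1000 * 2 ** attempt, 4000); // 1s, 2s, 4s, 4s, ...
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}
```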