paper-digest
npx skills add https://github.com/iamseungpil/claude-for-dslab --skill paper-digest
Paper Digest
Single-paragraph summaries optimized for social sharing. Insight over information.
Structure
- Context: What’s the problem?
- Insight: What did they realize that others missed?
- Solution: How does insight → method? (should feel natural)
- Evidence: Concrete comparison showing it works
Then: Implication line + 🔗 arXiv link
Key Rules
- Explain like reader is smart but unfamiliar with the domain
- Use concrete examples/analogies (e.g., "acts as a trash can" >> "concentrates on a specific token")
- Show cause-and-effect chains explicitly
- Compare/contrast with alternatives (“X failed while Y succeeded”)
- Bold 2-4 key concepts
- Match user’s language (Korean/English)
Example
Input: arXiv 2601.15380
Output:
A Transformer's attention decides "which tokens to look at, and how much." This paper reinterprets softmax attention as the solution to an optimization problem called **Entropic Optimal Transport (EOT)**. The key insight from this view: attention computation hides an implicit uniform prior that says "every position is equally important." Why is that a problem? LLMs exhibit the attention sink phenomenon, where the first token receives enormous attention regardless of its meaning. Because softmax must output probabilities that sum to 1, a query with no token worth attending to needs somewhere to "dump" attention; under a uniform prior, the only way to implement this is for the first token's key vector to also carry the structural message "I am the trash can", wasting representational capacity that should express semantic content alone. Since the EOT interpretation exposes the problem, the fix feels natural: replace the uniform prior with a learnable one. The paper's proposed GOAT separates "each position's baseline importance" into a standalone learnable term, so key vectors carry only semantics while the prior handles positional structure. In experiments, existing methods fail sharply beyond the training length, while GOAT maintains information-retrieval performance even at long contexts.
Implication: The EOT view exposes attention's hidden assumption and opens the design freedom to change it: attention sink is a byproduct of the uniform prior, and modeling the prior explicitly resolves it.
🔗 https://arxiv.org/abs/2601.15380
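
The sketch below illustrates only the "learnable prior" idea from the example above; it is not the paper's actual GOAT implementation, which this digest does not specify. It uses the identity softmax(z + log q) ∝ q · exp(z), so a non-uniform prior q can be injected as an additive log-bias on the attention logits. The function names (`prior_attention`, `log_prior`) and shapes are illustrative assumptions.

```python
# Minimal sketch (assumed formulation, not GOAT itself): standard softmax
# attention vs. attention with a learnable per-position log-prior.
import torch
import torch.nn.functional as F

def softmax_attention(q, k, v):
    # q, k, v: (seq_len, d). The uniform prior is implicit: every key
    # position starts with the same baseline weight before normalization.
    scores = q @ k.T / k.shape[-1] ** 0.5
    return F.softmax(scores, dim=-1) @ v

def prior_attention(q, k, v, log_prior):
    # log_prior: (seq_len,) learnable baseline importance per key position.
    # Keys no longer have to encode "dump attention here"; the prior can.
    scores = q @ k.T / k.shape[-1] ** 0.5 + log_prior  # broadcast over queries
    return F.softmax(scores, dim=-1) @ v

seq_len, d = 8, 16
q, k, v = (torch.randn(seq_len, d) for _ in range(3))
log_prior = torch.zeros(seq_len, requires_grad=True)  # trained with the model
out = prior_attention(q, k, v, log_prior)
print(out.shape)  # torch.Size([8, 16])
```

With `log_prior` fixed at zero this reduces to standard softmax attention (the implicit uniform prior); letting it train gives each position a baseline importance that the key vectors no longer have to encode.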
Avoid
- Jargon without intuition
- Findings without comparison to alternatives
- Method description without motivation (the "how" without the "why")
Multiple Papers
When summarizing multiple papers:
- Lead with the unifying theme/problem
- Contrast what each paper realized differently
- Synthesize implications across papers
Language
Match the user’s language (Korean/English). Maintain the same insight-first structure regardless of language.