typst-paper
npx skills add https://github.com/bahayonghang/my-claude-code-settings --skill typst-paper
Agent Installation
Skill Documentation
Typst Academic Paper Assistant
Core Principles
- Never modify content inside @cite, @ref, @label, or math environments
- Never fabricate bibliography entries
- Never change technical terminology without explicit permission
- Always output revision suggestions as comments
- Typst compiles in milliseconds, making it well suited to live preview
Argument Conventions ($ARGUMENTS)
$ARGUMENTS receives the key information: the main .typ path, target sections, and module selection.
- If $ARGUMENTS is missing or ambiguous, ask first for: the main .typ path, the target scope, and the intended module.
- Pre-check paths for existence; never guess or fill in a path the user did not provide.
Execution Constraints
- Run scripts or compile commands only when the user explicitly asks.
- Confirm before any operation that cleans or overwrites output files.
Unified Output Protocol (all modules)
Every suggestion must include these fixed fields:
- Severity: Critical / Major / Minor
- Priority: P0 (blocking) / P1 (important) / P2 (improvable)
Default comment template (diff-comment style):
// <Module> (line <N>) [Severity: <Critical|Major|Minor>] [Priority: <P0|P1|P2>]: <issue summary>
// Original: ...
// Revised: ...
// Rationale: ...
// ⚠️ [Needs evidence]: <flag when supporting evidence or data is required>
Failure Handling (global)
When a tool or script cannot run, output a comment block stating the cause and a suggestion:
// ERROR [Severity: Critical] [Priority: P0]: <brief error>
// Cause: <missing tool or invalid path>
// Suggestion: <install the tool / verify the path / retry the command>
Common cases:
- Script missing: confirm the scripts/ path against the working directory
- Typst not installed: suggest `cargo install typst-cli` or a package manager
- Fonts missing: list available fonts with `typst fonts`
- File missing: ask the user for the correct .typ path
- Compilation failure: locate the first error and request the relevant log fragment
Modules (independently invokable)
Module: Compile
Trigger words: compile, 编译, build, typst compile, typst watch
Typst compile commands:
| Command | Purpose | Notes |
|---|---|---|
| `typst compile main.typ` | One-shot compile | Produces a PDF |
| `typst watch main.typ` | Watch mode | Recompiles automatically on file changes |
| `typst compile main.typ output.pdf` | Named output | Custom output file name |
| `typst compile --format png main.typ` | Other formats | PNG, SVG, and more |
| `typst fonts` | Font list | Shows fonts available on the system |
Usage examples:
```shell
# Basic compile (recommended)
typst compile main.typ
# Watch mode (live preview)
typst watch main.typ
# Write to a specific path (the output file is a positional argument)
typst compile main.typ build/paper.pdf
# Export as PNG (for previews)
typst compile --format png main.typ
# List available fonts
typst fonts
# Use a custom font directory
typst compile --font-path ./fonts main.typ
```
Compilation speed advantages:
- Typst typically compiles in milliseconds (versus seconds for LaTeX)
- Incremental compilation: only the changed parts are recompiled
- Well suited to live preview and fast iteration
Chinese-language support:
```typst
// Example CJK font configuration
#set text(
  font: ("Source Han Serif", "Noto Serif CJK SC"),
  lang: "zh",
  region: "cn",
)
```
Module: Format Check
Trigger words: format, 格式检查, lint, style check
Check items:
| Category | What to check | Standard |
|---|---|---|
| Margins | Top/bottom/left/right margins | Typically 1 inch (2.54 cm) |
| Line spacing | Single/double spacing | Per journal requirements |
| Fonts | Body font and size | Times New Roman 10-12pt |
| Headings | Formatting at each level | Clear hierarchy, correct numbering |
| Figures and tables | Caption position and format | Captions below figures, above tables; continuous numbering |
| Citations | Citation format consistency | Numeric or author-year style |
Key Typst settings to check:
```typst
// Page setup
#set page(
  paper: "a4", // or "us-letter"
  margin: (x: 2.5cm, y: 2.5cm),
)
// Text setup
#set text(
  font: "Times New Roman",
  size: 11pt,
  lang: "en",
)
// Paragraph setup
#set par(
  justify: true,
  leading: 0.65em,
  first-line-indent: 1.5em,
)
// Heading setup
#set heading(numbering: "1.1")
```
Common formatting problems:
- ❌ Inconsistent margins
- ❌ Mixed fonts (CJK and Latin fonts not separated)
- ❌ Non-continuous figure/table numbering
- ❌ Inconsistent citation formats
Module: Grammar Analysis (English)
Trigger words: grammar, 语法, proofread, 润色, article usage
Focus areas:
- Subject-verb agreement
- Article usage (a/an/the)
- Tense consistency (methods in past tense, conclusions in present tense)
- Chinglish detection
Output format:
// GRAMMAR (line 23) [Severity: Major] [Priority: P1]: missing article
// Original: We propose method for...
// Revised: We propose a method for...
// Rationale: a singular countable noun requires an indefinite article
Common grammar errors:
| Error type | Example | Correction |
|---|---|---|
| Missing article | propose method | propose a method |
| Subject-verb disagreement | The data shows | The data show |
| Tense confusion | We proposed… The results shows | We proposed… The results show |
| Chinglish | more and more | increasingly |
Module: Long Sentence Analysis
Trigger words: long sentence, 长句, simplify, decompose, 拆解
Trigger conditions:
- English: sentence >50 words or >3 subordinate clauses
- Chinese: sentence >60 characters or >3 clauses
Output format:
// LONG SENTENCE (line 45, 67 words) [Severity: Minor] [Priority: P2]
// Core: [subject + verb + object]
// Modifiers:
// - [relative clause] which...
// - [purpose adverbial] to...
// Suggested split: [simplified version]
Splitting strategy:
- Identify the core structure
- Extract the modifiers
- Split into several short sentences
- Preserve logical coherence
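The length thresholds above can be sketched as a simple detector. This is a minimal illustration rather than the skill's actual implementation: clause counting is approximated with punctuation and a handful of common subordinators, not a real parse.

```python
import re

def flag_long_sentence(sentence: str, lang: str = "en") -> bool:
    """Flag a sentence that exceeds the module's thresholds.

    English: >50 words or >3 subordinate clauses (approximated by
    counting common subordinators). Chinese: >60 CJK characters or
    >3 clauses (approximated by splitting on 、 ， ；).
    """
    if lang == "en":
        words = len(sentence.split())
        clauses = len(re.findall(r"\b(?:which|that|because|although|while)\b",
                                 sentence))
        return words > 50 or clauses > 3
    chars = len(re.findall(r"[\u4e00-\u9fff]", sentence))
    clauses = len(re.split(r"[，；、]", sentence)) - 1
    return chars > 60 or clauses > 3

print(flag_long_sentence("We propose a method."))       # False
print(flag_long_sentence(" ".join(["word"] * 51)))      # True
```

A real implementation would use a sentence splitter and a dependency parse for clause counts; the heuristic version is only meant to show where the thresholds plug in.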
Module: Academic Expression
Trigger words: academic tone, 学术表达, improve writing, weak verbs
English academic expression:
| ❌ Weak verb | ✅ Academic alternative |
|---|---|
| use | employ, utilize, leverage |
| get | obtain, achieve, acquire |
| make | construct, develop, generate |
| show | demonstrate, illustrate, indicate |
Chinese academic expression:
| ❌ Colloquial | ✅ Academic |
|---|---|
| 很多研究表明 ("many studies show") | 大量研究表明 ("a large body of research shows") |
| 效果很好 ("works very well") | 具有显著优势 ("offers clear advantages") |
| 我们使用 ("we use") | 本文采用 ("this paper adopts") |
| 可以看出 ("we can see") | 由此可见 ("it follows that") |
Usage: the user provides the paragraph source; the Agent analyzes it and returns a polished version plus a comparison table.
Output format (Markdown comparison table):
| Original | Revised | Issue Type | Rationale |
|---|---|---|---|
| We use machine learning to get better results. | We employ machine learning to achieve superior performance. | Weak verbs | Replace "use" → "employ" and "get" → "achieve" for academic tone |
| 我们使用了深度学习方法。 | 本文采用深度学习方法进行特征提取。 | Colloquial wording | "我们使用" → "本文采用" (academic convention); state the method's purpose |
Alternative format (in-source comments):
// EXPRESSION (line 23) [Severity: Minor] [Priority: P2]: improve academic tone
// Original: We use machine learning to get better results.
// Revised: We employ machine learning to achieve superior performance.
// Rationale: replace weak verbs with academic alternatives
Module: Logical Coherence and Methodological Depth
Trigger words: logic, coherence, 逻辑, 衔接, methodology, 方法论, 论证, argument
Goal: ensure logical flow between paragraphs and strengthen methodological rigor.
Focus areas:
1. Paragraph-level coherence (the AXES model):
| Component | Description | Example |
|---|---|---|
| Assertion | A clear topic sentence stating the core claim | "The attention mechanism improves sequence modeling." |
| eXample | Concrete evidence or data supporting the claim | "In our experiments, the attention mechanism reached 95% accuracy." |
| Explanation | Analysis of why the evidence supports the claim | "This gain stems from its ability to capture long-range dependencies." |
| Significance | The link to the broader argument or the next paragraph | "This finding motivates the architecture design in this paper." |
2. Transition signal words:
| Relation | Chinese signals | English equivalents |
|---|---|---|
| Addition | 此外、进一步、更重要的是 | furthermore, moreover |
| Contrast | 然而、但是、相反 | however, nevertheless |
| Causation | 因此、由此可见、故而 | therefore, consequently |
| Sequence | 首先、随后、最后 | first, subsequently, finally |
| Exemplification | 例如、具体而言、特别是 | for instance, specifically |
3. Methodology depth checklist:
- Every claim is backed by evidence (data, citations, or logical reasoning)
- Method choices are justified (why this method rather than alternatives?)
- Research limitations are explicitly acknowledged
- Assumptions are stated clearly
- Reproducibility details are sufficient (parameters, datasets, evaluation metrics)
4. Common problems:
| Problem | Symptom | Fix |
|---|---|---|
| Logical gap | Paragraphs lack connective tissue | Add transition sentences stating the relationship |
| Unsupported claim | Assertion lacks evidence | Add citations, data, or reasoning |
| Shallow methodology | "This paper adopts X" with no rationale | Explain why X suits the problem |
| Implicit assumption | Preconditions left unstated | State the assumptions explicitly |
Output format:
// LOGIC (line 45) [Severity: Major] [Priority: P1]: logical gap between paragraphs
// Issue: jumps straight from the problem description to the solution with no transition
// Original: The data contain noise. This paper proposes a filtering method.
// Revised: The data contain noise, which interferes with downstream analysis. Therefore, this paper proposes a filtering method to address this issue.
// Rationale: add a causal transition connecting the problem to the solution
// METHODOLOGY (line 78) [Severity: Major] [Priority: P1]: method choice lacks justification
// Issue: the method is chosen without stating a reason
// Original: This paper adopts ResNet as the backbone network.
// Revised: This paper adopts ResNet as the backbone network; its residual connections effectively mitigate vanishing gradients, and it performs strongly on feature-extraction tasks.
// Rationale: justify the architecture choice with its technical properties
Per-section guidance:
| Section | Coherence focus | Methodology focus |
|---|---|---|
| Abstract | Smooth purpose → method → results → conclusion flow | Highlight the core contribution |
| Introduction | Smooth problem → gap → contribution flow | Argue the significance of the work |
| Related Work | Group by topic with explicit comparisons | Position the work against prior art |
| Methods | Logical continuity between steps | Justify every design choice |
| Experiments | Setup → results → analysis flow | Explain the choice of evaluation metrics |
| Discussion | Findings → implications → limitations flow | Acknowledge the boundaries of the work |
Best practices (after Elsevier and Proof-Reading-Service guidance):
- One topic per paragraph: each paragraph focuses on a single core point
- Topic sentence first: state the paragraph's claim in its opening sentence
- Complete evidence chains: every claim needs support (data, citations, or logic)
- Explicit transitions: use signal words to mark paragraph relationships
- Argue rather than describe: explain "why", not merely "what"
Module: Translation (Chinese to English)
Trigger words: translate, 翻译, 中译英, Chinese to English
Translation workflow:
Step 1: Domain identification. Pin down the field's terminology:
- Deep learning: neural networks, attention, loss functions
- Time series: forecasting, ARIMA, temporal patterns
- Industrial control: PID, fault detection, SCADA
Step 2: Terminology confirmation
| 中文 | English | Domain |
|---|---|---|
| 注意力机制 | attention mechanism | DL |
| 时间序列预测 | time series forecasting | TS |
Step 3: Translate and annotate
// Original: 本文提出了一种基于Transformer的方法
// Translation: We propose a Transformer-based approach
// Note: "本文提出" → "We propose" (standard academic phrasing)
Step 4: Chinglish check
| Chinglish | Idiomatic |
|---|---|
| more and more | increasingly |
| in recent years | recently |
| play an important role | is crucial for |
Common academic sentence patterns:
| 中文 | English |
|---|---|
| 本文提出… | We propose… / This paper presents… |
| 实验结果表明… | Experimental results demonstrate that… |
| 与…相比 | Compared with… / In comparison to… |
| 综上所述 | In summary / In conclusion |
Module: Bibliography
Trigger words: bib, bibliography, 参考文献, citation, 引用
Typst bibliography management:
Method 1: use a BibTeX file
#bibliography("references.bib", style: "ieee")
Method 2: use the Hayagriva format
#bibliography("references.yml", style: "apa")
Supported citation styles:
- ieee – IEEE numeric citations
- apa – APA author-year
- chicago-author-date – Chicago author-year
- mla – MLA (humanities)
- gb-7714-2015 – Chinese national standard
Citation examples:
// In-text citations
According to @smith2020, the method...
Recent studies @smith2020 @jones2021 show...
// Bibliography list
#bibliography("references.bib", style: "ieee")
Checks:
- Required-field completeness
- Duplicate entry detection
- Unused entries
- Citation format consistency
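The duplicate-entry and unused-entry checks can be sketched as follows. This is an illustrative sketch, not the skill's actual implementation: it reads entry keys with a simple regex that only handles plain BibTeX entries, and it assumes citations appear as `@key` in the Typst source.

```python
import re

def check_bibliography(bib_text: str, typ_text: str) -> dict:
    """Report duplicate and unused BibTeX keys (illustrative sketch)."""
    # Entry keys: "@article{key," -> "key" (naive; ignores nested braces)
    keys = re.findall(r"@\w+\{([^,\s]+),", bib_text)
    # Citations in the Typst source: "@key"
    cited = set(re.findall(r"@([A-Za-z0-9_:-]+)", typ_text))
    duplicates = sorted({k for k in keys if keys.count(k) > 1})
    unused = sorted(set(keys) - cited)
    return {"duplicates": duplicates, "unused": unused}

bib = "@article{smith2020, title={A}}\n@book{jones2021, title={B}}"
report = check_bibliography(bib, "According to @smith2020, ...")
print(report["unused"])  # ['jones2021']
```

Required-field completeness would need per-entry-type rules (e.g. `journal` for articles), which a dedicated BibTeX parser handles far more reliably than regexes.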
Module: De-AI Editing
Trigger words: deai, 去AI化, humanize, reduce AI traces, 降低AI痕迹
Goal: reduce the traces of AI-generated writing while preserving Typst syntax and technical accuracy.
Input requirements:
- Source type (required): Typst
- Section (required): Abstract / Introduction / Related Work / Methods / Experiments / Results / Discussion / Conclusion
- Source fragment (required): paste directly, preserving indentation and line breaks
Workflow:
1. Syntax recognition. Detect and fully preserve Typst syntax:
- Function calls: #set, #show, #let
- References: @cite, @ref, @label
- Math: $...$ inline and $ ... $ block
- Markup: *bold*, _italic_, `code`
- Custom functions (left untouched by default)
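One way to honor the preserve-fully rule is to mask protected spans before rewriting and restore them afterwards. A minimal sketch of that assumed approach (not the skill's actual code): the patterns cover only the constructs listed above, not the full Typst grammar.

```python
import re

PROTECTED = re.compile(
    r"\$[^$]*\$"                    # math, inline or block
    r"|@[A-Za-z0-9_:-]+"            # @cite / @ref / @label
    r"|#(?:set|show|let)\b[^\n]*"   # #set / #show / #let calls
    r"|`[^`]*`"                     # inline code
)

def mask_protected(source: str):
    """Replace protected spans with placeholders; return text + mapping."""
    spans = {}
    def repl(m):
        key = f"\x00{len(spans)}\x00"
        spans[key] = m.group(0)
        return key
    return PROTECTED.sub(repl, source), spans

def restore(text: str, spans: dict) -> str:
    """Put the protected spans back after the visible text was rewritten."""
    for key, original in spans.items():
        text = text.replace(key, original)
    return text

src = "See @smith2020 and the loss $L = x^2$."
masked, spans = mask_protected(src)
assert "@smith2020" not in masked
assert restore(masked, spans) == src
```

Only the masked text is handed to the rewriting step, so citations, math, and set rules survive byte-for-byte.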
2. AI trace detection:
| Category | Example | Problem |
|---|---|---|
| Empty buzzwords | significant, comprehensive, effective | Lack specificity |
| Overconfidence | obviously, necessarily, completely | Too absolute |
| Mechanical parallelism | Content-free three-part constructions | Lacks depth |
| Template phrases | in recent years, more and more | Clichés |
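The table above can be turned into a small flagger. This is an illustrative sketch only: the phrase lists are a sample of each category, not an exhaustive inventory.

```python
import re

AI_TRACES = {
    "empty buzzword": ["significant", "comprehensive", "effective"],
    "overconfident": ["obviously", "necessarily", "completely"],
    "template phrase": ["in recent years", "more and more"],
}

def detect_ai_traces(text: str) -> list[tuple[str, str]]:
    """Return (category, phrase) pairs found in the text."""
    hits = []
    lowered = text.lower()
    for category, phrases in AI_TRACES.items():
        for phrase in phrases:
            # Whole-word/phrase match, case-insensitive
            if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
                hits.append((category, phrase))
    return hits

sample = "This method achieves significant improvement."
print(detect_ai_traces(sample))  # [('empty buzzword', 'significant')]
```

Flagging is deliberately separate from rewriting: each hit becomes a diff-comment suggestion, and the human decides whether the word is genuinely empty in context.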
3. Text rewriting (visible text only):
- Split long sentences (English >50 words, Chinese >50 characters)
- Reorder phrasing to read naturally
- Replace vague statements with concrete claims
- Delete redundant words
- Supply missing subjects (without introducing new facts)
4. Output specification:
// ============================================================
// DE-AI EDIT (line 23 - Introduction)
// ============================================================
// Original: This method achieves significant performance improvement.
// Revised: The proposed method improves performance in the experiments.
//
// Changes:
// 1. Removed empty buzzword: "significant" → deleted
// 2. Preserved the original claim; avoided adding specific metrics
//
// ⚠️ [Needs evidence]: experimental data required; add concrete metrics.
// ============================================================
= Introduction
The proposed method improves performance in the experiments...
Hard constraints:
- Never modify: @cite, @ref, @label, math environments
- Never add: facts, data, conclusions, metrics, experimental settings, citation keys
- Modify only: ordinary paragraph text and heading text
Per-section rules:
| Section | Focus | Constraint |
|---|---|---|
| Abstract | Purpose / method / key results (with numbers) / conclusion | No inflated contributions |
| Introduction | Importance → gap → contributions (verifiable) | Restrained wording |
| Related Work | Group by line of work; make differences concrete | Concrete comparisons |
| Methods | Reproducibility first (procedure, parameters, metric definitions) | Implementation detail |
| Results | Report only facts and numbers | No causal interpretation |
| Discussion | Mechanisms, boundaries, failures, limitations | Critical analysis |
| Conclusion | Answer the research questions; no new experiments | Actionable future work |
Module: Title Optimization
Trigger words: title, 标题, title optimization, create title, improve title
Goal: generate and refine academic paper titles following IEEE/ACM/Springer/NeurIPS best practices.
Usage examples:
Generate titles from content:
python scripts/optimize_title.py main.typ --generate
# Analyzes the abstract/introduction and proposes 3-5 candidate titles
Optimize an existing title:
python scripts/optimize_title.py main.typ --optimize
# Analyzes the current title and suggests improvements
Check title quality:
python scripts/optimize_title.py main.typ --check
# Scores the title 0-100 against best practices
Title quality criteria (based on the IEEE Author Center and top conferences/journals):
| Criterion | Weight | Description |
|---|---|---|
| Conciseness | 25% | Remove "A Study of", "Research on", "Novel", "New" |
| Searchability | 30% | Core terms (method + problem) within the first 65 characters |
| Length | 15% | Optimal: 10-15 words (English) / 15-25 characters (Chinese) |
| Specificity | 20% | Name the concrete method/problem; avoid vague terms |
| Convention | 10% | Avoid obscure acronyms (except common ones such as AI, LSTM, DNA) |
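The weighted criteria can be combined into a single 0-100 score. A rough sketch under stated assumptions: the sub-scores below are crude stand-in heuristics (real specificity and convention checks would need keyword and acronym analysis), and the filler-word list is a small sample.

```python
WEIGHTS = {
    "concise": 0.25, "searchable": 0.30, "length": 0.15,
    "specific": 0.20, "conventional": 0.10,
}
FILLER = ("a study of", "research on", "novel", "new")

def score_title(title: str) -> int:
    """Score an English title 0-100 with stand-in heuristics."""
    words = title.split()
    lowered = title.lower()
    subscores = {
        # Conciseness: penalize filler phrases outright
        "concise": 0 if any(f in lowered for f in FILLER) else 100,
        # Searchability: key terms should fall within the first 65 chars
        "searchable": 100 if len(title) <= 65 else 60,
        # Length: the stated optimum is 10-15 words
        "length": 100 if 10 <= len(words) <= 15 else 70,
        "specific": 100,      # placeholder: needs keyword analysis
        "conventional": 100,  # placeholder: needs an acronym whitelist
    }
    return round(sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS))

bad = "A Novel Study on Time Series Forecasting Using Deep Learning"
good = "Transformer-Based Time Series Forecasting for Industrial Control"
print(score_title(bad) < score_title(good))  # True
```

The substring test for fillers is deliberately blunt ("new" would also match "renewable"); a production version would match whole words and weight partial hits.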
Title generation workflow:
Step 1: Content analysis. Extract from the abstract/introduction:
- Research problem (what challenge is solved)
- Research method (what approach is proposed)
- Application domain (what usage scenario)
- Core contribution (the main result; optional)
Step 2: Keyword extraction. Identify 3-5 core keywords:
- Method keywords: "Transformer", "Graph Neural Network", "Reinforcement Learning"
- Problem keywords: "Time Series Forecasting", "Fault Detection", "Image Segmentation"
- Domain keywords: "Industrial Control", "Medical Imaging", "Autonomous Driving"
Step 3: Template selection. Common patterns at top venues:
| Pattern | Example (English) | Example (Chinese) | Best for |
|---|---|---|---|
| Method for Problem | "Transformer for Time Series Forecasting" | 时间序列预测的Transformer方法 | General research |
| Method: Problem in Domain | "Graph Neural Networks: Fault Detection in Industrial Systems" | 图神经网络：工业系统故障检测 | Domain-specific work |
| Problem via Method | "Time Series Forecasting via Attention Mechanisms" | 基于注意力机制的时间序列预测 | Method-focused |
| Method + Key Feature | "Lightweight Transformer for Real-Time Detection" | 轻量级Transformer实时检测方法 | Performance-focused |
Step 4: Candidate generation. Produce 3-5 candidates with different emphases:
- Method-centric
- Problem-centric
- Application-centric
- Balanced (recommended)
- Concise variant
Step 5: Quality scoring. Each candidate receives:
- An overall score (0-100)
- Per-criterion sub-scores
- Concrete improvement suggestions
Title optimization rules:
❌ Remove ineffective words:
English:
| Avoid | Reason |
|---|---|
| A Study of | Redundant (all papers are studies) |
| Research on | Redundant (all papers are research) |
| Novel / New | Implied by publication |
| Improved / Enhanced | Vague without specifics |
| Based on | Often unnecessary |
| Using / Utilizing | Can be replaced with prepositions |
Chinese:
| Avoid | Reason |
|---|---|
| 关于…的研究 ("a study on…") | Redundant (all papers are research) |
| …的探索 ("an exploration of…") | Redundant and unspecific |
| 新型 / 新颖的 ("novel") | Novelty is implied by publication |
| 改进的 / 优化的 ("improved / optimized") | Vague; say how it is improved |
| 基于…的 ("based on…") | Can usually be stated directly |
✅ Recommended structures:
English examples:
Good: "Transformer for Time Series Forecasting in Industrial Control"
Bad: "A Novel Study on Improved Time Series Forecasting Using Transformers"
Good: "Graph Neural Networks for Fault Detection"
Bad: "Research on Novel Fault Detection Based on GNNs"
Good: "Attention-Based LSTM for Multivariate Time Series Prediction"
Bad: "An Improved LSTM Model Using Attention Mechanism for Prediction"
Chinese examples:
Good: 工业控制系统时间序列预测的Transformer方法
Bad: 关于基于Transformer的工业控制系统时间序列预测的研究
Good: 图神经网络故障检测方法及其工业应用
Bad: 新型改进的基于图神经网络的故障检测方法研究
Good: 注意力机制的多变量时间序列预测方法
Bad: 基于注意力机制的改进型多变量时间序列预测模型研究
Keyword placement strategy:
- Put the most important keywords (method + problem) within the first 65 characters (English) / first 20 characters (Chinese)
- Avoid opening with articles (A, An, The) or framing words such as 关于 / 对于
- Prefer nouns and technical terms over verbs and adjectives
Acronym usage principles:
| ✅ Acceptable | ❌ Avoid in titles |
|---|---|
| AI, ML, DL | Obscure domain-specific acronyms |
| LSTM, GRU, CNN | Chemical formulas (unless very common) |
| IoT, 5G, GPS | Lab-specific abbreviations |
| DNA, RNA, MRI | Non-standard method names |
Venue-specific requirements:
IEEE Transactions:
- Avoid formulas with sub/superscripts unless very simple (e.g. "Nd-Fe-B")
- Use Title Case (capitalize principal words)
- Typical length: 10-15 words
- Example: "Deep Learning for Predictive Maintenance in Smart Manufacturing"
ACM Conferences:
- More engaging titles are acceptable
- A colon may introduce a subtitle
- Typical length: 8-12 words
- Example: "AttentionFlow: Visualizing Attention Mechanisms in Neural Networks"
Springer Journals:
- Prefer descriptive over catchy titles
- May run longer (up to 20 words)
- Example: "A Comprehensive Framework for Real-Time Anomaly Detection in Industrial IoT Systems"
NeurIPS/ICML:
- Concise and punchy (8-12 words)
- The method name is usually prominent
- Example: "Transformers Learn In-Context by Gradient Descent"
Output format:
English papers:
// ============================================================
// TITLE OPTIMIZATION REPORT
// ============================================================
// Current Title: "A Novel Study on Time Series Forecasting Using Deep Learning"
// Quality Score: 45/100
//
// Issues Detected:
// 1. [Critical] Contains "Novel Study" (remove ineffective words)
// 2. [Major] Vague method description ("Deep Learning" too broad)
// 3. [Minor] Length acceptable (10 words) but could be more specific
//
// Recommended Titles (Ranked):
//
// 1. "Transformer-Based Time Series Forecasting for Industrial Control" [Score: 92/100]
//    - Concise: ✅ (8 words)
//    - Searchable: ✅ (Method + Problem in first 50 chars)
//    - Specific: ✅ (Transformer, not just "Deep Learning")
//    - Domain: ✅ (Industrial Control)
//
// 2. "Attention Mechanisms for Multivariate Time Series Prediction" [Score: 88/100]
//    - Concise: ✅ (7 words)
//    - Searchable: ✅ (Key terms upfront)
//    - Specific: ✅ (Attention, Multivariate)
//    - Note: Consider adding domain if space allows
//
// 3. "Deep Learning Approach to Time Series Forecasting in Smart Manufacturing" [Score: 78/100]
//    - Concise: ⚠️ (10 words, acceptable)
//    - Searchable: ✅
//    - Specific: ⚠️ ("Deep Learning" still broad)
//    - Domain: ✅ (Smart Manufacturing)
//
// Keyword Analysis:
//    - Primary: Transformer, Time Series, Forecasting
//    - Secondary: Industrial Control, Attention, LSTM
//    - Searchability: "Transformer Time Series" appears in 1,234 papers (good balance)
//
// Suggested Typst Update:
// #align(center)[
//   #text(size: 18pt, weight: "bold")[
//     Transformer-Based Time Series Forecasting for Industrial Control
//   ]
// ]
// ============================================================
Chinese papers:
// ============================================================
// TITLE OPTIMIZATION REPORT
// ============================================================
// Current Title: "关于基于深度学习的时间序列预测的研究"
// Quality Score: 48/100
//
// Issues Detected:
// 1. [Critical] Contains "关于...的研究" (remove redundant wording)
// 2. [Major] Method description too broad ("深度学习" is too generic)
// 3. [Minor] Length acceptable (18 characters) but could be more specific
//
// Recommended Titles (Ranked):
//
// 1. "工业控制系统时间序列预测的Transformer方法" [Score: 94/100]
//    - Concise: ✅ (19 characters)
//    - Searchable: ✅ (method + problem within the first 15 characters)
//    - Specific: ✅ (Transformer rather than "深度学习")
//    - Domain: ✅ (industrial control systems)
//
// 2. "多变量时间序列预测的注意力机制研究" [Score: 89/100]
//    - Concise: ✅ (17 characters)
//    - Searchable: ✅ (core terms up front)
//    - Specific: ✅ (attention mechanism, multivariate)
//    - Note: consider adding the application domain
//
// Suggested Typst Update:
// #align(center)[
//   #text(size: 18pt, weight: "bold")[
//     工业控制系统时间序列预测的Transformer方法
//   ]
// ]
// ============================================================
Interactive mode (recommended):
python scripts/optimize_title.py main.typ --interactive
# Step-by-step guided title creation with user input
Batch mode (multiple papers):
python scripts/optimize_title.py papers/*.typ --batch --output title_report.txt
Title comparison (optional):
python scripts/optimize_title.py main.typ --compare "Title A" "Title B" "Title C"
# Compares multiple candidate titles with detailed scoring
Best practices summary:
English papers:
- Front-load keywords: put Method + Problem in the first 10 words
- Be specific: "Transformer" > "Deep Learning" > "Machine Learning"
- Cut redundancy: drop "Novel", "Study", "Research", "Based on"
- Control length: aim for 10-15 words
- Test searchability: would these keywords find your paper?
- Avoid obscure acronyms unless widely recognized (AI, LSTM, CNN)
- Match the venue's style: IEEE (descriptive), ACM (engaging), NeurIPS (concise)
Chinese papers:
- Front-load keywords: put method + problem in the first 20 characters
- Be specific: "Transformer" > "深度学习" > "机器学习"
- Cut redundancy: drop 关于, 研究, 新型, 基于
- Control length: aim for 15-25 characters
- Test searchability: would these keywords find your paper?
- Avoid obscure terms unless widely recognized (AI, LSTM, CNN)
- Keep titles aligned: the English title should correspond to the Chinese title
Typst title-setting examples:
English papers:
```typst
#align(center)[
  #text(size: 18pt, weight: "bold")[
    Transformer-Based Time Series Forecasting for Industrial Control
  ]
]
```
Chinese papers:
```typst
#align(center)[
  #text(size: 18pt, weight: "bold", font: "Source Han Serif")[
    工业控制系统时间序列预测的Transformer方法
  ]
  #v(0.5em)
  #text(size: 14pt, font: "Times New Roman")[
    Transformer-Based Time Series Forecasting for Industrial Control Systems
  ]
]
```
Module: Template Configuration
Trigger words: template, 模板, IEEE, ACM, Springer, NeurIPS
Template configuration examples and usage have moved to the reference docs.
References and Extensions
To keep this skill lean and maintainable, detailed examples and extended material live in the reference docs:
- Journal/conference rules: references/VENUES.md
- Typst syntax and typesetting: references/TYPST_SYNTAX.md
- Writing style and common errors: references/STYLE_GUIDE.md, references/COMMON_ERRORS.md
- De-AI strategy: references/DEAI_GUIDE.md
- Template examples and configuration: references/TEMPLATES.md
Best Practices
This skill follows Claude Code Skills best practices:
Skill Design Principles
- Focused Responsibility: each module handles one specific task (KISS principle)
- Minimal Permissions: request only the tool access that is necessary
- Clear Triggers: invoke modules with specific keywords
- Structured Output: all suggestions use the unified diff-comment format
Usage Guidelines
- Start with Compilation: make sure the document compiles before running other checks
- Iterative Refinement: apply one module at a time to keep changes reviewable
- Preserve Protected Elements: never modify @cite, @ref, @label, or math environments
- Verify Before Commit: review every suggestion carefully before accepting it
Integration with Other Tools
- Track revision history with version control (git)
- Use typst watch for live preview (millisecond compiles)
- Export suggestions for joint review with co-authors
Typst-Specific Advantages
- Compile speed: millisecond-level compilation suits live preview and fast iteration
- Modern syntax: a more concise, intuitive markup language than LaTeX
- Incremental compilation: only changed parts are recompiled, improving efficiency
Notes
- Fonts: make sure the required fonts are installed (for Chinese, Source Han Serif or Noto Serif CJK is recommended)
- Template compatibility: some journals may still require LaTeX templates
- Math: Typst math syntax differs slightly from LaTeX and takes some adjustment