content-hash-cache-pattern

- Total installs: 4
- Weekly installs: 4
- Site-wide rank: #54328

Install command:

```
npx skills add https://github.com/shimo4228/claude-code-learned-skills --skill content-hash-cache-pattern
```

Agent install distribution: replit (4), openclaw (4), mcpjam (2), claude-code (2), windsurf (2), zencoder (2)

Skill documentation
Content-Hash File Cache Pattern
Extracted: 2026-02-10
Context: a pattern that caches file-processing results keyed by the SHA-256 hash of file contents, with the caching wrapped at the service layer.
Problem

File processing (PDF parsing, text extraction, and the like) is expensive, but re-processing an unchanged file is wasted work.
```python
# WRONG: the full pipeline runs on every call
def process_file(path: Path) -> Result:
    return expensive_extraction(path)  # Always re-runs

# WRONG: path-based cache (invalidated by moving the file)
cache = {"/path/to/file.pdf": result}  # Path changes → cache miss

# WRONG: cache parameters bolted onto an existing function (SRP violation)
def extract_text(path, *, cache_enabled=False, cache_dir=None):
    if cache_enabled:  # Extraction function now has cache responsibility
        ...
```
Solution

1. Content-Hash Based Cache Key

Use the SHA-256 hash of the file contents as the cache key, not the file path.
```python
import hashlib
from pathlib import Path

_HASH_CHUNK_SIZE = 65536  # 64KB chunks for large files

def compute_file_hash(path: Path) -> str:
    """SHA-256 of file contents (chunked for large files)."""
    if not path.is_file():
        raise FileNotFoundError(f"File not found: {path}")
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(_HASH_CHUNK_SIZE)
            if not chunk:
                break
            sha256.update(chunk)
    return sha256.hexdigest()
```
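A quick sanity check of the key property: moving or renaming a file leaves its content hash unchanged, while editing the content changes it. This sketch uses a one-shot `read_bytes` stand-in for the chunked `compute_file_hash` (fine for small demo files):

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

# One-shot stand-in for the chunked compute_file_hash above.
def compute_file_hash(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

tmp = Path(tempfile.mkdtemp())
original = tmp / "report.pdf"
original.write_bytes(b"fake pdf bytes")
key_before = compute_file_hash(original)

# Renaming/moving the file does not change the key...
renamed = tmp / "renamed.pdf"
shutil.move(original, renamed)
key_after = compute_file_hash(renamed)

# ...but changing the content does.
renamed.write_bytes(b"edited bytes")
key_edited = compute_file_hash(renamed)
```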
Benefits: the cache still hits after file moves and renames, and any content change invalidates the entry automatically.
2. Frozen Dataclass for Cache Entry
```python
from dataclasses import dataclass

@dataclass(frozen=True, slots=True)
class CacheEntry:
    file_hash: str
    source_path: str
    document: ExtractedDocument  # The cached result
```
3. JSON Serialization of Frozen Dataclasses
dataclasses.asdict() causes problems with nested frozen dataclasses, so map the fields manually:
```python
import json
from typing import Any

def _serialize_entry(entry: CacheEntry) -> dict[str, Any]:
    """Manual mapping for full control over serialized format."""
    doc = entry.document
    return {
        "file_hash": entry.file_hash,
        "source_path": entry.source_path,
        "document": {
            "text": doc.text,
            "chunks": list(doc.chunks),  # tuple → list for JSON
            "file_type": doc.file_type,
            # ... other fields
        },
    }

def _deserialize_entry(data: dict[str, Any]) -> CacheEntry:
    doc_data = data["document"]
    document = ExtractedDocument(
        text=doc_data["text"],
        chunks=tuple(doc_data["chunks"]),  # list → tuple
        file_type=doc_data["file_type"],
    )
    return CacheEntry(
        file_hash=data["file_hash"],
        source_path=data["source_path"],
        document=document,
    )
```
4. Service Layer Wrapper (SRP)

Wrap the cache logic at the service layer instead of changing the pure processing function.
```python
# service.py: cache wrapper
def extract_with_cache(file_path: Path, *, config: AppConfig) -> ExtractedDocument:
    """Service layer: cache check → extraction → cache write."""
    if not config.cache_enabled:
        return extract_text(file_path)  # Pure function, no cache knowledge
    cache_dir = Path(config.cache_dir)
    file_hash = compute_file_hash(file_path)

    # Check cache
    cached = read_cache(cache_dir, file_hash)
    if cached is not None:
        logger.info("Cache hit: %s (hash=%s)", file_path.name, file_hash[:12])
        return cached.document

    # Cache miss → extract → store
    logger.info("Cache miss: %s (hash=%s)", file_path.name, file_hash[:12])
    doc = extract_text(file_path)
    entry = CacheEntry(file_hash=file_hash, source_path=str(file_path), document=doc)
    write_cache(cache_dir, entry)
    return doc
```
5. Graceful Corruption Handling
```python
def read_cache(cache_dir: Path, file_hash: str) -> CacheEntry | None:
    cache_file = cache_dir / f"{file_hash}.json"
    if not cache_file.is_file():
        return None
    try:
        raw = cache_file.read_text(encoding="utf-8")
        data = json.loads(raw)
        return _deserialize_entry(data)
    except (json.JSONDecodeError, ValueError, KeyError):
        logger.warning("Corrupted cache entry: %s", cache_file)
        return None  # Treat corruption as cache miss
```
Key Design Choices

| Choice | Reason |
|---|---|
| SHA-256 content hash | Path-independent, auto-invalidates on content change |
| `{hash}.json` file naming | O(1) lookup, no index file needed |
| Service layer wrapper | SRP: extraction stays pure, cache is separate concern |
| Manual JSON serialization | Full control over frozen dataclass serialization |
| Corruption → None | Graceful degradation, re-extracts on next run |
| `cache_dir.mkdir(parents=True)` | Lazy directory creation on first write |
When to Use

- File-processing pipelines (PDF parsing, image processing, text extraction)
- Processing is expensive and the same file is re-processed frequently
- CLI tools that need a `--cache/--no-cache` option
- Adding caching to an existing pure function while preserving SRP
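For the CLI case, `argparse.BooleanOptionalAction` produces the paired `--cache/--no-cache` flags directly; a minimal sketch (flag name taken from the bullet above, everything else assumed):

```python
import argparse

parser = argparse.ArgumentParser(description="File extraction with optional cache")
parser.add_argument(
    "--cache",
    action=argparse.BooleanOptionalAction,  # generates both --cache and --no-cache
    default=True,
    help="enable the content-hash cache (default: enabled)",
)

on = parser.parse_args(["--cache"])
off = parser.parse_args(["--no-cache"])
default = parser.parse_args([])
```

The resulting `args.cache` boolean maps straight onto the `config.cache_enabled` check in the service-layer wrapper.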
When NOT to Use

- Data that needs real-time freshness (the latest result is always required)
- Cache entries are very large (memory/disk pressure)
- Results depend on parameters beyond the file content (config changes would then require cache invalidation that the content hash alone cannot detect)
Related Patterns

- `python-immutable-accumulator.md`: frozen dataclass + slots pattern
- `backward-compatible-frozen-extension.md`: extending frozen dataclasses
- `cost-aware-llm-pipeline.md`: cache use in LLM pipelines