langchain-use

Total installs: 11 · Weekly installs: 8 · Site-wide rank: #28570

Install command:

npx skills add https://github.com/nanmicoder/claude-code-skills --skill langchain-use

Agent install distribution: opencode 6 · gemini-cli 4 · codex 3 · amp 2 · openclaw 2

Skill Documentation
LangChain Use Skill
LangChain is an open-source framework for building LLM-powered agents and applications.
Installation

Install LangChain with uv (recommended; requires Python 3.10+):

```shell
# Install the core package
uv add langchain

# Install model provider integrations
uv add langchain-anthropic  # Anthropic/Claude
uv add langchain-openai     # OpenAI
```
Quick Reference

Core workflow:

User query -> create_agent() -> ReAct loop -> Tool calls -> Return result
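The loop above can be sketched in plain Python. This is a conceptual illustration with a stubbed model and a hardcoded tool table, not LangChain's actual implementation:

```python
# Conceptual ReAct loop: the model either requests a tool call or answers.
def fake_model(messages):
    # Stub: ask for a tool call until a tool result is in the history.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_calls": [{"name": "get_weather", "args": {"city": "sf"}}]}
    return {"content": "It's sunny in sf."}

TOOLS = {"get_weather": lambda city: f"It's always sunny in {city}!"}

def run_agent(user_query):
    messages = [{"role": "user", "content": user_query}]
    while True:
        reply = fake_model(messages)
        if "tool_calls" not in reply:        # final answer -> stop looping
            messages.append({"role": "assistant", **reply})
            return messages
        for call in reply["tool_calls"]:     # execute each requested tool
            result = TOOLS[call["name"]](**call["args"])
            messages.append({"role": "tool", "content": result})

history = run_agent("what is the weather in sf")
```

`create_agent()` packages this loop (model, tools, prompt) into a single invocable object, so you never write it by hand.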
Create an Agent

See: Agent basics

```python
from langchain.agents import create_agent

agent = create_agent(
    model="claude-sonnet-4-5-20250929",
    tools=[get_weather],
    system_prompt="You are a helpful assistant",
)

# Run the agent
result = agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather in sf"}]}
)
```
Define a Tool

See: Tool basics

```python
from langchain.tools import tool

@tool
def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"
```
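The decorator derives the tool's name, description, and argument schema from the function's name, docstring, and type hints, which is why all three matter. A rough pure-Python illustration of that idea (assumed mechanics, not LangChain's actual code):

```python
import inspect

def describe_tool(fn):
    """Extract a minimal tool schema from a function's signature and docstring."""
    sig = inspect.signature(fn)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        "args": {name: param.annotation.__name__
                 for name, param in sig.parameters.items()},
    }

def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"

schema = describe_tool(get_weather)
```

The model only ever sees a schema like this, so a clear docstring and precise type hints directly improve tool selection.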
Access Runtime Context

Use ToolRuntime to access state, context, and store:

```python
from dataclasses import dataclass
from langchain.tools import tool, ToolRuntime

@dataclass
class Context:
    user_id: str

@tool
def get_user_location(runtime: ToolRuntime[Context]) -> str:
    """Retrieve user location based on user ID."""
    user_id = runtime.context.user_id
    return "Florida" if user_id == "1" else "SF"
```
Manage Memory

See: Short-term memory · Long-term memory

```python
from langgraph.checkpoint.memory import InMemorySaver

agent = create_agent(
    model,
    tools,
    checkpointer=InMemorySaver(),  # short-term memory
)

# Use a thread_id to maintain the session
config = {"configurable": {"thread_id": "1"}}
agent.invoke({"messages": [...]}, config)
```
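Conceptually, a checkpointer persists conversation state keyed by thread_id, so separate threads keep separate histories. A toy pure-Python sketch of that behavior (illustrative only, not the real InMemorySaver API):

```python
class ToyCheckpointer:
    """Stores message history per thread_id, like short-term session memory."""
    def __init__(self):
        self._threads = {}

    def load(self, thread_id):
        # Unknown threads start with an empty history.
        return list(self._threads.get(thread_id, []))

    def save(self, thread_id, messages):
        self._threads[thread_id] = list(messages)

saver = ToyCheckpointer()

# Thread "1" accumulates history; thread "2" starts fresh.
history = saver.load("1")
history.append({"role": "user", "content": "My name is Bob"})
saver.save("1", history)
```

This is why reusing the same thread_id across invoke() calls lets the agent "remember" earlier turns, while a new thread_id gets a clean slate.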
Add Middleware

See: Middleware overview

```python
from langchain.agents.middleware import before_model, after_model

@before_model
def trim_messages(state, runtime):
    # message trimming logic
    return None
```
Advanced Topics

| Topic | Doc | Description |
|---|---|---|
| Streaming | Streaming | Real-time streaming output updates |
| Structured Output | Structured output | Pydantic/dataclass output formats |
| Runtime | Runtime | ToolRuntime and context access |
| Guardrails | Guardrails | PII detection and content filtering |
| MCP | MCP | Model Context Protocol integration |
Integration Topics

| Topic | Doc | Description |
|---|---|---|
| Models | Models | Multi-provider model initialization |
| Messages | Messages | Message types and content blocks |
| Retrieval | Retrieval | RAG and knowledge-base construction |
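Retrieval (RAG) means fetching the documents most relevant to a query and passing them to the model as context. A toy keyword-overlap retriever illustrating the idea (real pipelines use embeddings and a vector store):

```python
def retrieve(query, documents, k=2):
    """Rank documents by word overlap with the query (toy scoring)."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

docs = [
    "LangChain builds LLM agents",
    "Postgres stores relational data",
    "Agents call tools in a loop",
]
top = retrieve("how do LLM agents work", docs, k=1)
```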
Key Concepts

Agent

The core abstraction in LangChain 1.0, built on LangGraph. Created with create_agent().

Tool

Defined with the @tool decorator. Can access state, context, and store via ToolRuntime.

Memory

- Checkpointer: short-term session memory (InMemorySaver, PostgresSaver)
- Store: long-term persistent storage (InMemoryStore)

Middleware

A decorator-style extension mechanism:

- @before_model – pre-process before the model call
- @after_model – post-process after the model call
- @wrap_tool_call – wrap tool calls
- @dynamic_prompt – dynamic system prompts
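These hooks compose around the model call like layered decorators: before-hooks transform state on the way in, after-hooks transform output on the way out. A pure-Python sketch of that pipeline (conceptual, not the actual middleware API):

```python
def model_call(state):
    """Stub model: reports how many messages it was given."""
    return {"content": f"saw {len(state['messages'])} messages"}

def trim_messages(state):
    """A before_model-style hook: keep only the last 3 messages."""
    state["messages"] = state["messages"][-3:]
    return state

def log_output(state, output):
    """An after_model-style hook: annotate the model output."""
    output["logged"] = True
    return output

def run_with_middleware(state, before=(), after=()):
    for hook in before:        # pre-process state before the model call
        state = hook(state)
    output = model_call(state)
    for hook in after:         # post-process the model output
        output = hook(state, output)
    return output

state = {"messages": [f"m{i}" for i in range(10)]}
result = run_with_middleware(state, before=[trim_messages], after=[log_output])
```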
Common Patterns

Basic Agent

```python
from langchain.agents import create_agent

agent = create_agent(
    model="claude-sonnet-4-5-20250929",
    tools=[my_tool],
    system_prompt="You are helpful.",
)

result = agent.invoke(
    {"messages": [{"role": "user", "content": "hello"}]}
)
```
Agent with Memory (session persistence)

```python
from langgraph.checkpoint.memory import InMemorySaver

agent = create_agent(
    model="claude-sonnet-4-5-20250929",
    tools=[my_tool],
    checkpointer=InMemorySaver(),
)

# Identify the session with a thread_id
config = {"configurable": {"thread_id": "1"}}
agent.invoke(
    {"messages": [{"role": "user", "content": "My name is Bob"}]},
    config
)
agent.invoke(
    {"messages": [{"role": "user", "content": "What's my name?"}]},
    config
)
```
Production Memory (using a database)

```python
from langgraph.checkpoint.postgres import PostgresSaver

DB_URI = "postgresql://postgres:postgres@localhost:5432/postgres"

with PostgresSaver.from_conn_string(DB_URI) as checkpointer:
    checkpointer.setup()  # creates the tables automatically
    agent = create_agent(
        model="claude-sonnet-4-5-20250929",
        tools=[my_tool],
        checkpointer=checkpointer,
    )
```
Agent with Context (Runtime Context)

```python
from dataclasses import dataclass
from langchain.tools import tool, ToolRuntime

@dataclass
class Context:
    user_id: str

@tool
def get_user_info(runtime: ToolRuntime[Context]) -> str:
    """Get user information."""
    user_id = runtime.context.user_id
    return f"User ID: {user_id}"

agent = create_agent(
    model="claude-sonnet-4-5-20250929",
    tools=[get_user_info],
    context_schema=Context,
)

result = agent.invoke(
    {"messages": [{"role": "user", "content": "get my info"}]},
    context=Context(user_id="user_123")
)
```
Agent with Structured Output

```python
from dataclasses import dataclass
from langchain.agents.structured_output import ToolStrategy

@dataclass
class Response:
    answer: str
    confidence: float

agent = create_agent(
    model="claude-sonnet-4-5-20250929",
    tools=[my_tool],
    response_format=ToolStrategy(Response),
)

result = agent.invoke(
    {"messages": [{"role": "user", "content": "what is 2+2?"}]}
)
print(result['structured_response'])
# Response(answer="4", confidence=1.0)
```
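With ToolStrategy, the model emits the structured response as arguments to a synthetic tool whose schema mirrors the dataclass. A rough pure-Python illustration of how those arguments become the typed object (assumed mechanics, not LangChain's implementation):

```python
from dataclasses import dataclass

@dataclass
class Response:
    answer: str
    confidence: float

# Pretend the model emitted this tool call for the schema-shaped tool.
tool_call_args = {"answer": "4", "confidence": 1.0}

# The tool-call arguments are validated against and bound to the dataclass.
structured = Response(**tool_call_args)
```

Because the output arrives as a constrained tool call rather than free text, you get a typed object back instead of having to parse the model's prose.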
Complete Example (production-grade Agent)

```python
from dataclasses import dataclass
from langchain.agents import create_agent
from langchain.chat_models import init_chat_model
from langchain.tools import tool, ToolRuntime
from langgraph.checkpoint.memory import InMemorySaver
from langchain.agents.structured_output import ToolStrategy

@dataclass
class Context:
    user_id: str

@dataclass
class ResponseFormat:
    answer: str
    confidence: float | None = None

@tool
def get_weather(city: str) -> str:
    """Get weather for a city."""
    return f"Sunny in {city}"

@tool
def get_user_location(runtime: ToolRuntime[Context]) -> str:
    """Get user's location."""
    return "San Francisco" if runtime.context.user_id == "1" else "Unknown"

# Configure the model
model = init_chat_model(
    "claude-sonnet-4-5-20250929",
    temperature=0.5,
    max_tokens=1000
)

# Create the agent
agent = create_agent(
    model=model,
    system_prompt="You are a weather assistant.",
    tools=[get_weather, get_user_location],
    context_schema=Context,
    response_format=ToolStrategy(ResponseFormat),
    checkpointer=InMemorySaver(),
)

# Run the agent
config = {"configurable": {"thread_id": "1"}}
result = agent.invoke(
    {"messages": [{"role": "user", "content": "What's the weather?"}]},
    config=config,
    context=Context(user_id="1")
)
```
Resource Links

- Core concepts – LangChain overview
- Quick start – build an Agent in 10 lines of code
- Official documentation
- API reference