tanstack-ai-vue-skilld
npx skills add https://github.com/harlan-zw/vue-ecosystem-skills --skill tanstack-ai-vue-skilld
Agent install distribution
Skill 文档
TanStack/ai @tanstack/ai-vue
Version: 0.5.4 (Feb 2026) Deps: @tanstack/ai-client@0.4.5 Tags: latest: 0.5.4 (Feb 2026)
References:
- Docs — API reference, guides
- GitHub Issues — bugs, workarounds, edge cases
- GitHub Discussions — Q&A, patterns, recipes
- Releases — changelog, breaking changes, new APIs
API Changes
This section documents version-specific API changes — prioritize recent major/minor releases.
- BREAKING: Adapter functions split — v0.1.0 split monolithic adapters into activity-specific functions (e.g., `openaiText('gpt-4o')`, `openaiImage()`) to enable optimal tree-shaking (source)
- BREAKING: Options flattened — common parameters like `temperature`, `maxTokens`, and `topP` moved from a nested `options` object to top-level configuration since v0.1.0 (source)
- BREAKING: `modelOptions` — `providerOptions` renamed to `modelOptions` in v0.1.0 for clarity; contains model-specific configurations and is fully type-safe (source)
- BREAKING: `toServerSentEventsStream` → `toResponseStream` — renamed in v0.1.0; now returns a `ReadableStream` instead of a `Response`, requiring manual response creation (source)
- BREAKING: Embeddings removed — the `embedding()` function and associated adapters were removed in v0.1.0 to focus on chat and agentic workflows (source)
- NEW: `status` property — `useChat` added a `status` ref in v0.4.0 to track the generation lifecycle: `ready`, `submitted`, `streaming`, or `error` (source)
- NEW: Multimodal support — v0.5.0 introduced support for multiple modalities (images, audio, video, documents) via the `MultimodalContent` type in `sendMessage` (source)
- NEW: `agentLoopStrategy` — replaced `maxIterations` with a strategy pattern in v0.1.0, using helpers like `maxIterations(n)`, `untilFinishReason()`, or `combineStrategies()` (source)
- NEW: `chatCompletion()` — added in v0.1.0 for promise-based results without the automatic tool execution loop used by `chat()` (source)
- NEW: Tool handling — `useChat` exposed `addToolResult` and `addToolApprovalResponse` for manual management of tool outputs and user approvals
- NEW: `toHttpStream` — introduced in v0.1.0 to support newline-delimited JSON (NDJSON) streaming as an alternative to Server-Sent Events (source)
- NEW: `fetchHttpStream` — connection adapter added to `@tanstack/ai-client` for consuming NDJSON streams in `useChat` (source)
- NEW: `geminiSpeech` (experimental) — experimental text-to-speech support for Google Gemini models added in v0.5.0 (source)
- NEW: Video generation (experimental) — support for video generation via `openaiVideo` and `fal` adapters introduced in v0.1.0 (source)

Also changed: standard-schema support v0.2.0 · `useId` integration (Vue 3.5+) · `initialMessages` option · `ToolCallManager` class · `fetchServerSentEvents` adapter
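The `toServerSentEventsStream` → `toResponseStream` rename means route handlers now construct the `Response` themselves. A minimal sketch of that manual wrapping using only web-standard APIs; the `sseBody` stream is a hand-rolled stand-in for what the library would return, and the event payload is invented:

```ts
// Stand-in for the ReadableStream a toResponseStream-style helper returns.
const encoder = new TextEncoder()
const sseBody = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(encoder.encode('data: {"delta":"hi"}\n\n'))
    controller.enqueue(encoder.encode('data: [DONE]\n\n'))
    controller.close()
  },
})

// Manual response creation: attach the headers SSE clients expect.
const response = new Response(sseBody, {
  headers: {
    'content-type': 'text/event-stream',
    'cache-control': 'no-cache',
  },
})

const text = await response.text()
console.log(text.endsWith('data: [DONE]\n\n')) // true
```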
Best Practices
- Import specific activity and adapter functions instead of entire namespaces to ensure optimal tree-shaking and minimize bundle size (source)

```ts
// Preferred
import { chat } from '@tanstack/ai'
import { openaiText } from '@tanstack/ai-openai'

// Avoid — pulls in all activities and adapters
import * as ai from '@tanstack/ai'
```
- Use `toServerSentEventsResponse` on the server to automatically handle SSE headers, protocol framing, and the `[DONE]` termination chunk (source)

```ts
export async function POST(req: Request) {
  const { messages } = await req.json()
  const stream = chat({ adapter: openaiText('gpt-5.2'), messages })
  return toServerSentEventsResponse(stream)
}
```
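For the NDJSON transport (`toHttpStream` on the server, `fetchHttpStream` on the client), events are framed as one JSON object per line instead of SSE frames. A runnable sketch of parsing that framing; the event shapes are invented for illustration and are not the library's actual wire format:

```ts
// NDJSON: one JSON object per newline-terminated line.
// Event shapes below are hypothetical, for illustration only.
const body =
  '{"type":"delta","text":"Hel"}\n' +
  '{"type":"delta","text":"lo"}\n' +
  '{"type":"done"}\n'

const events = body
  .split('\n')
  .filter((line) => line.length > 0)
  .map((line) => JSON.parse(line) as { type: string; text?: string })

// Reassemble the streamed text from the delta events.
const streamedText = events
  .filter((e) => e.type === 'delta')
  .map((e) => e.text)
  .join('')

console.log(streamedText) // "Hello"
```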
- Prefer the `fetchServerSentEvents` or `fetchHttpStream` connection adapters in `useChat` for built-in protocol parsing and state synchronization (source)
- Define tools using `toolDefinition` with Zod schemas to enable full end-to-end TypeScript inference and runtime validation (source)

```ts
import { z } from 'zod'

const getWeather = toolDefinition({
  name: 'get_weather',
  inputSchema: z.object({ city: z.string() }),
  outputSchema: z.object({ temp: z.number() })
})
```
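What the runtime-validation half of a Zod `inputSchema` buys you can be sketched without the library: reject tool-call arguments that do not match the declared shape. This hand-rolled check is purely illustrative, not the library's mechanism:

```ts
type WeatherInput = { city: string }

// Minimal stand-in for what inputSchema validation does at runtime:
// refuse arguments the model produced that don't match the schema.
function parseWeatherInput(raw: unknown): WeatherInput {
  const candidate = raw as { city?: unknown }
  if (typeof raw !== 'object' || raw === null || typeof candidate.city !== 'string') {
    throw new Error('invalid get_weather input')
  }
  return { city: candidate.city }
}

console.log(parseWeatherInput({ city: 'Oslo' }).city) // "Oslo"
```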
- Use `.client()` implementations for browser-only operations and pass the base `toolDefinition` to the server `chat()` call to trigger automatic execution (source)
- Group client tools with `clientTools()` and `createChatClientOptions()` to enable precise type narrowing for tool names and schemas in `messages` (source)

```ts
const tools = clientTools(uiTool.client(fn), storageTool.client(fn))
const options = createChatClientOptions({ connection, tools })
const { messages } = useChat(options) // message parts are now narrowed!
```
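The "narrowed" claim can be pictured with a plain discriminated union; the part shapes here are invented stand-ins for the types the library infers from your tool definitions:

```ts
// Hypothetical message-part union, standing in for the inferred types.
type Part =
  | { type: 'text'; text: string }
  | { type: 'tool-call'; toolName: 'get_weather'; input: { city: string } }

function describePart(part: Part): string {
  if (part.type === 'tool-call') {
    // Narrowed: TypeScript knows toolName and input.city exist here.
    return `calling ${part.toolName} for ${part.input.city}`
  }
  return part.text
}

console.log(describePart({ type: 'tool-call', toolName: 'get_weather', input: { city: 'Oslo' } }))
// "calling get_weather for Oslo"
```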
- Pass the model name directly to the adapter factory to enable model-specific type safety and autocomplete for `modelOptions` (source)

```ts
// TypeScript enforces options supported only by gpt-5
const stream = chat({
  adapter: openaiText('gpt-5'),
  modelOptions: { text: { type: 'json_schema', ... } }
})
```
- Subscribe to `aiEventClient` with `{ withEventTarget: true }` in production to capture internal events for observability and timeline reconstruction (source)
- Pass all related tools to a single `chat()` call to allow the model to autonomously manage multi-step reasoning cycles (Agentic Cycle) (source)
- Leverage Vue's reactivity by passing a reactive object to the `body` property of `useChat` to update request parameters without recreating the client

```ts
const model = ref('gpt-5.2')
const { sendMessage } = useChat({
  connection,
  body: computed(() => ({ model: model.value }))
})
```
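Why a reactive `body` works can be sketched without Vue: the request body is re-read at send time rather than captured once. A plain getter stands in for `computed` here, and the model names are just examples:

```ts
// Plain-TS stand-in for a computed() body: a getter re-evaluated per send.
let model = 'gpt-5.2'
const body = () => ({ model })

const firstRequest = JSON.stringify(body())
model = 'gpt-5.2-mini' // hypothetical model switch at runtime
const secondRequest = JSON.stringify(body())

console.log(firstRequest) // {"model":"gpt-5.2"}
console.log(secondRequest) // {"model":"gpt-5.2-mini"}
```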