integrating-ai

Install command
npx skills add https://github.com/pauloviccs/viccs_antigravity_skillscreator --skill integrating-ai


Skill Documentation

Integrating AI (Vercel AI SDK & Ollama)

When to use this skill

  • When the user mentions “AI”, “LLM”, “Chatbot”, “GPT”, or “Ollama”.
  • When building features like “Explain this”, “Generate text”, or “Chat with PDF”.
  • When implementing streaming text responses.

Workflow

  1. Installation:
    • npm install ai @ai-sdk/openai @ai-sdk/anthropic @ai-sdk/react ollama-ai-provider
  2. Backend Route (Edge/Serverless):
    • Create an API route using streamText from ai.
  3. Frontend Hook:
    • Use useChat or useCompletion from @ai-sdk/react to handle UI state automatically.
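
The messages array that the frontend hook sends to the backend route follows a simple role/content shape. A minimal sketch (the withUserTurn helper is our own illustration, not part of the SDK):

```typescript
// Minimal sketch of the chat message shape exchanged between the
// useChat hook and the API route: a role plus plain-text content.
type ChatMessage = {
  role: 'system' | 'user' | 'assistant';
  content: string;
};

// Hypothetical helper: append a user turn to an existing history
// without mutating it, as the hook does internally on each submit.
function withUserTurn(history: ChatMessage[], text: string): ChatMessage[] {
  return [...history, { role: 'user', content: text }];
}

const history = withUserTurn([], 'Explain streaming in one sentence.');
console.log(history.length); // prints 1
```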

Instructions

1. API Route (Next.js App Router)

app/api/chat/route.ts

import { openai } from '@ai-sdk/openai'; // or 'ollama-ai-provider'
import { streamText } from 'ai';

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({ // streamText returns immediately; no await needed
    model: openai('gpt-4-turbo'), // or ollama('llama3')
    messages,
    system: 'You are a helpful assistant.',
  });

  return result.toDataStreamResponse();
}
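
On the wire, the route above streams its reply as a sequence of chunks that the client reads from the response body; useChat does this parsing for you, but the consumption loop itself is plain Web Streams. A self-contained sketch (a locally constructed ReadableStream stands in for the body of a fetch('/api/chat') response):

```typescript
// Read a streamed response body to completion, decoding bytes to text
// chunk by chunk, the same way a chat UI accumulates a streamed reply.
async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let out = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    out += decoder.decode(value, { stream: true });
  }
  return out;
}

// Stand-in stream emitting two chunks, as a streaming route would.
const demo = new ReadableStream<Uint8Array>({
  start(controller) {
    const enc = new TextEncoder();
    controller.enqueue(enc.encode('Hello, '));
    controller.enqueue(enc.encode('world'));
    controller.close();
  },
});

readAll(demo).then(text => console.log(text)); // prints "Hello, world"
```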

2. Frontend UI (React)

components/Chat.tsx

'use client';
import { useChat } from '@ai-sdk/react'; // formerly 'ai/react', deprecated

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map(m => (
        <div key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}

3. Using Ollama (Local Models)

To run entirely on open-source models locally:

  1. Run Ollama locally: ollama run llama3

  2. Change the model provider in the API route:

    import { ollama } from 'ollama-ai-provider';
    // ...
    model: ollama('llama3'),
    
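A common pattern is to pick the provider from an environment flag so the same route code hits OpenAI in production and Ollama in development. A hedged sketch (USE_OLLAMA is our own convention, not something the AI SDK defines):

```typescript
// Choose a provider and model id from the environment. The route would
// then construct the model with ollama(choice.modelId) or openai(choice.modelId).
type ModelChoice = { provider: 'openai' | 'ollama'; modelId: string };

function pickModel(env: Record<string, string | undefined>): ModelChoice {
  return env.USE_OLLAMA === '1'
    ? { provider: 'ollama', modelId: 'llama3' }       // local, no API key needed
    : { provider: 'openai', modelId: 'gpt-4-turbo' }; // hosted, needs OPENAI_API_KEY
}

console.log(pickModel(process.env).provider);
```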

Resources