This guide covers the Flash Turbo CMS AI Chat system, including its architecture (streamText, useChat, MCP integration) and usage.

## Access AI Chat Manager

Navigate to `/manager/ai-chat`.

## Use Quick Actions

## Export Chat History

## Add AI Chat Overlay to Page

## API Endpoint

`/api/ai/chat-mcp`

## React Component Usage
```tsx
import { useChat } from "@ai-sdk/react";
import { DefaultChatTransport } from "ai";

export function ChatInterface() {
  const { messages, status, sendMessage } = useChat({
    transport: new DefaultChatTransport({
      api: "/api/ai/chat-mcp",
    }),
  });

  return (
    <div>
      {messages.map((msg) => (
        <div key={msg.id}>
          {msg.parts.map((part, i) => {
            if (part.type === "text") return <p key={i}>{part.text}</p>;
            if (part.type === "tool-call")
              return <div key={i} className="tool-call">{part.toolName}</div>;
            if (part.type === "tool-result")
              return (
                <div key={i} className="tool-result">
                  {JSON.stringify(part.result)}
                </div>
              );
            return null;
          })}
        </div>
      ))}
      <input
        disabled={status !== "ready"}
        onKeyDown={(e) => {
          if (e.key === "Enter" && e.currentTarget.value.trim()) {
            sendMessage({
              parts: [{ type: "text", text: e.currentTarget.value }],
            });
            e.currentTarget.value = ""; // clear the input after sending
          }
        }}
      />
    </div>
  );
}
```
## Key Files

| Route / Feature | File |
| --- | --- |
| `/api/ai/chat-mcp` | `apps/multi-store/src/app/(misc)/api/ai/chat-mcp/route.ts` |
| `/manager/ai-chat` | `apps/multi-store/src/app/(manager)/[slug]/manager/ai-chat/page.tsx` |
| Overlay component | `apps/multi-store/src/components/ai/overlay/AIChatOverlayModern.tsx` |
| Overlay block config | `apps/multi-store/src/blocks/AIChatOverlayModern/config.ts` |
| Manager sidebar | `apps/multi-store/src/components/manager/ManagerSidebar.tsx` |
| MCP server | `/api/llm/mcp` |

## Request Flow

1. User sends message to `/api/ai/chat-mcp`
2. Route creates MCP client connection
3. Fetches available tools from MCP server
4. Sends messages + tools to LLM
5. LLM decides which tools to call
6. Tools execute via MCP server
7. Results returned to LLM
8. Response streamed back to client
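The loop in steps 4–7 can be sketched as a self-contained simulation. This is illustrative only: the real route uses the AI SDK's `streamText` with tools fetched from the MCP server, and the `Llm`, `runChatTurn`, and tool names below are hypothetical stand-ins.

```typescript
type ToolCall = { toolName: string; args: Record<string, unknown> };
type LlmTurn =
  | { type: "tool-call"; call: ToolCall }
  | { type: "text"; text: string };

// A toy "LLM": given the conversation so far, either request a tool or answer.
type Llm = (history: string[]) => LlmTurn;

async function runChatTurn(
  userMessage: string,
  llm: Llm,
  tools: Record<string, (args: Record<string, unknown>) => Promise<unknown>>,
): Promise<string> {
  const history: string[] = [`user: ${userMessage}`]; // step 1
  for (;;) {
    const turn = llm(history); // steps 4-5: LLM sees messages + decides on tools
    if (turn.type === "text") return turn.text; // step 8: final response
    const tool = tools[turn.call.toolName];
    if (!tool) throw new Error(`unknown tool: ${turn.call.toolName}`);
    const result = await tool(turn.call.args); // step 6: tool executes via MCP
    history.push(`tool-result: ${JSON.stringify(result)}`); // step 7: fed back
  }
}
```

The loop runs until the model stops requesting tools, which mirrors how multi-step tool calling terminates.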
```ts
type Message = {
  id: string;
  role: "user" | "assistant";
  content?: string;
  parts?: MessagePart[];
};

type MessagePart =
  | { type: "text"; text: string }
  | { type: "tool-call"; toolName: string; args: any; id?: string }
  | { type: "tool-result"; toolName: string; result: any; id?: string }
  | { type: "file"; name: string; url: string; mimeType?: string };
```
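Because `MessagePart` is a discriminated union, TypeScript narrows each branch on `part.type`. A hypothetical `partToText` helper (not part of the shipped API; useful e.g. for chat-history export) might look like this, with the types repeated for self-containment:

```typescript
// MessagePart as defined above, repeated so this snippet compiles on its own.
type MessagePart =
  | { type: "text"; text: string }
  | { type: "tool-call"; toolName: string; args: any; id?: string }
  | { type: "tool-result"; toolName: string; result: any; id?: string }
  | { type: "file"; name: string; url: string; mimeType?: string };

// Render one part to a plain-text summary (illustrative helper).
function partToText(part: MessagePart): string {
  switch (part.type) {
    case "text":
      return part.text;
    case "tool-call":
      return `[calling ${part.toolName}(${JSON.stringify(part.args)})]`;
    case "tool-result":
      return `[${part.toolName}: ${JSON.stringify(part.result)}]`;
    case "file":
      return `[file: ${part.name}]`;
    default:
      return "[unknown part]";
  }
}
```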
**User Text Message**

```json
{
  "type": "text",
  "text": "Show me all products with 'organic' in the name"
}
```

**Tool Call (Assistant)**

```json
{
  "type": "tool-call",
  "toolName": "find_products",
  "args": { "search": "organic", "limit": 10 },
  "id": "call_123"
}
```

**Tool Result**

```json
{
  "type": "tool-result",
  "toolName": "find_products",
  "result": {
    "docs": [...],
    "totalDocs": 5
  },
  "id": "call_123"
}
```
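Note that the call and its result share the same `id` (`call_123` above). A small helper can pair them up, e.g. to render a tool call together with its outcome; this is an illustrative sketch, not shipped code:

```typescript
type CallPart = { type: "tool-call"; toolName: string; args: any; id?: string };
type ResultPart = { type: "tool-result"; toolName: string; result: any; id?: string };
type Part = CallPart | ResultPart | { type: "text"; text: string };

// Pair each tool-call part with its tool-result by shared `id`.
function pairToolCalls(
  parts: Part[],
): Array<{ call: CallPart; result?: ResultPart }> {
  const results = new Map<string, ResultPart>();
  for (const p of parts)
    if (p.type === "tool-result" && p.id) results.set(p.id, p);
  return parts
    .filter((p): p is CallPart => p.type === "tool-call")
    .map((call) => ({
      call,
      result: call.id ? results.get(call.id) : undefined,
    }));
}
```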
- `streamText` from the `ai` package
- `useChat` hook for the UI

**Migration**: Both systems can run in parallel. The legacy system is marked as "Legacy" in the sidebar; migrate users gradually over 1-2 weeks.
**Example: list products**

```text
User: "Show me all products"
Tool Call: find_products({ limit: 20 })
Tool Result: Returns 20 products with details
Response: "Here are the products in your system..."
```

**Example: create an item**

```text
User: "Create an item called 'Organic Coffee' with SKU 'ORG-001'"
Tool Call: create_item({ name: "Organic Coffee", sku: "ORG-001" })
Tool Result: Returns created item with ID
Response: "Created item 'Organic Coffee' (ID: xyz)"
```

**Example: count by search**

```text
User: "How many products contain 'coffee'?"
Tool Call: find_products({ search: "coffee" })
Tool Result: Returns matching products
Response: "Found 5 products containing 'coffee'"
```
```bash
OPENAI_API_KEY=sk-...        # OpenAI API key for LLM
MCP_SERVER_URL=/api/llm/mcp  # MCP server endpoint
AI_CHAT_MODEL=gpt-4o         # LLM model to use
AI_CHAT_TIMEOUT=30000        # Tool timeout in ms
```
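One way to consume these variables is a small config loader with fallbacks. This is a sketch: the fallback values simply mirror the examples above and are an assumption, not guaranteed application behavior.

```typescript
type AiChatConfig = {
  apiKey: string | undefined;
  mcpServerUrl: string;
  model: string;
  timeoutMs: number;
};

// Read the AI chat env vars, falling back to the documented example values.
function loadAiChatConfig(
  env: Record<string, string | undefined> = process.env,
): AiChatConfig {
  return {
    apiKey: env.OPENAI_API_KEY, // required in practice; no safe default
    mcpServerUrl: env.MCP_SERVER_URL ?? "/api/llm/mcp",
    model: env.AI_CHAT_MODEL ?? "gpt-4o",
    timeoutMs: Number(env.AI_CHAT_TIMEOUT ?? "30000"),
  };
}
```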
Customization options:

- Colors
- Icons
- Quick actions: add pre-configured prompts in config:

```ts
const quickActions = [
  { label: "All Products", prompt: "Show me all products" },
  { label: "Statistics", prompt: "Give me system statistics" },
  // Add more...
];
```
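A quick action can be wired to the chat by converting its prompt into the message shape that `sendMessage` expects. The helper name below is hypothetical:

```typescript
type QuickAction = { label: string; prompt: string };

// Turn a quick action into a { parts: [...] } message for sendMessage.
function quickActionToMessage(action: QuickAction) {
  return { parts: [{ type: "text" as const, text: action.prompt }] };
}
```

In the UI this might be used as `<button onClick={() => sendMessage(quickActionToMessage(action))}>{action.label}</button>`.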
```bash
bun test docs/ai-chat/  # Run AI chat tests
```
- **API Route**: `/api/ai/chat-mcp` returns a streaming response
- **Manager Page**: `/manager/ai-chat`
- **Overlay Component**
```bash
# 1. Start dev server
bun run dev
# 2. Navigate to /manager/ai-chat
# 3. Test basic message
# 4. Test quick action
# 5. Test export
# 6. Test overlay on test page
```
Related endpoints and documents:

- `/api/ai/chat-mcp`
- `/api/llm/mcp`
- AI_CHAT_MODERN_GUIDE.md (archive)
- AI_CHAT_MCP_SERVER_SETUP.md (archive)
- AI_CHAT_WORK_PENDING.md (archive)

Last Updated: October 27, 2025 (Consolidated)
Version: 1.1.0 (Merged from 6 files)
Status: ✅ Production Ready
Maintenance: Review quarterly for deprecated features