AI Chat Integration
Lightweight chat state via React Context and a provider. UI-only: bring your own API for responses.
Provider
File: src/providers/chat-provider.tsx
Exposes: messages, addMessage(message), clearMessages()
```tsx
import { ChatProvider, useChat } from "@/providers/chat-provider"

function Ask() {
  const { messages, addMessage, clearMessages } = useChat()
  return (
    <button onClick={() => addMessage({ role: "user", content: "Find flights to Tokyo" })}>
      Ask AI
    </button>
  )
}
```

Wrap your app (e.g., in App.tsx or a layout) with <ChatProvider>.
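A minimal sketch of that wiring, assuming an App.tsx entry component (the Ask import path is illustrative):

```tsx
// App.tsx (sketch): wrap the tree once so any descendant can call useChat()
import { ChatProvider } from "@/providers/chat-provider"
import { Ask } from "./ask" // illustrative path

export default function App() {
  return (
    <ChatProvider>
      <Ask />
    </ChatProvider>
  )
}
```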
Hook
src/hooks/use-chat.ts wraps provider access and throws if used outside the provider.
```ts
import { useContext } from "react"
import { ChatContext } from "@/contexts/chat-context"

export function useChat() {
  const ctx = useContext(ChatContext)
  if (!ctx) throw new Error("useChat must be used within a ChatProvider")
  return ctx
}
```
UI Container
<ChatContainer /> handles auto-scroll, the typing effect, and message rendering.
```tsx
// Minimal usage inside a component
import { useRef } from "react"
import { ChatContainer } from "@/components/custom/chat-container"
import { useChat } from "@/providers/chat-provider"

function ChatPanel() {
  const { messages } = useChat()
  const ref = useRef<HTMLDivElement | null>(null)
  return <ChatContainer messages={messages} containerRef={ref} />
}
```

Connect Your Backend
Replace the internal request with your API call, then append streamed chunks:
```ts
// Pseudo-code: stream the response and surface it through the chat state
async function sendToAI(input: string) {
  addMessage({ role: "user", content: input })

  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ input }),
  })

  const reader = res.body!.getReader()
  const decoder = new TextDecoder()
  let partial = ""

  for (;;) {
    const { done, value } = await reader.read()
    if (done) break
    partial += decoder.decode(value, { stream: true })
    // NOTE: addMessage() appends a new message on every call; in a real integration,
    // update the last assistant message with `partial` instead, or add once after the loop.
    addMessage({ role: "assistant", content: partial })
  }
}
```

Tip: for non-streamed responses, call addMessage once when the request resolves.
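A sketch of that non-streamed variant, assuming the endpoint returns JSON shaped like { reply: string } (adjust to your API):

```ts
// Non-streamed variant (sketch); the { reply } response shape is an assumption
async function sendToAIOnce(input: string) {
  addMessage({ role: "user", content: input })
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ input }),
  })
  const { reply } = await res.json()
  addMessage({ role: "assistant", content: reply })
}
```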
Notes
- UI tokens: uses bg-bubble, bg-layout, and text-foreground/* for theming.
- Keep messages typed (role: "user" | "assistant").
- Clear state with clearMessages() when starting a new session (see the sketch below).
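For example, a hypothetical "New chat" button that resets the session:

```tsx
// Hypothetical "New chat" control; resets the stored messages via clearMessages()
import { useChat } from "@/providers/chat-provider"

function NewChatButton() {
  const { clearMessages } = useChat()
  return <button onClick={clearMessages}>New chat</button>
}
```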