Give your AI agents fluent multilingual support.
Fluent is an open-source control plane that takes your English-built AI agent and makes it work beautifully in every language. It uses Lingo.dev to generate locale-specific runtime guidance — glossaries, FAQ overrides, escalation rules, and system instructions — then measures the actual improvement with automated evaluations.
Your agent in English → Fluent + Lingo.dev → Your agent in every language
- Create an agent — Define your agent's instructions, model, and target locales.
- Run a baseline — Fluent evaluates your agent across locales and finds where it breaks.
- Generate fixes — Lingo.dev analyzes the failures and generates locale-specific improvements (glossary, FAQ overrides, tone, escalation rules).
- Replay — Re-run the same evaluation with fixes applied and measure the uplift.
- Publish — Ship the improved agent as an immutable version your app can consume.
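The replay step reports uplift per locale. A minimal sketch of what that comparison amounts to, with purely illustrative scores (Fluent's actual evaluation metrics and output format aren't specified here):

```typescript
// Hypothetical per-locale scores from a baseline run and a replay run
// (values illustrative; Fluent's real evaluation output may differ).
const baseline: Record<string, number> = { es: 0.62, ja: 0.55, hi: 0.48 };
const replay: Record<string, number> = { es: 0.81, ja: 0.77, hi: 0.7 };

// Uplift = replay score minus baseline score, per locale.
const uplift = Object.fromEntries(
  Object.keys(baseline).map((locale) => [
    locale,
    +(replay[locale] - baseline[locale]).toFixed(2),
  ]),
);

console.log(uplift); // { es: 0.19, ja: 0.22, hi: 0.22 }
```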
Lingo.dev is the engine behind Fluent's locale-specific fixes. When you click "Generate localized fixes", Fluent sends your agent's context to Lingo.dev, which returns:
- Localized system instructions — Tone, formality, and phrasing adapted for the target locale
- Glossary locks — Branded terms and operational language that must stay consistent
- FAQ overrides — Locale-specific answers for common questions
- Escalation rules — When and how to hand off to humans in each locale
- Forbidden terms — Words and phrases to avoid in each language
These aren't generic translations. Lingo.dev understands your agent's domain and generates guidance that actually improves answer quality.
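The guidance comes back as a per-locale pack. The shape below is purely illustrative — the field names are assumptions, not the SDK's actual schema — it just maps the five categories above onto one object:

```typescript
// Illustrative locale pack for Japanese. Field names and contents are
// assumptions for demonstration, not Fluent's real data model.
const jaPack = {
  locale: "ja",
  systemInstructions: "Use polite keigo; keep sentences concise.",
  glossaryLocks: { Fluent: "Fluent" }, // branded terms stay untranslated
  faqOverrides: [
    {
      question: "How do I reset my password?",
      answer: "Open Settings and choose Reset password.",
    },
  ],
  escalationRules: ["Hand off to a human agent for billing disputes."],
  forbiddenTerms: ["casual slang abbreviations"],
};

console.log(jaPack.locale); // "ja"
```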
```shell
# Install
bun install

# Set up environment
cp .env.example .env.local
```

Add your keys to `.env.local`:

```shell
CONVEX_DEPLOYMENT=your-deployment
NEXT_PUBLIC_CONVEX_URL=https://your-deployment.convex.cloud
LINGO_API_KEY=your-lingo-dev-key   # Powers locale fix generation
OPENAI_API_KEY=your-openai-key     # For agent invocation
```

```shell
# Start
bun run dev
```

Install the SDK:

```shell
npm install @fluent/sdk
```

Drop in the chat component:
```tsx
import { FluentAgentChat } from "@fluent/sdk";

<FluentAgentChat
  agentId="your-agent-id"
  apiBaseUrl="http://localhost:3000"
  locale="ja"
/>
```

Or use the headless client:
```tsx
import { createFluentClient } from "@fluent/sdk";

const fluent = createFluentClient({ apiBaseUrl: "http://localhost:3000" });

const result = await fluent.invokePublishedAgent("your-agent-id", {
  locale: "ja",
  messages: [{ role: "user", content: "How do I reset my password?" }],
});

console.log(result.text); // Localized response
console.log(result.interventionSummary); // What Lingo.dev fixes were applied
```

Or bring your own LLM — extract Fluent's improvements as a system prompt:
```tsx
import { createFluentClient, composeSystemPrompt } from "@fluent/sdk";

const fluent = createFluentClient({ apiBaseUrl: "..." });
const agent = await fluent.getPublishedAgent("your-agent-id");
const pack = agent.version.packs.find((p) => p.locale === "ja");

// Use with any LLM provider
const systemPrompt = composeSystemPrompt(pack, {
  baseInstructions: agent.agent.baseInstructions,
  locale: "ja",
});
```

```
apps/web/       Next.js dashboard + API routes
packages/sdk/   SDK with typed client, React component, and prompt composition
convex/         Schema, queries, mutations, and backend orchestration
```
English (source), Spanish, Japanese, Hindi — with more coming via Lingo.dev.
```shell
bun run typecheck   # Type check
bun run build       # Production build
```

- Lingo.dev — Locale-specific fix generation
- Next.js 15 — Dashboard and API
- Convex — Real-time backend
- OpenAI / Ollama — LLM providers
MIT