feat: add OpenRouter + direct OpenAI as alternative LLM providers #442
bettercallzaal wants to merge 3 commits into recoupable:test from
Conversation
Adds a `createModel()` helper that resolves model strings through whichever provider is configured, with priority:

1. Vercel AI Gateway (existing production behavior)
2. OpenRouter (one key, many models)
3. Direct OpenAI (strip the provider prefix)

This lets contributors run the app locally without needing a Vercel AI Gateway API key: just set `OPENROUTER_API_KEY` or `OPENAI_API_KEY` instead. All AI SDK call sites are updated to use `createModel()`. Direct OpenAI API calls (video/transcription/image generation) are unchanged and remain provider-specific.

Also fixes the bare model string `"gemini-2.5-pro"` to use the correct provider-prefixed format `"google/gemini-2.5-pro"`.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
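The priority cascade described above can be sketched as a small resolver. This is a hedged reconstruction, not the actual `lib/ai/createModel.ts`: the `resolveModelRoute` name, the `Env` shape, and the return type are illustrative assumptions (the real helper returns an AI SDK `LanguageModel` rather than a plain route object).

```typescript
// Hypothetical sketch of the createModel() priority cascade. Instead of
// constructing an AI SDK LanguageModel, this version only decides which
// provider would handle the request and what model ID it would receive.

type Env = {
  VERCEL_AI_GATEWAY_API_KEY?: string;
  OPENROUTER_API_KEY?: string;
  OPENAI_API_KEY?: string;
};

type Route = { provider: "gateway" | "openrouter" | "openai"; modelId: string };

function resolveModelRoute(modelId: string, env: Env): Route {
  // 1. Vercel AI Gateway: existing production behavior, model ID passed through.
  if (env.VERCEL_AI_GATEWAY_API_KEY) return { provider: "gateway", modelId };
  // 2. OpenRouter: one key, many models; provider-prefixed IDs are native here.
  if (env.OPENROUTER_API_KEY) return { provider: "openrouter", modelId };
  // 3. Direct OpenAI: strip the "openai/" prefix before calling the SDK.
  if (env.OPENAI_API_KEY) {
    return { provider: "openai", modelId: modelId.replace(/^openai\//, "") };
  }
  throw new Error(
    "No LLM provider configured: set VERCEL_AI_GATEWAY_API_KEY, OPENROUTER_API_KEY, or OPENAI_API_KEY"
  );
}
```

Note that in this sketch, as in the PR as reviewed below, a non-OpenAI ID like `"google/gemini-2.5-pro"` falls through to the direct-OpenAI branch unchanged — which is exactly the fallback gap the reviewers flag.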
@bettercallzaal is attempting to deploy a commit to the Recoupable Team on Vercel. A member of the Team first needs to authorize it.
4 issues found across 15 files
Confidence score: 3/5
- There is concrete regression risk in fallback behavior: `createModel("google/gemini-2.5-pro")` can be passed directly to OpenAI in `lib/chat/toolChains/toolChains.ts`, which is likely to fail at runtime with an unknown-model error when only `OPENAI_API_KEY` is configured.
- A similarly high-impact path exists in `lib/agents/ImageGenerationAgent/createImageGenerationAgent.ts`, where the provider-specific `google/gemini-3-pro-image` can be routed through direct OpenAI and fail for custom image-generation flows.
- `lib/ai/createModel.ts` currently allows non-OpenAI model IDs to reach the OpenAI SDK, so errors are confusing and harder to diagnose instead of failing early with a clear message; this keeps risk in the medium range rather than merge-blocking.
- Pay close attention to `lib/chat/toolChains/toolChains.ts`, `lib/agents/ImageGenerationAgent/createImageGenerationAgent.ts`, `lib/ai/createModel.ts`, and `lib/ai/getAvailableModels.ts`: provider routing and fallback error handling may mask or trigger runtime failures.
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="lib/chat/toolChains/toolChains.ts">
<violation number="1" location="lib/chat/toolChains/toolChains.ts:22">
P1: When only `OPENAI_API_KEY` is set, `createModel("google/gemini-2.5-pro")` passes `"google/gemini-2.5-pro"` verbatim to the OpenAI SDK, which will reject it as an unknown model. The direct-OpenAI fallback in `createModel` only strips the `openai/` prefix and has no mapping or error for non-OpenAI model IDs.</violation>
</file>
<file name="lib/ai/getAvailableModels.ts">
<violation number="1" location="lib/ai/getAvailableModels.ts:36">
P2: When the gateway key is set but the API call fails, returning `DEFAULT_MODELS` silently masks gateway errors. Consider logging the error and/or returning `[]` to preserve the previous failure-is-visible semantics, so operators notice gateway misconfiguration.</violation>
</file>
<file name="lib/ai/createModel.ts">
<violation number="1" location="lib/ai/createModel.ts:26">
P2: Non-OpenAI model IDs (e.g. `"google/gemini-2.5-pro"`) silently pass through to the OpenAI SDK, producing a confusing API error. Consider throwing an explicit error when the direct-OpenAI fallback receives a model that isn't prefixed with `"openai/"`.</violation>
</file>
<file name="lib/agents/ImageGenerationAgent/createImageGenerationAgent.ts">
<violation number="1" location="lib/agents/ImageGenerationAgent/createImageGenerationAgent.ts:11">
P1: Custom agent: **Flag AI Slop and Fabricated Changes**
This Google-specific image generation model (`google/gemini-3-pro-image`) is provider-dependent and will fail when routed through direct OpenAI (`openai("google/gemini-3-pro-image")`). The PR description itself acknowledges image generation is provider-specific and "can't route through OpenRouter," yet this file was included in the mechanical `createModel()` sweep. Keep the original string literal here, or guard this call site so it only resolves via the gateway.</violation>
</file>
Architecture diagram
```mermaid
sequenceDiagram
    participant Feature as AI Features (Agents, Utilities, Evals)
    participant Factory as NEW: createModel() (lib/ai/createModel.ts)
    participant SDK as AI SDK / Providers
    participant Gateway as Vercel AI Gateway
    participant OpenRouter as NEW: OpenRouter API
    participant OpenAI as NEW: OpenAI API
    Note over Feature, OpenAI: LLM Request Flow (Priority Cascade)
    Feature->>Factory: createModel(modelId)
    alt VERCEL_AI_GATEWAY_API_KEY is set
        Factory->>SDK: Use gateway(modelId)
        SDK-->>Factory: LanguageModel (Gateway)
    else OPENROUTER_API_KEY is set
        Factory->>SDK: NEW: createOpenRouter()
        Factory->>SDK: Use openrouter(modelId)
        SDK-->>Factory: LanguageModel (OpenRouter)
    else OPENAI_API_KEY is set
        Factory->>Factory: NEW: Strip "openai/" prefix if present
        Factory->>SDK: Use openai(bareModel)
        SDK-->>Factory: LanguageModel (Direct)
    else No keys configured
        Factory-->>Feature: Throw Error
    end
    Factory-->>Feature: Return LanguageModel
    Feature->>SDK: generateText() / ToolLoopAgent / generateObject()
    alt Routing via Gateway
        SDK->>Gateway: POST /v1/chat/completions
        Gateway-->>SDK: Response
    else Routing via OpenRouter
        SDK->>OpenRouter: NEW: POST /api/v1/chat/completions
        OpenRouter-->>SDK: Response
    else Routing via Direct OpenAI
        SDK->>OpenAI: NEW: POST /v1/chat/completions
        OpenAI-->>SDK: Response
    end
    SDK-->>Feature: Final AI Output (Text/Object/Tool Call)
    Note over Feature, Gateway: Model Discovery Flow
    Feature->>SDK: CHANGED: getAvailableModels()
    alt VERCEL_AI_GATEWAY_API_KEY is set
        SDK->>Gateway: Fetch available models
        Gateway-->>Feature: Model list
    else Gateway not configured
        SDK-->>Feature: NEW: Return hardcoded DEFAULT_MODELS list
    end
```
```typescript
  web_deep_research: "openai/gpt-5.4-mini",
  create_knowledge_base: "openai/gpt-5.4-mini",
  send_email: "openai/gpt-5.4-mini",
  update_account_info: createModel("google/gemini-2.5-pro"),
```
P1: When only OPENAI_API_KEY is set, createModel("google/gemini-2.5-pro") passes "google/gemini-2.5-pro" verbatim to the OpenAI SDK, which will reject it as an unknown model. The direct-OpenAI fallback in createModel only strips the openai/ prefix and has no mapping or error for non-OpenAI model IDs.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At lib/chat/toolChains/toolChains.ts, line 22:
<comment>When only `OPENAI_API_KEY` is set, `createModel("google/gemini-2.5-pro")` passes `"google/gemini-2.5-pro"` verbatim to the OpenAI SDK, which will reject it as an unknown model. The direct-OpenAI fallback in `createModel` only strips the `openai/` prefix and has no mapping or error for non-OpenAI model IDs.</comment>
<file context>
@@ -18,22 +19,22 @@ export type PrepareStepResult = {
- web_deep_research: "openai/gpt-5.4-mini",
- create_knowledge_base: "openai/gpt-5.4-mini",
- send_email: "openai/gpt-5.4-mini",
+ update_account_info: createModel("google/gemini-2.5-pro"),
+ get_spotify_search: createModel("openai/gpt-5.4-mini"),
+ update_artist_socials: createModel("openai/gpt-5.4-mini"),
</file context>
- Revert createImageGenerationAgent to use a bare model string, since google/gemini-3-pro-image is provider-specific and can't route through OpenRouter or direct OpenAI
- Add an explicit error when the direct OpenAI fallback receives a non-OpenAI model (e.g. "google/..." or "anthropic/...") instead of silently passing it to the OpenAI API

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
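The explicit-error behavior this commit describes could look roughly like the guard below. This is a hedged approximation: the `toDirectOpenAiModel` name is hypothetical, and the actual check in `lib/ai/createModel.ts` may differ in wording and placement.

```typescript
// Hypothetical guard for the direct-OpenAI fallback: reject provider-prefixed
// model IDs that don't belong to OpenAI instead of forwarding them verbatim
// to the OpenAI SDK, where they would fail with a confusing unknown-model error.

function toDirectOpenAiModel(modelId: string): string {
  const slash = modelId.indexOf("/");
  if (slash !== -1) {
    const provider = modelId.slice(0, slash);
    if (provider !== "openai") {
      throw new Error(
        `Model "${modelId}" cannot be used with the direct OpenAI fallback; ` +
          `only "openai/..." IDs are supported. Configure OPENROUTER_API_KEY ` +
          `or the Vercel AI Gateway for other providers.`
      );
    }
    // "openai/gpt-5.4-mini" -> "gpt-5.4-mini"
    return modelId.slice(slash + 1);
  }
  // Bare IDs pass through unchanged.
  return modelId;
}
```

Failing early here turns a puzzling remote API error into a clear local configuration message, which is what the P2 review comment on `lib/ai/createModel.ts` asked for.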
- Add VERCEL_OIDC_TOKEN check to gateway detection in createModel
- Return [] on gateway failure instead of DEFAULT_MODELS to avoid masking gateway errors
- Add pricing metadata to fallback model entries

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
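The `getAvailableModels()` behavior after this commit can be sketched as follows. The fetcher is injected so the routing logic is testable without a network call; the `DEFAULT_MODELS` entries, pricing fields, and function signature are illustrative assumptions, not the repo's actual code.

```typescript
// Hypothetical sketch of getAvailableModels() after the fix above:
// - no gateway configured -> hardcoded DEFAULT_MODELS (local dev fallback)
// - gateway configured and healthy -> live model list
// - gateway configured but failing -> [] so the failure stays visible

type ModelInfo = { id: string; pricing?: { input: number; output: number } };

// Illustrative fallback list; the real DEFAULT_MODELS differs.
const DEFAULT_MODELS: ModelInfo[] = [
  { id: "openai/gpt-5.4-mini", pricing: { input: 0, output: 0 } },
];

async function getAvailableModels(
  gatewayConfigured: boolean,
  fetchGatewayModels: () => Promise<ModelInfo[]>
): Promise<ModelInfo[]> {
  if (!gatewayConfigured) return DEFAULT_MODELS;
  try {
    return await fetchGatewayModels();
  } catch {
    // Returning [] rather than DEFAULT_MODELS keeps gateway misconfiguration
    // visible to operators instead of silently masking it, per the P2 review.
    return [];
  }
}
```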
```typescript
  if (process.env.OPENROUTER_API_KEY) {
    const openrouter = createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });
    return openrouter(modelId);
  }
```
KISS principle
- Why add OpenRouter when we already support AI Gateway?
Summary
- New `createModel()` helper (`lib/ai/createModel.ts`) that resolves model strings through whichever provider is configured
- All AI SDK call sites updated to use `createModel()` (15 files)
- New `@openrouter/ai-sdk-provider` dependency
- Fallback in `getAvailableModels()` for non-gateway environments
- Fixed bare model string `"gemini-2.5-pro"` → `"google/gemini-2.5-pro"` in `TOOL_MODEL_MAP`

Motivation
Contributors without Vercel AI Gateway access can't run the chat/agent features locally. This lets devs run with just an `OPENROUTER_API_KEY` (one key, many models) or a direct `OPENAI_API_KEY`.

No breaking changes: existing production deployments using the gateway are unaffected.
What's NOT changed

- The `"ai"` package

Test plan
- `VERCEL_AI_GATEWAY_API_KEY` → verify existing behavior unchanged
- `OPENROUTER_API_KEY` → verify chat/agents work through OpenRouter
- `OPENAI_API_KEY` → verify chat/agents work through OpenAI directly
- `pnpm build` passes
- `pnpm test` passes

🤖 Generated with Claude Code
Summary by cubic
Add OpenRouter and direct OpenAI as alternative LLM providers via a new `createModel()` helper, so local dev works without the Vercel AI Gateway. Existing gateway behavior and media generation APIs remain unchanged.

New Features
- `createModel(modelId)` routes by priority: Vercel AI Gateway → OpenRouter → OpenAI; updated AI SDK call sites to use it.
- `getAvailableModels()` returns a small default list when no gateway is configured, with pricing metadata.
- Added `@openrouter/ai-sdk-provider`.

Bug Fixes
- The direct OpenAI fallback now errors on non-`openai/*` models.
- Gateway detection now also checks `VERCEL_OIDC_TOKEN`.
- On gateway failure, `getAvailableModels()` now returns `[]` instead of the default list.

Written for commit 8f6b5b1. Summary will update on new commits.