feat: add OpenRouter + direct OpenAI as alternative LLM providers#442

Open
bettercallzaal wants to merge 3 commits into recoupable:test from
bettercallzaal:feat/openrouter-provider

Conversation

Contributor

@bettercallzaal commented Apr 15, 2026

Summary

  • Adds createModel() helper (lib/ai/createModel.ts) that resolves model strings through whichever provider is configured
  • Priority cascade: Vercel AI Gateway → OpenRouter → direct OpenAI
  • All AI SDK call sites updated to use createModel() (15 files)
  • Adds @openrouter/ai-sdk-provider dependency
  • Adds fallback model list in getAvailableModels() for non-gateway environments
  • Fixes bare "gemini-2.5-pro" → "google/gemini-2.5-pro" in TOOL_MODEL_MAP
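The priority cascade described above can be sketched as a small resolver. This is a hypothetical reconstruction, not the committed lib/ai/createModel.ts: the actual provider calls (gateway(), createOpenRouter(), openai()) are elided so only the branch logic remains, and the env-key names are the ones the PR mentions.

```typescript
// Hypothetical sketch of the priority cascade in createModel().
// The real helper returns a LanguageModel from the chosen provider;
// here only branch selection is shown, so it runs without the SDKs.
type Provider = "gateway" | "openrouter" | "openai";

function resolveProvider(env: Record<string, string | undefined>): Provider {
  // 1. Vercel AI Gateway: existing production behavior
  //    (VERCEL_OIDC_TOKEN also counts per the later detection fix)
  if (env.VERCEL_AI_GATEWAY_API_KEY || env.VERCEL_OIDC_TOKEN) return "gateway";
  // 2. OpenRouter: one key, many models
  if (env.OPENROUTER_API_KEY) return "openrouter";
  // 3. Direct OpenAI: only "openai/*" model IDs are usable here
  if (env.OPENAI_API_KEY) return "openai";
  throw new Error(
    "No LLM provider configured. Set VERCEL_AI_GATEWAY_API_KEY, " +
      "OPENROUTER_API_KEY, or OPENAI_API_KEY."
  );
}
```

Because the checks are ordered, a deployment that sets both a gateway key and an OpenRouter key keeps its existing gateway behavior, which is how the "no breaking changes" claim holds.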

Motivation

Contributors without Vercel AI Gateway access can't run the chat/agent features locally. This lets devs run with just an OPENROUTER_API_KEY (one key, many models) or a direct OPENAI_API_KEY.

No breaking changes — existing production deployments using the gateway are unaffected.

What's NOT changed

  • Direct OpenAI API calls for video/transcription/image generation (provider-specific, can't route through OpenRouter)
  • Type-only imports from "ai" package
  • Tasks submodule (calls API over HTTP, no direct AI SDK usage)

Test plan

  • Set VERCEL_AI_GATEWAY_API_KEY → verify existing behavior unchanged
  • Set only OPENROUTER_API_KEY → verify chat/agents work through OpenRouter
  • Set only OPENAI_API_KEY → verify chat/agents work through OpenAI directly
  • pnpm build passes
  • pnpm test passes

🤖 Generated with Claude Code


Summary by cubic

Add OpenRouter and direct OpenAI as alternative LLM providers via a new createModel() helper, so local dev works without the Vercel AI Gateway. Existing gateway behavior and media generation APIs remain unchanged.

  • New Features

    • createModel(modelId) routes by priority: Vercel AI Gateway → OpenRouter → OpenAI; updated AI SDK call sites to use it.
    • getAvailableModels() returns a small default list when no gateway is configured, with pricing metadata.
    • Added @openrouter/ai-sdk-provider.
  • Bug Fixes

    • Corrected model ID: "gemini-2.5-pro" → "google/gemini-2.5-pro" in tool model mapping.
    • Provider-specific behaviors: image generation agent stays provider-specific; direct OpenAI throws on non-openai/* models.
    • Improved provider detection: includes VERCEL_OIDC_TOKEN.
    • On gateway errors, getAvailableModels() now returns [] instead of the default list.
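The getAvailableModels() behavior in the last two bullets can be sketched as follows. This is a hypothetical shape, not the committed lib/ai/getAvailableModels.ts: the model IDs are taken from elsewhere in this PR, the pricing metadata fields are omitted, and the gateway fetch is injected as a parameter so the fallback logic is runnable on its own.

```typescript
// Hypothetical sketch of getAvailableModels() fallback semantics:
// a hardcoded default list when no gateway is configured, but an
// EMPTY list when the gateway is configured and the fetch fails,
// so gateway errors stay visible instead of being masked.
interface ModelEntry {
  id: string;
}

const DEFAULT_MODELS: ModelEntry[] = [
  { id: "openai/gpt-5.4-mini" },
  { id: "google/gemini-2.5-pro" },
];

async function getAvailableModels(
  gatewayConfigured: boolean,
  fetchGatewayModels: () => Promise<ModelEntry[]>
): Promise<ModelEntry[]> {
  if (!gatewayConfigured) {
    // Non-gateway environments: small static fallback list
    return DEFAULT_MODELS;
  }
  try {
    return await fetchGatewayModels();
  } catch (err) {
    console.error("Gateway model listing failed:", err);
    // Deliberately NOT DEFAULT_MODELS: operators should notice
    // a misconfigured gateway rather than see silent defaults.
    return [];
  }
}
```

The distinction matters: "gateway absent" is an expected local-dev state, while "gateway present but failing" is an operational error that defaults would hide.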

Written for commit 8f6b5b1. Summary will update on new commits.

Adds a `createModel()` helper that resolves model strings through
whichever provider is configured, with priority:
1. Vercel AI Gateway (existing production behavior)
2. OpenRouter (one key, many models)
3. Direct OpenAI (strip provider prefix)

This lets contributors run the app locally without needing a
Vercel AI Gateway API key — just set OPENROUTER_API_KEY or
OPENAI_API_KEY instead.

All AI SDK call sites updated to use createModel(). No changes to
direct OpenAI API calls (video/transcription/image generation)
which remain provider-specific.

Also fixes bare model string "gemini-2.5-pro" to use the correct
"google/gemini-2.5-pro" provider-prefixed format.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@vercel
Contributor

vercel Bot commented Apr 15, 2026

@bettercallzaal is attempting to deploy a commit to the Recoupable Team on Vercel.

A member of the Team first needs to authorize it.

@coderabbitai

coderabbitai Bot commented Apr 15, 2026

Warning

Rate limit exceeded

@bettercallzaal has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 23 minutes and 33 seconds before requesting another review.

Your organization is not enrolled in usage-based pricing. Contact your admin to enable usage-based pricing to continue reviews beyond the rate limit, or try again in 23 minutes and 33 seconds.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: e103a450-1d3f-4e57-a23a-aa31a5da6c8d

📥 Commits

Reviewing files that changed from the base of the PR and between ec0008b and 8f6b5b1.

⛔ Files ignored due to path filters (4)
  • lib/evals/scorers/CatalogAvailability.ts is excluded by !**/evals/** and included by lib/**
  • lib/evals/scorers/QuestionAnswered.ts is excluded by !**/evals/** and included by lib/**
  • package.json is excluded by none and included by none
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml and included by none
📒 Files selected for processing (10)
  • lib/agents/CompactAgent/createCompactAgent.ts
  • lib/agents/EmailReplyAgent/createEmailReplyAgent.ts
  • lib/agents/content/createContentPromptAgent.ts
  • lib/agents/generalAgent/getGeneralAgent.ts
  • lib/ai/createModel.ts
  • lib/ai/generateArray.ts
  • lib/ai/generateText.ts
  • lib/ai/getAvailableModels.ts
  • lib/catalog/analyzeCatalogBatch.ts
  • lib/chat/toolChains/toolChains.ts


@cubic-dev-ai Bot left a comment


4 issues found across 15 files

Confidence score: 3/5

  • There is concrete regression risk in fallback behavior: createModel("google/gemini-2.5-pro") can be passed directly to OpenAI in lib/chat/toolChains/toolChains.ts, which is likely to fail at runtime with an unknown-model error when only OPENAI_API_KEY is configured.
  • A similarly high-impact path exists in lib/agents/ImageGenerationAgent/createImageGenerationAgent.ts, where the provider-specific google/gemini-3-pro-image can be routed through direct OpenAI and fail for custom image-generation flows.
  • lib/ai/createModel.ts currently allows non-OpenAI model IDs to reach the OpenAI SDK, so errors are confusing and harder to diagnose instead of failing early with a clear message; this keeps risk in the medium range rather than merge-blocking.
  • Pay close attention to lib/chat/toolChains/toolChains.ts, lib/agents/ImageGenerationAgent/createImageGenerationAgent.ts, lib/ai/createModel.ts, lib/ai/getAvailableModels.ts - provider routing and fallback error handling may mask or trigger runtime failures.
Prompt for AI agents (unresolved issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="lib/chat/toolChains/toolChains.ts">

<violation number="1" location="lib/chat/toolChains/toolChains.ts:22">
P1: When only `OPENAI_API_KEY` is set, `createModel("google/gemini-2.5-pro")` passes `"google/gemini-2.5-pro"` verbatim to the OpenAI SDK, which will reject it as an unknown model. The direct-OpenAI fallback in `createModel` only strips the `openai/` prefix and has no mapping or error for non-OpenAI model IDs.</violation>
</file>

<file name="lib/ai/getAvailableModels.ts">

<violation number="1" location="lib/ai/getAvailableModels.ts:36">
P2: When the gateway key is set but the API call fails, returning `DEFAULT_MODELS` silently masks gateway errors. Consider logging the error and/or returning `[]` to preserve the previous failure-is-visible semantics, so operators notice gateway misconfiguration.</violation>
</file>

<file name="lib/ai/createModel.ts">

<violation number="1" location="lib/ai/createModel.ts:26">
P2: Non-OpenAI model IDs (e.g. `"google/gemini-2.5-pro"`) silently pass through to the OpenAI SDK, producing a confusing API error. Consider throwing an explicit error when the direct-OpenAI fallback receives a model that isn't prefixed with `"openai/"`.</violation>
</file>

<file name="lib/agents/ImageGenerationAgent/createImageGenerationAgent.ts">

<violation number="1" location="lib/agents/ImageGenerationAgent/createImageGenerationAgent.ts:11">
P1: Custom agent: **Flag AI Slop and Fabricated Changes**

This Google-specific image generation model (`google/gemini-3-pro-image`) is provider-dependent and will fail when routed through direct OpenAI (`openai("google/gemini-3-pro-image")`). The PR description itself acknowledges image generation is provider-specific and "can't route through OpenRouter," yet this file was included in the mechanical `createModel()` sweep. Keep the original string literal here, or guard this call site so it only resolves via the gateway.</violation>
</file>
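The two createModel.ts findings above (violations in lib/chat/toolChains/toolChains.ts and lib/ai/createModel.ts) point at the same missing guard in the direct-OpenAI branch. A minimal sketch of that guard, assuming the fallback's only normalization is prefix stripping as the review describes; the function name is hypothetical, not from the committed code:

```typescript
// Hypothetical guard for the direct-OpenAI fallback in createModel():
// strip the "openai/" prefix, and fail fast on other providers' model
// IDs instead of letting the OpenAI API reject them with a confusing
// unknown-model error at request time.
function toOpenAIModelId(modelId: string): string {
  if (modelId.startsWith("openai/")) {
    return modelId.slice("openai/".length); // "openai/gpt-5.4-mini" -> "gpt-5.4-mini"
  }
  if (modelId.includes("/")) {
    // e.g. "google/gemini-2.5-pro", "anthropic/...": not servable here
    throw new Error(
      `Model "${modelId}" cannot be served by direct OpenAI; ` +
        `only "openai/*" model IDs are supported in this fallback.`
    );
  }
  return modelId; // already a bare OpenAI model ID
}
```

With this guard, a dev who sets only OPENAI_API_KEY gets an immediate, self-explanatory error for the Gemini entry in TOOL_MODEL_MAP instead of a runtime API failure mid-conversation.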
Architecture diagram
sequenceDiagram
    participant Feature as AI Features (Agents, Utilities, Evals)
    participant Factory as NEW: createModel() (lib/ai/createModel.ts)
    participant SDK as AI SDK / Providers
    participant Gateway as Vercel AI Gateway
    participant OpenRouter as NEW: OpenRouter API
    participant OpenAI as NEW: OpenAI API

    Note over Feature, OpenAI: LLM Request Flow (Priority Cascade)

    Feature->>Factory: createModel(modelId)
    
    alt VERCEL_AI_GATEWAY_API_KEY is set
        Factory->>SDK: Use gateway(modelId)
        SDK-->>Factory: LanguageModel (Gateway)
    else OPENROUTER_API_KEY is set
        Factory->>SDK: NEW: createOpenRouter()
        Factory->>SDK: Use openrouter(modelId)
        SDK-->>Factory: LanguageModel (OpenRouter)
    else OPENAI_API_KEY is set
        Factory->>Factory: NEW: Strip "openai/" prefix if present
        Factory->>SDK: Use openai(bareModel)
        SDK-->>Factory: LanguageModel (Direct)
    else No keys configured
        Factory-->>Feature: Throw Error
    end

    Factory-->>Feature: Return LanguageModel

    Feature->>SDK: generateText() / ToolLoopAgent / generateObject()
    
    alt Routing via Gateway
        SDK->>Gateway: POST /v1/chat/completions
        Gateway-->>SDK: Response
    else Routing via OpenRouter
        SDK->>OpenRouter: NEW: POST /api/v1/chat/completions
        OpenRouter-->>SDK: Response
    else Routing via Direct OpenAI
        SDK->>OpenAI: NEW: POST /v1/chat/completions
        OpenAI-->>SDK: Response
    end

    SDK-->>Feature: Final AI Output (Text/Object/Tool Call)

    Note over Feature, Gateway: Model Discovery Flow
    Feature->>SDK: CHANGED: getAvailableModels()
    alt VERCEL_AI_GATEWAY_API_KEY is set
        SDK->>Gateway: Fetch available models
        Gateway-->>Feature: Model list
    else Gateway not configured
        SDK-->>Feature: NEW: Return hardcoded DEFAULT_MODELS list
    end

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review, or fix all with cubic.

web_deep_research: "openai/gpt-5.4-mini",
create_knowledge_base: "openai/gpt-5.4-mini",
send_email: "openai/gpt-5.4-mini",
update_account_info: createModel("google/gemini-2.5-pro"),

@cubic-dev-ai Bot commented Apr 15, 2026


P1: When only OPENAI_API_KEY is set, createModel("google/gemini-2.5-pro") passes "google/gemini-2.5-pro" verbatim to the OpenAI SDK, which will reject it as an unknown model. The direct-OpenAI fallback in createModel only strips the openai/ prefix and has no mapping or error for non-OpenAI model IDs.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At lib/chat/toolChains/toolChains.ts, line 22:

<comment>When only `OPENAI_API_KEY` is set, `createModel("google/gemini-2.5-pro")` passes `"google/gemini-2.5-pro"` verbatim to the OpenAI SDK, which will reject it as an unknown model. The direct-OpenAI fallback in `createModel` only strips the `openai/` prefix and has no mapping or error for non-OpenAI model IDs.</comment>

<file context>
@@ -18,22 +19,22 @@ export type PrepareStepResult = {
-  web_deep_research: "openai/gpt-5.4-mini",
-  create_knowledge_base: "openai/gpt-5.4-mini",
-  send_email: "openai/gpt-5.4-mini",
+  update_account_info: createModel("google/gemini-2.5-pro"),
+  get_spotify_search: createModel("openai/gpt-5.4-mini"),
+  update_artist_socials: createModel("openai/gpt-5.4-mini"),
</file context>

Comment thread lib/agents/ImageGenerationAgent/createImageGenerationAgent.ts Outdated
Comment thread lib/ai/getAvailableModels.ts Outdated
Comment thread lib/ai/createModel.ts
- Revert createImageGenerationAgent to use bare model string since
  google/gemini-3-pro-image is provider-specific and can't route
  through OpenRouter or direct OpenAI
- Add explicit error when direct OpenAI fallback receives a non-OpenAI
  model (e.g. "google/..." or "anthropic/...") instead of silently
  passing it to the OpenAI API

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

@cubic-dev-ai Bot left a comment


0 issues found across 2 files (changes from recent commits).

Requires human review: Auto-approval blocked by 2 unresolved issues from previous reviews.

- Add VERCEL_OIDC_TOKEN check to gateway detection in createModel
- Return [] on gateway failure instead of DEFAULT_MODELS to avoid
  masking gateway errors
- Add pricing metadata to fallback model entries

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

@cubic-dev-ai Bot left a comment


0 issues found across 2 files (changes from recent commits).

Requires human review: Auto-approval blocked by 1 unresolved issue from previous reviews.

Comment thread lib/ai/createModel.ts
Comment on lines +19 to +22
if (process.env.OPENROUTER_API_KEY) {
const openrouter = createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });
return openrouter(modelId);
}
Contributor


KISS principle

  • Why add OpenRouter when we already support AI Gateway?
