Fix Ollama/Azure custom base URLs not sent in chat-v2 requests #2005
Juice805 wants to merge 1 commit into MCPJam:main
Walkthrough

The chat transport request body in the use-chat-session hook has been extended to include base URLs for the Ollama and Azure integrations. These values are obtained from getter functions and passed alongside existing parameters such as the selected servers and chat session identifiers. The memoization dependency array for the transport instance was updated to include these getter functions, ensuring the transport regenerates whenever either base URL source changes.
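The change described in the walkthrough can be sketched as a pure function that assembles the request body. The names below (`buildChatRequestBody`, the getter signatures) are illustrative assumptions for this sketch, not the hook's actual API:

```typescript
// Hypothetical sketch of the fix: the chat-v2 request body must carry the
// user-configured base URLs alongside the existing parameters.
type BaseUrlGetter = () => string | undefined;

interface ChatRequestBody {
  selectedServers: string[];
  chatSessionId: string;
  ollamaBaseUrl?: string;
  azureBaseUrl?: string;
}

function buildChatRequestBody(
  selectedServers: string[],
  chatSessionId: string,
  getOllamaBaseUrl: BaseUrlGetter,
  getAzureBaseUrl: BaseUrlGetter,
): ChatRequestBody {
  return {
    selectedServers,
    chatSessionId,
    // Before the fix, these two fields were never sent, so the server
    // always fell back to the default Ollama URL.
    ollamaBaseUrl: getOllamaBaseUrl(),
    azureBaseUrl: getAzureBaseUrl(),
  };
}
```

In the actual hook, the two getter functions would also be added to the transport's memoization dependency array so the transport is rebuilt whenever either base URL changes.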
Summary
The chat-v2 request body never included `ollamaBaseUrl` and `azureBaseUrl`, so the server always fell back to the default Ollama URL (http://127.0.0.1:11434/api) regardless of what the user configured in Settings. This change passes both values from `useChatSession`, matching what `useChat` (v1) already does.

Test plan
Set a custom Ollama base URL (e.g. http://localhost:11211/api) in LLM Provider settings
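The server-side fallback described in the summary can be illustrated with a minimal sketch; `resolveOllamaUrl` is a hypothetical helper, assuming the default URL the summary documents:

```typescript
// Hypothetical server-side resolution: when the request body omits the
// base URL (the pre-fix behavior), the server uses its built-in default.
const DEFAULT_OLLAMA_URL = "http://127.0.0.1:11434/api";

function resolveOllamaUrl(bodyUrl?: string): string {
  // Nullish coalescing keeps an explicitly provided URL, including one
  // pointing at a non-default host or port.
  return bodyUrl ?? DEFAULT_OLLAMA_URL;
}
```

With the fix, the body carries the configured URL and `resolveOllamaUrl` returns it; without the fix, the argument is always `undefined` and the default wins.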