feat: widen openai dependency to support 2.x for litellm compatibility #1793
BV-Venky wants to merge 1 commit into strands-agents:main
Description
Users who need LiteLLM > 1.80.10 (e.g., for SAP Generative AI Hub support) cannot install `strands-agents[litellm]`, because LiteLLM's SAP model support requires `openai>=2.8.0`, while strands pins `openai<1.110.0` for the litellm extra and `openai<2.0.0` for the openai/sagemaker extras.

This change widens the `openai` upper bound in three optional dependency groups:

- `litellm`: `openai>=1.68.0,<1.110.0` → `openai>=1.68.0,<3.0.0`
- `openai`: `openai>=1.68.0,<2.0.0` → `openai>=1.68.0,<3.0.0`
- `sagemaker`: `openai>=1.68.0,<2.0.0` → `openai>=1.68.0,<3.0.0`

The only breaking change in openai 2.x (the `ResponseFunctionToolCallOutputItem.output` type change) affects the Responses API, which strands does not use. All existing unit tests pass with openai 2.24.0.

Related Issues
Resolves #1672
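As an illustration (not part of the PR), the effect of the widened specifier can be checked with a plain-Python version comparison; the helper names `parse` and `satisfies` below are my own, not anything from strands or openai:

```python
# Minimal sketch: check whether a version string satisfies the widened
# bound ">=1.68.0,<3.0.0" using pure tuple comparison (no external deps).
# Versions tested (1.109.0, 2.24.0) come from the PR description.

def parse(version: str) -> tuple[int, ...]:
    """Turn '2.24.0' into (2, 24, 0) for lexicographic comparison."""
    return tuple(int(part) for part in version.split("."))

def satisfies(version: str, lower: str = "1.68.0", upper: str = "3.0.0") -> bool:
    """True if lower <= version < upper under component-wise comparison."""
    return parse(lower) <= parse(version) < parse(upper)

# The old litellm bound (<1.110.0) rejected openai 2.x:
assert not (parse("1.68.0") <= parse("2.24.0") < parse("1.110.0"))
# The new bound admits both the previously allowed 1.x line and 2.24.0:
assert satisfies("1.109.0")
assert satisfies("2.24.0")
```

Note that real pip specifiers also handle pre-releases and post-releases (PEP 440), which this sketch ignores.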
Type of Change
New feature
Testing
- Installed openai 2.24.0 in the hatch test environment
- All 2000 unit tests pass with openai 2.24.0
- Verified that `LiteLLMModel` instantiation works with openai 2.24.0 present
- Reviewed the openai CHANGELOG from 1.109.0 to 2.24.0; the only breaking change (the `ResponseFunctionToolCallOutputItem.output` type change) does not affect strands, which uses the Chat Completions API, not the Responses API
I ran `hatch run prepare`

Checklist