fix(openai): include 'user' param for responses API models #17648
+129
−2
Title
fix(openai): include 'user' param for responses API models
Relevant issues
Fixes #17633
Pre-Submission checklist
- Added testing in the tests/litellm/ directory
- PR passes all unit tests on make test-unit

Type
🐛 Bug Fix
Changes
The `user` param was being ignored for responses API models (e.g., `model="openai/responses/gpt-4.1"`) because the check in `get_supported_openai_params()` was failing: `"responses/gpt-4.1"` doesn't match `"gpt-4.1"` in the models list.

Fix: Strip the `responses/` prefix before checking if the model is in the supported models list.

Tests added: 7 unit/integration tests in `tests/test_litellm/llms/openai/chat/test_openai_gpt_transformation.py`
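For illustration, a minimal sketch of the prefix-stripping logic. The helper names and the model set below are hypothetical stand-ins for `get_supported_openai_params()` and the real models list, not the actual code in this PR.

```python
# Minimal sketch only: SUPPORTED_USER_PARAM_MODELS and the helpers below are
# hypothetical placeholders, not LiteLLM internals.

SUPPORTED_USER_PARAM_MODELS = {"gpt-4.1", "gpt-4o"}  # stand-in for the real models list


def _strip_responses_prefix(model: str) -> str:
    """Drop a leading 'responses/' so 'responses/gpt-4.1' matches 'gpt-4.1'."""
    prefix = "responses/"
    return model[len(prefix):] if model.startswith(prefix) else model


def supports_user_param(model: str) -> bool:
    """Look up the normalized model name in the supported-models list."""
    return _strip_responses_prefix(model) in SUPPORTED_USER_PARAM_MODELS


# Before the fix, the raw "responses/gpt-4.1" string missed the lookup and the
# 'user' param was silently dropped; after normalizing, it matches.
assert supports_user_param("responses/gpt-4.1")
assert supports_user_param("gpt-4.1")
```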