fix(openai): normalize Pydantic objects before JSON serialization in truncate_messages_by_size #5351
Summary
The OpenAI SDK v1+ returns Pydantic model instances (e.g., `ResponseFunctionToolCall`) which are not directly JSON serializable. When these objects are passed to `truncate_messages_by_size()`, the `json.dumps()` call fails with a `TypeError`. The `_normalize_data()` helper already exists in this file and properly handles Pydantic objects by calling `.model_dump()`, but it wasn't being used in `truncate_messages_by_size()`.
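For illustration, a minimal stand-in (a plain Pydantic model in place of the SDK's `ResponseFunctionToolCall`) shows the failure mode:

```python
import json

from pydantic import BaseModel


class ToolCall(BaseModel):
    # Stand-in for openai.types.responses.ResponseFunctionToolCall
    name: str
    arguments: str


messages = [{"role": "assistant", "content": [ToolCall(name="get_weather", arguments="{}")]}]
json.dumps(messages)
# TypeError: Object of type ToolCall is not JSON serializable
```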
The Fix
This PR adds normalization at the start of `truncate_messages_by_size()` to ensure all Pydantic objects are converted to JSON-compatible dicts before serialization.
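A minimal, self-contained sketch of the change (the real `_normalize_data()` and the SDK's truncation logic are more involved, and the byte limit here is illustrative):

```python
import json

MAX_BYTES = 20_000  # illustrative; the SDK defines its own limit


def _normalize_data(data):
    # Simplified version of the existing helper: unwrap Pydantic models
    # via .model_dump() and recurse into containers.
    if hasattr(data, "model_dump"):
        return data.model_dump()
    if isinstance(data, (list, tuple)):
        return [_normalize_data(item) for item in data]
    if isinstance(data, dict):
        return {key: _normalize_data(value) for key, value in data.items()}
    return data


def truncate_messages_by_size(messages, max_bytes=MAX_BYTES):
    # The fix: normalize first, so json.dumps() below never sees a raw
    # Pydantic object such as ResponseFunctionToolCall.
    messages = [_normalize_data(message) for message in messages]
    # Drop the oldest messages until the serialized payload fits.
    while messages and len(json.dumps(messages).encode("utf-8")) > max_bytes:
        messages = messages[1:]
    return messages
```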
Reproduction
When using the OpenAI SDK's Responses API with tool calls and Sentry's OpenAI integration enabled:
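The sketch below is illustrative: the DSN, model name, and tool schema are placeholders, and it assumes the first call actually returns a function call item.

```python
import sentry_sdk
from openai import OpenAI

# The OpenAI integration is enabled automatically when the openai
# package is installed.
sentry_sdk.init(dsn="https://examplePublicKey@o0.ingest.sentry.io/0", traces_sample_rate=1.0)

client = OpenAI()
tools = [{
    "type": "function",
    "name": "get_weather",
    "description": "Look up the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

first = client.responses.create(
    model="gpt-4o",
    input="What's the weather in Paris?",
    tools=tools,
)

# first.output holds ResponseFunctionToolCall Pydantic objects; feeding them
# back as input makes Sentry's integration run json.dumps() over them.
followup = client.responses.create(
    model="gpt-4o",
    input=list(first.output) + [{
        "type": "function_call_output",
        "call_id": first.output[0].call_id,
        "output": '{"temperature_c": 21}',
    }],
    tools=tools,
)
```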
This crashes with the `TypeError` because Sentry's integration tries to serialize the messages for tracing.

Fixes #5350