Conversation

@rexagod rexagod commented Jan 29, 2026

Description

I wasn't able to make out what was wrong while running LCS, as the unexpected error message wasn't much to go on. Once I surfaced the actual error, I saw that the failed event had fired because I had hit my token limit, which I believe is useful enough information to expose.

I understand if this wasn't done earlier to prevent sensitive data leakage; if so, I'll close this, or add a case that surfaces the error only when the token limit is hit, although that may be tricky since different LLMs report it in different ways.
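For reference, a token-limit-only gate would likely need to pattern-match provider-specific error text. A rough sketch of the idea (every marker string here is an illustrative guess, not actual provider wording, which is exactly what makes the approach brittle):

```python
# Hypothetical helper for the "surface only on token limit" idea above.
# Marker strings are illustrative; providers word these errors differently.
TOKEN_LIMIT_MARKERS = (
    "context length",
    "maximum context",
    "token limit",
    "max_tokens",
)

def looks_like_token_limit(error_message: str) -> bool:
    """Best-effort check for token/context-limit failures across providers."""
    lowered = error_message.lower()
    return any(marker in lowered for marker in TOKEN_LIMIT_MARKERS)
```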

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement

Tools used to create PR

Identify any AI code assistants used in this PR (for transparency and review context)

  • Assisted-by: N/A
  • Generated by: N/A

Checklist before requesting a review

  • I have performed a self-review of my code.
  • PR has passed all pre-merge test jobs.
  • If it is a core feature, I have added thorough tests.

Summary by CodeRabbit

  • Bug Fixes
    • Enhanced error logging for streaming queries to include full failed-response details, improving diagnostics and making issues easier to investigate. This change only augments logged information for troubleshooting and does not alter runtime behavior or error handling observable by users.

coderabbitai bot commented Jan 29, 2026

Walkthrough

Added a debug log in the Responses API v2 streaming error branch that logs the full error_message; no control flow, error handling, or exported API changes.

Changes

Cohort: Streaming error logging
File(s): src/app/endpoints/streaming_query_v2.py
Summary: Inserted logger.debug("Full error response: %s", error_message) in the streaming response error branch to emit the complete error payload for debugging. No functional or control-flow changes.
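For context, a minimal sketch of where the new line lands, assuming a typical shape for a streaming-event loop; apart from the logger.debug call itself, the handler structure, attribute names, and yielded chunk below are assumptions, not code from the repository:

```python
import logging

logger = logging.getLogger(__name__)

async def relay_stream_events(turn_stream):
    """Relay streaming events, logging full details when a failure event arrives."""
    async for event in turn_stream:
        if getattr(event, "type", None) == "response.failed":  # assumed failure marker
            error_message = getattr(event, "error", event)  # assumed payload location
            # The change in this PR: emit the complete provider error
            # (e.g. a token-limit failure) at DEBUG level only.
            logger.debug("Full error response: %s", error_message)
            yield 'data: {"event": "error"}\n\n'  # generic user-facing chunk, unchanged
        else:
            yield event
```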

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~3 minutes

🚥 Pre-merge checks | ✅ 3 passed
  • Description Check: ✅ Passed. Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The title directly describes the main change: adding debug logging to surface failed-event errors in logs for better diagnostics.
  • Docstring Coverage: ✅ Passed. Docstring coverage is 100.00%, which meets the required threshold of 80.00%.


@tisnik tisnik left a comment

LGTM, thank you

@tisnik tisnik left a comment

Hmm, actually, if you add it as a debug log, it will be "secret" enough + mergeable
(I guess you use debug logging locally...)
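To make the reviewer's point concrete, a minimal stdlib-logging sketch (the logger name and payload are made up): DEBUG records stay out of logs under a typical INFO-level configuration and only appear when debugging is switched on, e.g. locally:

```python
import logging

logging.basicConfig(level=logging.INFO)  # a common default configuration
logger = logging.getLogger("streaming_query_v2")  # illustrative logger name

error_message = {"error": "token limit exceeded"}  # made-up payload
logger.debug("Full error response: %s", error_message)  # dropped at INFO

logger.setLevel(logging.DEBUG)  # opt in when debugging locally
logger.debug("Full error response: %s", error_message)  # now emitted
```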

rexagod commented Feb 2, 2026

Moved full error response to debug level, PTAL

