
Store finish_reason in LLMResult#397

Draft
Copilot wants to merge 4 commits intomainfrom
copilot/store-finish-reason-in-llmresult

Conversation


Copilot AI commented Jan 28, 2026

LLM refusals and other completion states were not propagated, forcing users to rely on not result.text, which is unreliable and ambiguous.

Changes

  • Added finish_reason field to LLMResult - optional string capturing completion state (stop, length, tool_calls, refusal, etc.)
  • Non-streaming path - captures choice.finish_reason from response and passes to LLMResult
  • Streaming path - extracts finish_reason from final chunk; also fixed potential NameError if stream is empty by initializing choice = None
  • Tests - added coverage for both streaming and non-streaming paths with various finish reasons

Usage

result = await llm.call_single(messages)

# Before: unreliable and ambiguous
if not result.text:
    # Could be empty response OR refusal OR error
    pass

# After: explicit and reliable
if result.finish_reason == "refusal":
    # Definitively a refusal
    pass
elif result.finish_reason == "length":
    # Hit token limit
    pass

Fully backward compatible - field defaults to None.

Warning

Firewall rules blocked me from connecting to one or more addresses

I tried to connect to the following addresses, but was blocked by firewall rules:

  • astral.sh
    • Triggering command: /usr/bin/curl curl -LsSf REDACTED (dns block)

If you need me to access, download, or install something from one of these locations, you can either:

Original prompt

This section details the original issue you should resolve

<issue_title>Store finish reason in LLMResult</issue_title>
<issue_description>We don't store the finish reason, which is fine most times.

However, in the event of an LLM refusal, the refusal isn't propagated, so users can only check not result.text.

We should propagate the finish reason.</issue_description>

Comments on the Issue (you are @copilot in this section)



Copilot AI and others added 3 commits January 28, 2026 01:01
Co-authored-by: sidnarayanan <8095275+sidnarayanan@users.noreply.github.com>
Co-authored-by: sidnarayanan <8095275+sidnarayanan@users.noreply.github.com>
Co-authored-by: sidnarayanan <8095275+sidnarayanan@users.noreply.github.com>
Copilot AI changed the title [WIP] Add finish reason propagation in LLMResult Store finish_reason in LLMResult Jan 28, 2026
Copilot AI requested a review from sidnarayanan January 28, 2026 01:13
role = None
reasoning_content = []
used_model = None
choice = None
Collaborator


This seems to be an unused local?

Comment on lines +1112 to +1114
finish_reason = (
    getattr(choice, "finish_reason", None) if choice else None
)
Collaborator


Don't we know the type here, do we need this getattr?

)
finish_reason: str | None = Field(
    default=None,
    description="The reason the model stopped generating tokens (e.g., 'stop', 'length', 'tool_calls', 'refusal').",
Collaborator


Talk about why it would be None



Development

Successfully merging this pull request may close these issues.

Store finish reason in LLMResult

3 participants