
[WIP] Fix reasoning_content preservation during message coalescing #7408

Closed
Copilot wants to merge 1 commit into main from copilot/fix-reasoning-content-loss

Conversation

Contributor

Copilot AI commented Mar 19, 2026

  • Fix 1: Preserve reasoning_content (AdditionalProperties) in streaming path - accumulate reasoning text and add to final synthetic ChatResponseUpdate.AdditionalProperties
  • Fix 2: Preserve reasoning_content in non-streaming path - add to ChatMessage.AdditionalProperties in FromOpenAIChatCompletion
  • Fix 3: Round-trip AdditionalProperties["reasoning_content"] back to OpenAI wire format in ToOpenAIChatMessages for assistant messages
  • Add tests: non-streaming preserves reasoning_content in AdditionalProperties
  • Add tests: streaming preserves reasoning_content in coalesced ChatMessage.AdditionalProperties
  • Add tests: ToOpenAIChatMessages round-trips reasoning_content from AdditionalProperties into the request JSON
Original prompt

Fix: Preserve reasoning_content (AdditionalProperties) during streaming-to-message coalescing and round-trip it back to the OpenAI wire format

Context

GitHub Issue: #7405

When using IChatClient with FunctionInvokingChatClient and a DeepSeek reasoning model (e.g. deepseek-reasoner) via GetStreamingResponseAsync, an HTTP 400 error is thrown as soon as a tool call is involved:

System.ClientModel.ClientResultException: HTTP 400 (invalid_request_error: invalid_request_error)
Missing reasoning_content field in the assistant message at message index 2.

DeepSeek reasoning models require that any assistant message in the conversation history that was originally generated with a reasoning_content field must include that field when the conversation is sent back in subsequent requests.
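Concretely, an assistant turn originally generated with reasoning looks roughly like this on the wire, and DeepSeek expects the same reasoning_content field to reappear when the message is echoed back in the conversation history (field values are illustrative, not taken from the issue):

```json
{
  "role": "assistant",
  "content": "The answer is 42.",
  "reasoning_content": "First, consider what the user is actually asking..."
}
```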

Root Cause

The problem is a two-part data loss during the tool-calling loop:

  1. OpenAIChatClient.FromOpenAIStreamingChatCompletionAsync: When streaming ChatCompletionUpdate chunks are accumulated into a final ChatMessage/ChatCompletion, any AdditionalProperties present on the streaming chunks (including reasoning_content) are not merged into the resulting ChatMessage.AdditionalProperties. This means the round-tripped message loses all provider-specific metadata.

  2. ToOpenAIChatMessages (or similar serialization method): When converting ChatMessage back to the OpenAI API wire format for subsequent requests, AdditionalProperties on the ChatMessage are not serialized back into the request JSON. So even if they were preserved, they wouldn't make it into the next API call.
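The first half of the loss can be sketched as follows. This is a conceptual illustration only, using hypothetical minimal types in place of the real ChatResponseUpdate/ChatMessage from Microsoft.Extensions.AI; the actual fix belongs in OpenAIChatClient's streaming coalescing code.

```csharp
using System.Collections.Generic;
using System.Text;

// Hypothetical stand-in for a streaming chunk carrying provider metadata.
record Update(string? Text, IDictionary<string, object?>? AdditionalProperties);

static class Coalescer
{
    // Accumulates streamed text AND merges provider-specific metadata such
    // as "reasoning_content", instead of dropping it as the buggy path does.
    public static (string Text, Dictionary<string, object?> Props) Coalesce(
        IEnumerable<Update> updates)
    {
        var text = new StringBuilder();
        var props = new Dictionary<string, object?>();

        foreach (var update in updates)
        {
            text.Append(update.Text);

            if (update.AdditionalProperties is { } extra)
            {
                foreach (var kvp in extra)
                {
                    // Concatenate string-valued chunks, since DeepSeek streams
                    // reasoning_content incrementally; last-writer-wins otherwise.
                    if (props.TryGetValue(kvp.Key, out var existing) &&
                        existing is string s && kvp.Value is string v)
                    {
                        props[kvp.Key] = s + v;
                    }
                    else
                    {
                        props[kvp.Key] = kvp.Value;
                    }
                }
            }
        }

        return (text.ToString(), props);
    }
}
```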

Required Changes

  1. In OpenAIChatClient (streaming coalescing): When streaming ChatCompletionUpdate chunks are accumulated into a final ChatMessage, ensure that AdditionalProperties from the streaming response (specifically reasoning_content which DeepSeek delivers as a provider-specific extension) are preserved on the resulting ChatMessage.AdditionalProperties.

  2. In the OpenAI message mapping layer (the code that converts ChatMessage → OpenAI ChatCompletionOptions messages, e.g., ToOpenAIChatMessages or similar): Ensure that AdditionalProperties on ChatMessage objects are serialized back into the outgoing request JSON, so that fields like reasoning_content are round-tripped correctly.

  3. Add tests to verify that reasoning_content (and AdditionalProperties in general) survive the full round-trip: streaming response → coalesced ChatMessage → serialized back to API request.
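The second half of the round trip (change 2 above) amounts to copying AdditionalProperties back onto the outgoing assistant-message JSON. A minimal sketch, assuming a dictionary-based message shape rather than the real OpenAI SDK types:

```csharp
using System.Collections.Generic;
using System.Text.Json;

static class MessageSerializer
{
    // Builds an OpenAI-style assistant message and round-trips any
    // provider-specific fields (e.g. "reasoning_content") instead of
    // silently dropping them.
    public static string SerializeAssistantMessage(
        string content, IDictionary<string, object?>? extraProps)
    {
        var message = new Dictionary<string, object?>
        {
            ["role"] = "assistant",
            ["content"] = content,
        };

        if (extraProps is not null)
        {
            foreach (var kvp in extraProps)
                message[kvp.Key] = kvp.Value;
        }

        return JsonSerializer.Serialize(message);
    }
}
```

A test along the lines of item 3 would feed streamed updates through the coalescing path, serialize the resulting message, and assert that the output JSON still contains the reasoning_content key.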

Key Files to Investigate

  • src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIChatClient.cs — look at FromOpenAIStreamingChatCompletionAsync and ToOpenAIChatMessages
  • src/Libraries/Microsoft.Extensions.AI/ChatCompletion/FunctionInvokingChatClient.cs — understand how it reassembles messages between tool-call rounds
  • Test files in the corresponding test projects

Notes

  • FunctionInvokingChatClient is provider-agnostic and works with the IChatClient abstraction. It already preserves AdditionalProperties if they're present on the ChatMessage it receives. The fix is primarily in the Microsoft.Extensions.AI.OpenAI adapter layer.
  • The reasoning_content field is a DeepSeek-specific extension, but the fix should be general enough to handle any provider-specific AdditionalProperties that need round-tripping.

The following is the prior conversation context from the user's chat exploration (may be truncated):

User: How should this be addressed?

Assistant: I'm waiting for your confirmation to proceed with the deep research investigation into the repository.

User: @copilot Accepted Confirmation: Are you sure?

Assistant: I've kicked off a deep research investigation into the dotnet/extensions repository to trace exactly where the reasoning_content field gets dropped. You can follow its progress here:

data:
    - title: Investigate how FunctionInvokingChatClient handles streaming responses and reassembles assistant messages after tool calls
      repository: dotnet/extensions
      url: https://github.com/dotnet/extensions/tasks/5d657375-33a1-40d6-8355-17021bd27b2d
      state: queued

While that runs, here's my initial analysis of the issue and how it should be addressed:


Root Cause

The problem is a two-part data loss during the tool-calling loop:

  1. FunctionInvokingChatClient performs iterative tool-call rounds. After receiving a streaming response containing tool calls, it coalesces the streaming updates into a ChatMessage and appends it to the conversation history. During this coalescing, provider-specific AdditionalProperties (like DeepSeek's reasoning_content) are not carried forward onto the reconstructed assistant message.

  2. When the conversation (now including that stripped-down assistant message) is sent back for the next round, DeepSeek's API rej...

This pull request was created from Copilot chat.




@github-actions github-actions bot added the area-ai Microsoft.Extensions.AI libraries label Mar 19, 2026
@stephentoub stephentoub deleted the copilot/fix-reasoning-content-loss branch March 19, 2026 13:52
Copilot AI requested a review from stephentoub March 19, 2026 13:52
Copilot stopped work on behalf of stephentoub due to an error March 19, 2026 13:52