Conversation

@stephentoub stephentoub commented Oct 24, 2025


@stephentoub stephentoub requested a review from a team as a code owner October 24, 2025 19:42
@stephentoub stephentoub added the area-ai Microsoft.Extensions.AI libraries label Oct 24, 2025
Copilot AI left a comment


Pull Request Overview

This PR allows ChatOptions.ConversationId to accept OpenAI conversation IDs (prefixed with "conv_") in addition to response IDs (prefixed with "resp_") when using the Responses API. The implementation differentiates between conversation IDs and response IDs based on the "conv_" prefix and routes them to the appropriate OpenAI API fields.

Key Changes:

  • ChatOptions.ConversationId can now accept both "conv_" prefixed conversation IDs and "resp_" prefixed response IDs
  • Conversation IDs are routed to the conversation field in the OpenAI API request
  • Response IDs are routed to the previous_response_id field as before
  • The ConversationId is now tracked separately from the response ID to preserve the original value in responses
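
The prefix-based routing described above can be sketched as follows. This is a minimal, language-agnostic illustration in Python, not the actual C# implementation; the field names `conversation` and `previous_response_id` come from the OpenAI Responses API, while the helper name is hypothetical:

```python
def route_conversation_id(conversation_id):
    """Map a ChatOptions.ConversationId value to the OpenAI Responses API
    request field it should populate, based on its prefix."""
    if conversation_id is None:
        return {}
    if conversation_id.startswith("conv_"):
        # Conversation IDs are routed to the 'conversation' field.
        return {"conversation": conversation_id}
    # Response IDs ("resp_"-prefixed) keep the pre-existing behavior.
    return {"previous_response_id": conversation_id}
```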

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 1 comment.

| File | Description |
| --- | --- |
| OpenAIResponsesChatClient.cs | Added logic to detect and handle conversation IDs vs. response IDs, including JSON manipulation to set the `conversation` field when needed |
| MicrosoftExtensionsAIResponsesExtensions.cs | Updated extension method signatures to pass `null` for the `conversationId` parameter |
| OpenAIResponseClientTests.cs | Added comprehensive test coverage for conversation ID scenarios in both streaming and non-streaming modes |


rogerbarreto commented Oct 28, 2025

I understand that relying on the "conv_" and "resp_" prefixes is sufficient for the OpenAI pattern, but it may be fragile in custom scenarios.

What do you think of making this new OpenAI conversation behavior an explicit caller decision, with a flag that defaults to false (preserving the old pattern)? e.g.:

```csharp
openAIResponseClient.AsIChatClient(useConversationIdAsPreviousResponse: false);
```

Note

An extra benefit of this approach is that all of the internal conversation logic is isolated behind a simple flag check, rather than intertwined with more complex logic that checks the same field for different behaviors.


stephentoub commented Oct 28, 2025

What do you think of making this new OpenAI conversation behavior an explicit caller decision, with a flag that defaults to false (preserving the old pattern)?

It's possible we'll want to do something like that eventually, but it also complicates the model and puts this very front-and-center. I can also imagine other ways we'd want to expose this, ranging from an options bag that's passed to AsIChatClient to a delegate that maps from ChatOptions to the underlying raw options type (like RawRepresentationFactory but after configuration rather than before). RawRepresentationFactory is also a valid escape hatch already, as you can configure it however you like and this logic respects previously set values.

An extra benefit of this approach is that all of the internal conversation logic is isolated behind a simple flag check, rather than intertwined with more complex logic that checks the same field for different behaviors.

Not really. If it were just a flag check, the logic in this PR would be as simple. Most of the complication in this PR stems instead from wanting to respect anything set on ResponseCreationOptions in RawRepresentationFactory, combined with Conversation not being exposed yet on ResponseCreationOptions.
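
The "respects previously set values" behavior mentioned above can be sketched as follows. This is a hedged, illustrative Python sketch only; the real implementation is C# and manipulates `ResponseCreationOptions` (including its JSON) rather than a dict, and the helper name is hypothetical:

```python
def apply_conversation_id(raw_options, conversation_id):
    """Fill in a routing field only if the caller has not already set one
    (e.g. via RawRepresentationFactory). Caller-provided values win."""
    if not conversation_id:
        return raw_options
    if "conversation" in raw_options or "previous_response_id" in raw_options:
        # A previously configured value takes precedence; leave it untouched.
        return raw_options
    key = "conversation" if conversation_id.startswith("conv_") else "previous_response_id"
    return {**raw_options, key: conversation_id}
```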

@stephentoub stephentoub requested a review from jozkee October 29, 2025 15:24
@jozkee jozkee left a comment


@copilot address the unresolved feedback please.


jozkee commented Oct 30, 2025

[screenshot]

What's "cross repository" about this PR? I was hoping it would work as it did here: modelcontextprotocol/csharp-sdk#892 (comment).

@stephentoub stephentoub merged commit 44f6484 into dotnet:main Oct 30, 2025
6 checks passed
@stephentoub stephentoub deleted the conversationid branch October 30, 2025 21:26
jeffhandley pushed a commit to jeffhandley/extensions that referenced this pull request Nov 1, 2025
… Responses (dotnet#6960)

* Allow ChatOptions.ConversationId to be an OpenAI conversation ID with Responses

* Update src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIResponsesChatClient.cs

Co-authored-by: Copilot <[email protected]>

---------

Co-authored-by: Copilot <[email protected]>
jeffhandley pushed a commit that referenced this pull request Nov 2, 2025
… Responses (#6960)

* Allow ChatOptions.ConversationId to be an OpenAI conversation ID with Responses

* Update src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIResponsesChatClient.cs

Co-authored-by: Copilot <[email protected]>

---------

Co-authored-by: Copilot <[email protected]>
This was referenced Nov 24, 2025
renebentes pushed a commit to renebentes/3054 that referenced this pull request Nov 24, 2025
… 10.0.0 (#73)

Updated
[Microsoft.Extensions.Http.Resilience](https://github.com/dotnet/extensions)
from 9.10.0 to 10.0.0.

<details>
<summary>Release notes</summary>

_Sourced from [Microsoft.Extensions.Http.Resilience's
releases](https://github.com/dotnet/extensions/releases)._

## 10.0.0

## What's Changed
* Give FunctionInvokingChatClient span a more OTELy name by @​verdie-g
in dotnet/extensions#6911
* Update repository branding from 9.10 to 10.0 by @​Copilot in
dotnet/extensions#6907
* Clean up local function names in AIFunctionFactory by @​Copilot in
dotnet/extensions#6909
* Update OpenTelemetryChatClient to output data on all tools by
@​stephentoub in dotnet/extensions#6906
* Update ToChatResponse{Async} to also factor in AuthorName by
@​stephentoub in dotnet/extensions#6910
* add support for background responses by @​SergeyMenshykh in
dotnet/extensions#6854
* Fix `METGEN004` error message: print return type in
`ErrorInvalidMethodReturnType` by @​eduherminio in
dotnet/extensions#6905
* Fix OpenTelemetryChatClient failing on unknown content types by
@​stephentoub in dotnet/extensions#6915
* Add support for Connector ID and other follow ups by @​jozkee in
dotnet/extensions#6881
* Update AI lib changelogs by @​stephentoub in
dotnet/extensions#6920
* Merge internal changes by @​joperezr in
dotnet/extensions#6921
* Add Workstream, Stage, and PackageValidationBaselineVersion metadata
to ServiceDiscovery libraries by @​Copilot in
dotnet/extensions#6919
* Add back Uri ctor to HostedMcpServerTool by @​jozkee in
dotnet/extensions#6926
* Set DisableNETStandardCompatErrors in ServiceDiscovery libraries by
@​eerhardt in dotnet/extensions#6927
* Update Package validation baseline version to 9.10.0 by @​Copilot in
dotnet/extensions#6922
* [main] Update dependencies from dotnet/arcade by @​dotnet-maestro[bot]
in dotnet/extensions#6802
* Extend service discovery to support Consul-based DNS lookups: by
@​bart-vmware in dotnet/extensions#6914
* Update AsOpenAIResponseItems to roundtrip User AIContent ResponseItems
by @​stephentoub in dotnet/extensions#6931
* Special-case AIContent returned from AIFunctionFactory.Create
AIFunctions to not be serialized by @​stephentoub in
dotnet/extensions#6935
* Preserve function content in `SummarizingChatReducer` by
@​MackinnonBuck in dotnet/extensions#6908
* Tool reduction by @​MackinnonBuck in
dotnet/extensions#6781
* Fix coalescing of TextReasoningContent with ProtectedData by
@​stephentoub in dotnet/extensions#6936
* Doc updates by @​gewarren in
dotnet/extensions#6930
* Support DisplayNameAttribute for name resolution in AI libraries by
@​Copilot in dotnet/extensions#6942
* Fix EquivalenceEvaluator MaxOutputTokens to meet Azure OpenAI minimum
requirement by @​Copilot in
dotnet/extensions#6948
* Support DefaultValueAttribute in AIFunctionFactory parameter handling
by @​Copilot in dotnet/extensions#6947
* Bump vite from 6.3.6 to 6.4.1 in
/src/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting/TypeScript
by @​dependabot[bot] in dotnet/extensions#6938
* Introduce Microsoft.Extensions.DataIngestion.Abstractions by
@​adamsitnik in dotnet/extensions#6949
* Update to latest schema version (accepted by MCP registry) by
@​joelverhagen in dotnet/extensions#6956
* Introduce IngestionChunkWriter build on top of MEVD by @​adamsitnik in
dotnet/extensions#6951
* Update AI Chat Web dependencies by @​MackinnonBuck in
dotnet/extensions#6955
* Add AITool -> OpenAI.Responses.ResponseTool conversion utility by
@​rogerbarreto in dotnet/extensions#6958
* Update AI changelogs for 9.10.1 by @​stephentoub in
dotnet/extensions#6950
* Add Name property to OtelMessage to store ChatMessage.AuthorName per
OpenTelemetry semantic conventions by @​Copilot in
dotnet/extensions#6953
* Fix serialization of UserInputRequest/ResponseContent by @​stephentoub
in dotnet/extensions#6962
* Expose building blocks for external service discovery implementations
by @​bart-vmware in dotnet/extensions#6946
* Bump validator from 13.15.0 to 13.15.20 in
/src/Libraries/Microsoft.Extensions.AI.Evaluation.Reporting/TypeScript
by @​dependabot[bot] in dotnet/extensions#6974
* Add eng/sdl-tsa-vars.config for TSA integration by @​Copilot in
dotnet/extensions#6980
* Add CodeInterpreterToolCall/ResultContent content types by
@​stephentoub in dotnet/extensions#6964
* Update to 1.38 of the otel genai standard convention by @​stephentoub
in dotnet/extensions#6981
* Introduce set of built-in Enrichers by @​adamsitnik in
dotnet/extensions#6957
* Allow ChatOptions.ConversationId to be an OpenAI conversation ID with
Responses by @​stephentoub in
dotnet/extensions#6960
* Fix warning breaking official build, enable warningAsError in all
pipelines by @​ericstj in dotnet/extensions#6988
* Introduce HeaderChunker by @​adamsitnik in
dotnet/extensions#6979
* Introduce Markdown readers by @​adamsitnik in
dotnet/extensions#6969
* Add usage telemetry for aieval dotnet tool by @​shyamnamboodiripad in
dotnet/extensions#6773
* Update to OpenAI 2.6.0 by @​stephentoub in
dotnet/extensions#6996
* Don't specify MaxOutputTokens for EquivalenceEvaluator by
@​shyamnamboodiripad in dotnet/extensions#7006
* Fix Assert.Throws to validate parameter names by @​stephentoub in
dotnet/extensions#7007
 ... (truncated)

Commits viewable in [compare
view](dotnet/extensions@v9.10.0...v10.0.0).
</details>


Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
