fix: always include reasoning.encrypted_content for OpenAI reasoning models
When using a reasoning model (o1, o3, o4-mini, etc.) without explicitly setting reasoningEffort or reasoningSummary, the include param was not set. OpenAI then returns a bare rs_... reasoning item ID, which gets stored in thinkingSignature and replayed on the next turn. Since store is false, OpenAI cannot find the rs_... item server-side and returns a 404.

Fix: move params.include = ["reasoning.encrypted_content"] outside the reasoningEffort/reasoningSummary guard so it is always set for any reasoning model. This ensures the encrypted blob is returned and can be replayed correctly without needing server-side storage.
parent
211520c2d2
commit
c47ee71b2d
1 changed file with 1 addition and 1 deletion
@@ -236,13 +236,13 @@ function buildParams(model: Model<"openai-responses">, context: Context, options
 	}
 
 	if (model.reasoning) {
+		params.include = ["reasoning.encrypted_content"];
 		if (options?.reasoningEffort || options?.reasoningSummary) {
 			const effort = clampReasoningForModel(model.name, options?.reasoningEffort || "medium") as typeof options.reasoningEffort;
 			params.reasoning = {
 				effort: effort || "medium",
 				summary: options?.reasoningSummary || "auto",
 			};
-			params.include = ["reasoning.encrypted_content"];
 		} else {
 			if (model.name.startsWith("gpt-5")) {
 				// Jesus Christ, see https://community.openai.com/t/need-reasoning-false-option-for-gpt-5/1351588/7
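The hunk above can be sketched in isolation. This is a minimal stand-in, not the project's real code: the `Model` and `Options` shapes, the `store: false` default, and the simplified `buildParams` below are assumptions for illustration; the real function takes a `Context` and clamps the effort per model.

```typescript
// Simplified stand-ins for the project's types (hypothetical shapes).
type Options = { reasoningEffort?: string; reasoningSummary?: string };
type Model = { name: string; reasoning: boolean };

function buildParams(model: Model, options?: Options): Record<string, unknown> {
  const params: Record<string, unknown> = { model: model.name, store: false };
  if (model.reasoning) {
    // Always request the encrypted blob. With store=false, a bare rs_... item
    // ID cannot be resolved server-side on the next turn and yields a 404, so
    // this must not be gated on reasoningEffort/reasoningSummary being set.
    params.include = ["reasoning.encrypted_content"];
    if (options?.reasoningEffort || options?.reasoningSummary) {
      params.reasoning = {
        effort: options?.reasoningEffort ?? "medium",
        summary: options?.reasoningSummary ?? "auto",
      };
    }
  }
  return params;
}

// A reasoning model with no explicit effort/summary still gets `include`,
// which was exactly the case the bug missed.
console.log(buildParams({ name: "o4-mini", reasoning: true }));
```

With the old placement, the first call above would have omitted `include` entirely, producing the 404 on replay; a non-reasoning model still gets no `include` either way.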