Commit 69f8cf9
Fix OpenAI reasoning item error in Responses API (#59)
## Problem
The OpenAI Responses API was intermittently failing with the error:
```
Item 'rs_00d363a611783a350068e4764c5f68819c8777140c3248eff4' of type 'reasoning' was provided without its required following item.
```
This occurred when using reasoning models (gpt-5, o3, o4-mini) with tool
calls in multi-turn conversations.
## Root Cause
From the [Vercel AI SDK
documentation](https://sdk.vercel.ai/providers/ai-sdk-providers/openai#responses-models):
> When using reasoning models (o1, o3, o4-mini) with multi-step tool
> calls and `store: false`, include `['reasoning.encrypted_content']` in
> the `include` option to ensure reasoning content is available across
> conversation steps.

Even though we're using the default `store: true` and
`previousResponseId` for persistence, we still need to explicitly
include reasoning encrypted content when tool calls are involved. The
reasoning items have IDs (like `rs_*`) that must be properly linked to
their following items.
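
For reference, here is a minimal sketch of where that option sits when calling the AI SDK's Responses provider. This is illustrative only: the model ID, prompt, and helper name are placeholders, and the option names assume the `providerOptions` shape described in the docs quoted above.

```ts
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Placeholder helper, not part of this change: shows where `include`
// lives relative to `reasoningEffort` in the OpenAI provider options.
async function askWithReasoning(prompt: string) {
  return generateText({
    model: openai.responses('o4-mini'), // any reasoning model
    prompt,
    providerOptions: {
      openai: {
        reasoningEffort: 'medium',
        // Keeps rs_* reasoning items linked to their required following
        // items across multi-step tool calls.
        include: ['reasoning.encrypted_content'],
      },
    },
  });
}
```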
## Solution
Added `include: ['reasoning.encrypted_content']` to the OpenAI provider
options when `reasoningEffort` is configured (i.e. when reasoning is
enabled).
This ensures reasoning context is properly preserved across multi-turn
conversations with tool calls.
## Changes
- Updated `buildProviderOptions()` in `src/utils/ai/providerOptions.ts` (sketched below)
  - Only adds `include` when reasoning is actually enabled
  - Added comments explaining why this is required
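
A minimal sketch of what the conditional might look like. The config shape, field names, and return type here are assumptions for illustration; only the `include` line mirrors the actual change.

```ts
// Hypothetical shapes for illustration; the real buildProviderOptions()
// in src/utils/ai/providerOptions.ts may differ in signature and fields.
interface AiConfig {
  reasoningEffort?: 'low' | 'medium' | 'high';
}

export function buildProviderOptions(config: AiConfig) {
  const openai: Record<string, unknown> = {};

  if (config.reasoningEffort) {
    openai.reasoningEffort = config.reasoningEffort;
    // Required when reasoning models make tool calls across turns, even with
    // the default store: true and previousResponseId; otherwise the Responses
    // API can reject rs_* items provided "without their required following item".
    openai.include = ['reasoning.encrypted_content'];
  }

  return { openai };
}
```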
## Testing
No automated tests were added, as the error was intermittent and difficult
to reliably reproduce. The fix is based directly on the OpenAI/Vercel AI SDK
documentation requirements.
1 file changed: `src/utils/ai/providerOptions.ts` (+4 / -0, lines 118-121 added).