🤖 feat: add backend support for soft-interrupts #767
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
ℹ️ About Codex in GitHub
Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".
@codex review
Force-pushed from aa5e615 to bbba562
fix: restore null safety in useAIViewKeybinds call

fix: preserve abandonPartial flag through soft interrupt path

Changed softInterruptPending from a boolean to a union type that carries the abandonPartial option through to cleanupStream when the soft interrupt fires at a block boundary.
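A minimal sketch of what that union type might look like. The names (`InterruptOptions`, `SoftInterruptPending`, `takePendingInterrupt`) are assumptions based on the commit message, not the actual source:

```typescript
// Hypothetical sketch: softInterruptPending carries the interrupt options
// instead of a bare boolean, so cleanupStream can see abandonPartial.
type InterruptOptions = { abandonPartial?: boolean };
type SoftInterruptPending = false | { options: InterruptOptions };

class StreamState {
  softInterruptPending: SoftInterruptPending = false;

  requestSoftInterrupt(options: InterruptOptions): void {
    this.softInterruptPending = { options };
  }

  // Called at a block boundary; returns the options to forward to cleanup,
  // or null if no soft interrupt is pending. Clears the flag on consumption.
  takePendingInterrupt(): InterruptOptions | null {
    if (this.softInterruptPending === false) return null;
    const { options } = this.softInterruptPending;
    this.softInterruptPending = false;
    return options;
  }
}
```

The union (rather than a separate boolean plus a side-channel for the options) keeps the pending state and its payload in one place, so the flag can never be set without the options it needs.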
Force-pushed from bbba562 to 114c741
```typescript
if (options?.abandonPartial) {
  log.debug("Abandoning partial for workspace:", workspaceId);
  await this.partialService.deletePartial(workspaceId);
}
```
This is already done in `session.interruptStream`.
@codex review
@codex review
@codex review
@codex review
@codex review
Codex Review: Didn't find any major issues. More of your lovely PRs please.
Reverts:
- 470e4eb perf: fix streaming content delay from ORPC schema validation (#774)
- b437977 feat: add backend support for soft-interrupts (#767)
- df30cbc fix: use ResultSchema for sendMessage output to prevent field stripping (#773)
- 41c77ef fix: testUtils formatting (#771)
- 3ee7288 refactor: migrate IPC layer to ORPC for type-safe RPC (#763)

Due to a huge number of regressions.

_Generated with `mux`_
Adds a `soft: boolean` option to `interruptStream()` that sets a pending flag instead of immediately aborting. The stream manager checks this flag at content boundaries (end of text blocks, tool results, reasoning sections) and gracefully terminates at the next one.

This lays the groundwork for automatic context management: when the agent approaches the context window limit, we can issue a soft interrupt to let the current thought complete cleanly before triggering compaction. This avoids cutting off mid-sentence or mid-tool-call, producing cleaner conversation history for the compaction summary.
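The soft-versus-hard interrupt flow described above can be sketched as follows. This is an illustrative model, not the real API: `StreamManager`, `onBlockBoundary`, and `isActive` are assumed names, and the actual implementation routes through `cleanupStream` with options this sketch omits:

```typescript
// Hypothetical sketch of the soft-interrupt flow, assuming a stream manager
// that is notified when each content block (text, tool result, reasoning) ends.
type InterruptMode = { soft: boolean };

class StreamManager {
  private softPending = false;
  private aborted = false;

  interruptStream(opts: InterruptMode): void {
    if (opts.soft) {
      // Defer: mark pending and let the current block finish.
      this.softPending = true;
    } else {
      // Hard interrupt: abort immediately, mid-block if necessary.
      this.aborted = true;
    }
  }

  // Called when a text block, tool result, or reasoning section completes.
  onBlockBoundary(): void {
    if (this.softPending) {
      this.softPending = false;
      this.aborted = true; // terminate gracefully at this boundary
    }
  }

  isActive(): boolean {
    return !this.aborted;
  }
}
```

The key property is that a soft interrupt never tears down the stream between boundary checks, so whatever block was in flight reaches the conversation history intact.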
The frontend currently always uses hard interrupts; the soft interrupt path is exercised only by tests for now. A follow-up PR will use this code.
I have an integration test to add back once the sendMessage integration test suite is restored.
_Generated with `mux`_