packages/opencode/src/index.ts (10 changes: 6 additions & 4 deletions)
@@ -20,8 +20,6 @@ import { GithubCommand } from "./cli/cmd/github"
 import { ExportCommand } from "./cli/cmd/export"
 import { AttachCommand } from "./cli/cmd/attach"

-const cancel = new AbortController()
-
 process.on("unhandledRejection", (e) => {
   Log.Default.error("rejection", {
     e: e instanceof Error ? e.message : e,
@@ -129,6 +127,10 @@ try {
   if (formatted) UI.error(formatted)
   if (formatted === undefined) UI.error("Unexpected error, check log file at " + Log.file() + " for more details")
   process.exitCode = 1
+} finally {
+  // Some subprocesses don't react properly to SIGTERM and similar signals.
+  // Most notably, some docker-container-based MCP servers don't handle such signals unless
+  // run using `docker run --init`.
+  // Explicitly exit to avoid any hanging subprocesses.
+  process.exit();

Collaborator:

Are there any other cases besides the Docker MCPs?

Can't we just track the PID from the transports and send a SIGKILL if they are still running post-close?
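
[Editor's note: a minimal sketch of the escalation suggested here, for reference only. The names (McpTransportLike, closeWithKill) are hypothetical and not opencode APIs; the reply below explains why this route was not taken.]

    // Track the transport's child process and escalate to SIGKILL if it outlives close().
    import type { ChildProcess } from "node:child_process"
    import { setTimeout as sleep } from "node:timers/promises"

    interface McpTransportLike {
      child: ChildProcess
      close(): Promise<void>
    }

    async function closeWithKill(transport: McpTransportLike, graceMs = 2000) {
      await transport.close()
      await sleep(graceMs)
      // exitCode and signalCode stay null while the child is still running.
      const stillRunning = transport.child.exitCode === null && transport.child.signalCode === null
      // Caveat from the discussion below: for docker-based MCPs this child is the
      // docker CLI, not the server inside the container.
      if (stillRunning) transport.child.kill("SIGKILL")
    }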

Contributor Author:

I have a shell script wrapping an stdio MCP that has the same behavior. I can send it to you later today. I can imagine there could be other cases where an MCP misbehaves in a similar way. I have seen a few MCP server implementations, and the general vibe is not high quality or well thought out.

I would like to avoid using SIGKILL. The fact of the matter is, the actual MCP server process does receive SIGTERM when process.exit() is called; I confirmed that by running strace on the container's PID. But from the point of view of opencode, the child process is not the MCP server, but rather the docker client managing the MCP server.

The PID from the transports is an implementation detail - I would like to avoid using that too.

Any subprocess not under our control can cause the hang. The process.exit() is a panacea, and as long as we do proper awaits (which we do for the most part), it won't cause any issues.

Contributor Author:

The shell script + how I use it in opencode.json, as promised: https://gist.github.com/veracioux/3a9fe2ea06fca0d16852481030d9297e

 }
-
-cancel.abort()
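
[Editor's note: a small illustration of the process tree described in the thread above, using a hypothetical image name. opencode's direct child is the docker CLI; the MCP server runs inside the container under the container runtime, so signalling the child does not necessarily reach the server. The `--init` flag mentioned in the added comment runs an init process as PID 1 in the container that forwards signals to the server.]

    import { spawn } from "node:child_process"

    // Without --init, the server is PID 1 inside the container; many servers never
    // install a SIGTERM handler, so a plain child.kill("SIGTERM") can hang forever.
    const child = spawn("docker", ["run", "--rm", "-i", "--init", "example/mcp-server"], {
      stdio: ["pipe", "pipe", "inherit"],
    })

    console.log("direct child PID (the docker CLI, not the MCP server):", child.pid)
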
packages/opencode/src/lsp/index.ts (6 changes: 3 additions & 3 deletions)
@@ -101,9 +101,9 @@ export namespace LSP {
       }
     },
     async (state) => {
-      for (const client of state.clients) {
-        await client.shutdown()
-      }
+      await Promise.all(
+        state.clients.map((client) => client.shutdown())
+      )
     },
   )

packages/opencode/src/mcp/index.ts (4 changes: 1 addition & 3 deletions)
@@ -145,9 +145,7 @@ export namespace MCP {
       }
     },
     async (state) => {
-      for (const client of Object.values(state.clients)) {
-        client.close()
-      }
+      await Promise.all(Object.values(state.clients).map(client => client.close()));
     },
   )

packages/opencode/src/project/bootstrap.ts (2 changes: 1 addition & 1 deletion)
@@ -11,7 +11,7 @@ export async function InstanceBootstrap() {
   await Plugin.init()
   Share.init()
   Format.init()
-  LSP.init()
+  await LSP.init()
   FileWatcher.init()
   File.init()
 }
packages/opencode/src/project/state.ts (8 changes: 4 additions & 4 deletions)
@@ -26,9 +26,9 @@ export namespace State {
   }

   export async function dispose(key: string) {
-    for (const [_, entry] of entries.get(key)?.entries() ?? []) {
-      if (!entry.dispose) continue
-      await entry.dispose(await entry.state)
-    }
+    await Promise.all(
+      (entries.get(key)?.values() ?? [])
+        .map(async (entry) => await entry.dispose?.(await entry.state)),
+    )
   }
 }
packages/opencode/src/session/prompt.ts (8 changes: 6 additions & 2 deletions)
@@ -76,10 +76,14 @@ export namespace SessionPrompt {

       return {
         queued,
+        ensureTitlePromise: undefined as Promise<void> | undefined,

Collaborator (@rekram1-node), Oct 20, 2025:

What if I have multiple calls to prompt at once?

Collaborator:

Keep in mind that the opencode server isn't tied just to the normal server + TUI; it can also have other use cases, e.g.:

https://github.com/sst/opencode/blob/dev/packages/sdk/js/example/example.ts

Contributor Author:

Ah yes, I'll make some tweaks. Thanks for pointing that out.

Contributor Author:

See the newest commits; they should address both of these concerns.

       }
     },
     async (current) => {
       current.queued.clear()
+      log.info("waiting for session title to be generated")
+      await current.ensureTitlePromise
+      log.info("session title awaited")
     },
   )

@@ -216,7 +220,7 @@ export namespace SessionPrompt {
         (messages) => insertReminders({ messages, agent }),
       )
       if (step === 0)
-        ensureTitle({
+        state().ensureTitlePromise = ensureTitle({
           session,
           history: msgs,
           message: userMsg,
@@ -1635,7 +1639,7 @@
           thinkingBudget: 0,
         }
       }
-      generateText({
+      await generateText({
         maxOutputTokens: small.info.reasoning ? 1500 : 20,
         providerOptions: {
           [small.providerID]: options,
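
[Editor's note: regarding the thread above about multiple concurrent calls to prompt, a minimal sketch of one way the single ensureTitlePromise field could be generalized, with hypothetical names; the actual follow-up commits may differ.]

    type PromptState = {
      queued: Map<string, unknown>
      pendingTitles: Set<Promise<void>>
    }

    function trackTitle(state: PromptState, title: Promise<void>) {
      state.pendingTitles.add(title)
      // Drop the entry once it settles so the set does not grow without bound.
      title.catch(() => {}).finally(() => state.pendingTitles.delete(title))
    }

    async function dispose(state: PromptState) {
      state.queued.clear()
      // Wait for every in-flight title generation, not just the most recent one.
      await Promise.allSettled(state.pendingTitles)
    }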