
responses-code-interpreter #13 (Open)

ashish-dalal wants to merge 1 commit into LlamaEdge:dev from ashish-dalal:feat-code-interpreter

Conversation

@ashish-dalal
Contributor

Summary

  • Docker-based code interpreter for /responses, and a new [code_interpreter] config section
  • Executes Python blocks inside sandboxed containers and returns stdout/stderr as tool output
  • Persists interpreter metadata with sessions, so multi-turn conversations can reuse the container

Closes #11
Related to WasmEdge/WasmEdge#4374
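The session-reuse bullet above can be sketched as follows. This is a minimal, self-contained illustration; the type and method names (`InterpreterSessions`, `get_or_spawn`) are hypothetical and the closure stands in for an actual Docker container-create call.

```rust
use std::collections::HashMap;

/// Hypothetical sketch: maps a session id to the id of the container that
/// serves it, so a multi-turn conversation can reuse its interpreter
/// container instead of spawning a fresh one per turn.
struct InterpreterSessions {
    containers: HashMap<String, String>, // session_id -> container_id
}

impl InterpreterSessions {
    fn new() -> Self {
        Self { containers: HashMap::new() }
    }

    /// Returns the existing container for the session, or records a new one
    /// produced by `spawn` (standing in for a Docker create call).
    fn get_or_spawn(&mut self, session_id: &str, spawn: impl FnOnce() -> String) -> String {
        self.containers
            .entry(session_id.to_string())
            .or_insert_with(spawn)
            .clone()
    }
}

fn main() {
    let mut sessions = InterpreterSessions::new();
    let first = sessions.get_or_spawn("sess-1", || "container-abc".to_string());
    // Second turn of the same conversation reuses the existing container;
    // the spawn closure is not invoked again.
    let second = sessions.get_or_spawn("sess-1", || "container-xyz".to_string());
    assert_eq!(first, second);
    println!("{first}");
}
```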

@apepkuss
Collaborator

"Execute python blocks inside sandboxed containers ..." Does this statement mean that it's required to install Docker in the running environment (Linux/macOS)?

@ashish-dalal
Contributor Author

@apepkuss, yes, Docker is a requirement for the interpreter to work, and the Docker daemon needs to be running as well. Please let me know if you have better alternatives to Docker.

@ashish-dalal
Contributor Author

ashish-dalal commented Dec 14, 2025

Would you like to suggest Wasm runtimes? I have more experience with Docker, hence I used it in the implementation. Please let me know if we need to change it.

let host_config = HostConfig {
    memory: Some((self.settings.memory_limit_mb * 1024 * 1024) as i64),
    nano_cpus: Some(((self.settings.cpu_percent * 1_000_000_000) / 100) as i64),
    network_mode: Some("none".to_string()),
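The unit conversions in the hunk above can be checked in isolation: Docker's HostConfig takes memory in bytes and CPU quota in nano-CPUs, where 1_000_000_000 nano-CPUs equals one full core. A minimal sketch, with the settings struct stubbed out (field names assumed from the snippet):

```rust
/// Stub of the interpreter settings referenced in the hunk above
/// (names are assumptions based on the snippet).
struct Settings {
    memory_limit_mb: u64,
    cpu_percent: u64,
}

/// Converts human-friendly settings into the values HostConfig expects:
/// memory in bytes, CPU quota in nano-CPUs (1e9 = one full core).
fn limits(settings: &Settings) -> (i64, i64) {
    let memory = (settings.memory_limit_mb * 1024 * 1024) as i64;
    let nano_cpus = ((settings.cpu_percent * 1_000_000_000) / 100) as i64;
    (memory, nano_cpus)
}

fn main() {
    // 512 MB and 50% of one core.
    let (memory, nano_cpus) = limits(&Settings { memory_limit_mb: 512, cpu_percent: 50 });
    assert_eq!(memory, 536_870_912);    // 512 * 1024 * 1024 bytes
    assert_eq!(nano_cpus, 500_000_000); // half of 1e9 nano-CPUs
    println!("{memory} {nano_cpus}");
}
```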
Collaborator

network_mode is set to "none"; however, the code snippet between lines 195 and 205 invokes a pip install ... command to install some packages?


# Optional sandboxed code interpreter for responses API
[code_interpreter]
enable = true # Enable/disable Docker-based interpreter
Collaborator

Should set enable = false, as the code_interpreter configuration is optional.
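Following that suggestion, the shipped default could look like this (a sketch of the same config section with the interpreter off by default):

```toml
# Optional sandboxed code interpreter for responses API
[code_interpreter]
enable = false # Opt in explicitly; the feature requires a running Docker daemon
```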

Comment on lines +223 to +274
async fn run_exec(
    &self,
    container_id: &str,
    cmd: Vec<String>,
    timeout_window: Duration,
) -> Result<ExecutionResult, CodeInterpreterError> {
    let exec = self
        .docker
        .create_exec(
            container_id,
            CreateExecOptions {
                attach_stdout: Some(true),
                attach_stderr: Some(true),
                cmd: Some(cmd),
                ..Default::default()
            },
        )
        .await
        .map_err(|err| CodeInterpreterError::Execution(err.to_string()))?;

    let stream = self
        .docker
        .start_exec(
            &exec.id,
            Some(StartExecOptions {
                detach: false,
                ..Default::default()
            }),
        )
        .await
        .map_err(|err| CodeInterpreterError::Execution(err.to_string()))?;

    let collected = timeout(timeout_window, Self::collect_output(stream)).await;
    let (stdout, stderr) = match collected {
        Ok(result) => result?,
        Err(_) => return Err(CodeInterpreterError::Timeout),
    };

    let inspect = self
        .docker
        .inspect_exec(&exec.id)
        .await
        .map_err(|err| CodeInterpreterError::Execution(err.to_string()))?;

    let exit_code = inspect.exit_code.unwrap_or_default();

    Ok(ExecutionResult {
        stdout,
        stderr,
        exit_code,
    })
}
Collaborator

If code execution times out, besides returning a Timeout, will the running process be terminated or the container destroyed? If not, this could leave suspended executions still consuming CPU and memory until they are reclaimed by the idle_timeout.
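The pattern the reviewer is describing can be illustrated with a std-only sketch: on timeout, the caller must actively tear the work down rather than merely returning a Timeout error. The names here are hypothetical; in the actual code the cleanup branch would be a Docker call (e.g. stopping or removing the exec's container) instead of a comment.

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

/// Std-only sketch of timeout-then-cleanup. The spawned thread simulates a
/// long-running code execution; `recv_timeout` plays the role of the
/// `timeout(timeout_window, ...)` wrapper in `run_exec`.
fn run_with_timeout(window: Duration) -> Result<String, &'static str> {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        thread::sleep(Duration::from_millis(50)); // simulated long execution
        let _ = tx.send("done".to_string());
    });
    match rx.recv_timeout(window) {
        Ok(out) => Ok(out),
        Err(_) => {
            // Cleanup on timeout: this is where the PR would need to stop
            // the exec/container so the suspended execution cannot keep
            // consuming CPU and memory until idle_timeout reclaims it.
            Err("timeout: worker torn down")
        }
    }
}

fn main() {
    // Window shorter than the simulated work, so we hit the timeout branch.
    let result = run_with_timeout(Duration::from_millis(5));
    assert!(result.is_err());
    println!("{result:?}");
}
```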

