catoni0 edited this page Feb 19, 2025 · 19 revisions

DeepShell

Purpose

DeepShell was initially inspired by the lack of support for DeepSeek models in Shell_GPT. Over time, it evolved into something more than just a command-line productivity tool.

Why DeepSeek?

Despite security concerns over its ability to generate potentially malicious responses, the latest DeepSeek-R1 remains a highly capable model that can run on consumer-grade hardware. This is a significant step toward privacy-oriented AI assistants.

Do we really want to upload our entire hard drive's content just to train proprietary LLMs?

As for security concerns—should we ban books too? An LLM is essentially a massive database that retrieves text based on user prompts.


Capabilities

Chat Mode

In this mode, users can interact with an LLM. DeepShell supports reading files and folders with simple commands like:

  • open /file_path
  • open this folder

If a folder is opened, DeepShell reads every readable file and passes the contents to the LLM, together with the part of the prompt that follows the word "and".

Example:

Prompt:
open LICENSE and translate it to Chinese

DeepShell will process:
File content: <LICENSE content>, User prompt: translate it to Chinese
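The "open … and …" handling above can be sketched roughly like this (the helper names are hypothetical; the real parser may differ):

```python
def parse_open_prompt(prompt: str):
    """Split an 'open <path> and <request>' prompt into (path, request)."""
    body = prompt[len("open "):] if prompt.startswith("open ") else prompt
    path, _, request = body.partition(" and ")
    return path.strip(), request.strip()

def build_llm_input(prompt: str) -> str:
    # Read the named file and combine it with the remaining user request,
    # matching the "File content: ..., User prompt: ..." shape shown above.
    path, request = parse_open_prompt(prompt)
    with open(path, encoding="utf-8", errors="replace") as f:
        content = f.read()
    return f"File content: {content}, User prompt: {request}"
```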

Session History:

  • DeepShell retains session history, resubmitting the entire conversation to the server with each request.
  • As a result, responses take progressively longer as the conversation grows.
  • By default, the model's "thoughts" (reasoning traces) are filtered out, which adds a slight delay before rendering.
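The history behavior can be sketched as follows. The `send` callable stands in for the actual API request, and the `<think>` tags are how DeepSeek-R1 delimits its reasoning traces; this is an illustration of the idea, not DeepShell's actual code:

```python
import re

def strip_thoughts(text: str) -> str:
    # DeepSeek-R1 wraps its reasoning in <think>...</think>; stripping it
    # before rendering is the source of the slight delay mentioned above.
    return re.sub(r"<think>.*?</think>\s*", "", text, flags=re.DOTALL)

def chat_turn(history: list, user_message: str, send) -> str:
    """One round trip: the entire history is resent on every request,
    which is why responses slow down as the session grows."""
    history.append({"role": "user", "content": user_message})
    reply = strip_thoughts(send(history))  # server sees the full conversation
    history.append({"role": "assistant", "content": reply})
    return reply
```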

Code Mode

  • Generates code based on the user’s prompt.
  • Displays only the code, filtering out everything else.
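A minimal sketch of such a filter, assuming the model wraps code in standard Markdown fences (not necessarily DeepShell's actual implementation):

```python
import re

def extract_code(response: str) -> str:
    # Keep only the bodies of fenced code blocks, dropping the surrounding
    # prose; fall back to the raw response if no fences are found.
    blocks = re.findall(r"```[^\n]*\n(.*?)```", response, flags=re.DOTALL)
    return "\n".join(blocks) if blocks else response
```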

Shell Mode

This mode runs two clients:

  1. Command Generator – Suggests shell commands based on the prompt.
  2. Output Analyzer – Processes and explains terminal output.

Example:

User prompt:
update system packages

Process:

  • The Command Generator creates the command and places it in the input field for user validation.
  • After pressing Enter, the command executes in the terminal.
  • If it starts with sudo, the program requests a password, storing it only for the session.
  • On exit, the password is overwritten with random garbage (using Python’s secrets module).
  • After execution, the user is asked whether to display the output.
  • If displayed, the Output Analyzer processes it before rendering.

This cycle repeats until the user quits.
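The scrubbing step might look like the sketch below: the password is kept in a mutable buffer so it can be overwritten in place with `secrets.token_bytes` rather than left in memory for the garbage collector. The class and method names are illustrative, not DeepShell's actual code:

```python
import secrets

class SessionSecret:
    """Hold a sudo password for the session and scrub it on exit."""

    def __init__(self, password: str):
        # A bytearray is mutable, so it can be overwritten in place.
        self._buf = bytearray(password.encode("utf-8"))

    def value(self) -> str:
        return self._buf.decode("utf-8")

    def scrub(self) -> None:
        # Replace every byte with cryptographically random garbage.
        self._buf[:] = secrets.token_bytes(len(self._buf))
```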

Manual Execution

Users can bypass the command generator by starting a prompt with !.

Example:
!sudo ufw status

In chat mode, users can also ask questions about command results.

Note: In shell mode, history for both clients is disabled to prevent LLM confusion.
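The routing rule for the `!` prefix is simple enough to sketch (illustrative only):

```python
def route(prompt: str):
    # Prompts beginning with '!' bypass the command generator entirely
    # and are executed verbatim; everything else goes to the LLM.
    if prompt.startswith("!"):
        return ("execute", prompt[1:])
    return ("generate", prompt)
```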


TUI (Text User Interface)

DeepShell relies on Textual for its interface.

  • Copying Output: Hold the Shift key while selecting text with the mouse (this bypasses the TUI's mouse capture).
  • Exiting: Type exit or press Ctrl + C.

Piping Support

DeepShell supports piping both input and output.

However, when piping data in, TUI input is disabled.
To work around this, execute commands within chat mode using the ! prefix.

Example:
!cat file.txt

Then, ask follow-up questions about the result.
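The usual way a CLI detects this situation is by checking whether stdin is attached to a terminal; a minimal sketch of the idea, not DeepShell's actual code:

```python
import sys

def tui_available() -> bool:
    # When stdin is a pipe rather than an interactive terminal, the TUI
    # cannot read keyboard input, so input falls back to the piped data.
    return sys.stdin.isatty()
```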


Project Status & Future Plans

DeepShell is in early development, but we strive to keep each commit stable.

  • Code contributions, suggestions, and bug reports are welcome.
  • Core functionality will always be free and open-source under the GPL-3.0 License.
  • Future plans include premium features and enterprise solutions—stay tuned!
