73 changes: 69 additions & 4 deletions README.md
@@ -79,7 +79,7 @@ The `config.json` file is located in a standard, platform-specific directory:

#### Provider Types

You can configure `uwu` to use different AI providers by setting the `type` field in your `config.json`. The supported types are `"OpenAI"`, `"Custom"`, `"Claude"`, and `"Gemini"`.
You can configure `uwu` to use different AI providers by setting the `type` field in your `config.json`. The supported types are `"OpenAI"`, `"Custom"`, `"Claude"`, `"Gemini"`, and `"GitHub"`.

Below are examples for each provider type.

@@ -133,7 +133,21 @@ Uses the native Google Gemini API.

---

##### **4. Custom / Local Models (`type: "Custom"`)**
##### **4. GitHub (`type: "GitHub"`)**
Uses the free-to-use models available through GitHub Models.
```json
{
"type": "GitHub",
"apiKey": "your-github-token",
"model": "openai/gpt-4.1-nano"
}
```

- `apiKey`: Your GitHub personal access token (PAT) with access to GitHub Models.

---

##### **5. Custom / Local Models (`type: "Custom"`)**

This type is for any other OpenAI-compatible API endpoint, such as Ollama, LM Studio, or a third-party proxy service.

@@ -188,9 +202,11 @@ This type is for any other OpenAI-compatible API endpoint, such as Ollama, LM St

This function lets you type `uwu <description>` and get an editable command preloaded in your shell.


```zsh
# ~/.zshrc
uwu() {
local cmd
cmd="$(uwu-cli "$@")" || return
@@ -202,9 +218,48 @@ uwu() {

After editing `~/.zshrc`, reload it with `source ~/.zshrc`.

## Running with `llama.cpp` for Local Hosting (Small Models: Gemma-3-4B, SmolLM3-3B-GGUF)

### 1. Configure `uwu` to use `llama.cpp`
```bash
CONFIG_PATH=$(bun -e "import envPaths from 'env-paths'; import path from 'path'; const paths = envPaths('uwu', {suffix: ''}); console.log(path.join(paths.config, 'config.json'));") \
&& mkdir -p "$(dirname "$CONFIG_PATH")" \
&& echo '{"type":"LlamaCpp","model":"gemma-3-4b","contextSize":2048,"temperature":0.1,"maxTokens":150,"port":8080}' > "$CONFIG_PATH" \
&& echo "Configuration set for Gemma-3-4B"
```
More configuration options are documented in llama_cpp.md.
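The one-liner above uses `bun` and `env-paths` to resolve the platform-specific config location and write the JSON in one step. A minimal sketch of the same steps, using a hard-coded XDG-style path instead of the `env-paths` lookup (an assumption — the real command resolves the path per platform):

```shell
# Sketch only: hard-codes an XDG-style path instead of the env-paths lookup.
CONFIG_PATH="${XDG_CONFIG_HOME:-$HOME/.config}/uwu/config.json"
mkdir -p "$(dirname "$CONFIG_PATH")"
cat > "$CONFIG_PATH" <<'EOF'
{"type":"LlamaCpp","model":"gemma-3-4b","contextSize":2048,"temperature":0.1,"maxTokens":150,"port":8080}
EOF
echo "Configuration set for Gemma-3-4B"
```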

### 2. Add Helper Functions to `~/.zshrc`
#### 2.1. Generate and execute a command with llama.cpp
```bash
uwu() {
  local cmd
  # Path assumes you run from the repo root; adjust if uwu-cli is installed elsewhere.
  cmd="$(dist/uwu-cli "$@")" || return
echo "Generated: $cmd"
vared -p "Execute: " -c cmd
print -s -- "$cmd"
eval "$cmd"
}
```
#### 2.2. Stop the llama.cpp server
```bash
uwu_stop() {
  pkill llama-server && echo "Llama server stopped"
}
```
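`pkill` exits non-zero when no matching process exists, so the echo is silently skipped in that case. A slightly more defensive variant (a sketch, not part of the project) reports both outcomes:

```shell
# Guarded stop: report whether a llama-server process was actually running.
uwu_stop() {
  if pgrep -x llama-server > /dev/null 2>&1; then
    pkill llama-server && echo "Llama server stopped"
  else
    echo "No llama-server running"
  fi
}
```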
#### 2.3. Direct execution without editing (not recommended)
```bash
uwu_direct() {
local cmd
cmd="$(dist/uwu-cli "$@")" || return
echo "Executing: $cmd"
eval "$cmd"
}
```
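The only difference from the interactive helper is that the `vared` review step is skipped: `eval` runs the generated string verbatim. A harmless stand-in command shows the mechanics:

```shell
# eval executes whatever string it receives -- there is no chance to review or edit.
cmd='echo simulated-generated-command'
echo "Executing: $cmd"
eval "$cmd"
```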


📄 For more details, see llama_cpp.md.

#### bash
```bash
@@ -228,6 +283,16 @@ uwu generate a new ssh key called uwu-key and add it to the ssh agent

You'll see the generated command in your shell's input line. Press **Enter** to run it, or edit it first. Executed commands will show up in your shell's history just like any other command.

With llama.cpp:

```bash
# Interactive mode with editing
uwu generate a new ssh key called uwu-key and add it to the ssh agent

# Stop the llama.cpp server when you are done using it
uwu_stop
```

## License

[MIT](LICENSE)
32 changes: 32 additions & 0 deletions bun.lock
@@ -5,6 +5,8 @@
"name": "uwu",
"dependencies": {
"@anthropic-ai/sdk": "^0.22.0",
"@azure-rest/ai-inference": "^1.0.0-beta.6",
"@azure/core-auth": "^1.10.0",
"@google/generative-ai": "^0.16.0",
"env-paths": "^3.0.0",
"openai": "^5.12.2",
@@ -21,6 +23,24 @@
"packages": {
"@anthropic-ai/sdk": ["@anthropic-ai/[email protected]", "", { "dependencies": { "@types/node": "^18.11.18", "@types/node-fetch": "^2.6.4", "abort-controller": "^3.0.0", "agentkeepalive": "^4.2.1", "form-data-encoder": "1.7.2", "formdata-node": "^4.3.2", "node-fetch": "^2.6.7", "web-streams-polyfill": "^3.2.1" } }, "sha512-dv4BCC6FZJw3w66WNLsHlUFjhu19fS1L/5jMPApwhZLa/Oy1j0A2i3RypmDtHEPp4Wwg3aZkSHksp7VzYWjzmw=="],

"@azure-rest/ai-inference": ["@azure-rest/[email protected]", "", { "dependencies": { "@azure-rest/core-client": "^2.1.0", "@azure/abort-controller": "^2.1.2", "@azure/core-auth": "^1.9.0", "@azure/core-lro": "^2.7.2", "@azure/core-rest-pipeline": "^1.18.2", "@azure/core-tracing": "^1.2.0", "@azure/logger": "^1.1.4", "tslib": "^2.8.1" } }, "sha512-j5FrJDTHu2P2+zwFVe5j2edasOIhqkFj+VkDjbhGkQuOoIAByF0egRkgs0G1k03HyJ7bOOT9BkRF7MIgr/afhw=="],

"@azure-rest/core-client": ["@azure-rest/[email protected]", "", { "dependencies": { "@azure/abort-controller": "^2.0.0", "@azure/core-auth": "^1.9.0", "@azure/core-rest-pipeline": "^1.5.0", "@azure/core-tracing": "^1.0.1", "@typespec/ts-http-runtime": "^0.3.0", "tslib": "^2.6.2" } }, "sha512-KMVIPxG6ygcQ1M2hKHahF7eddKejYsWTjoLIfTWiqnaj42dBkYzj4+S8rK9xxmlOaEHKZHcMrRbm0NfN4kgwHw=="],

"@azure/abort-controller": ["@azure/[email protected]", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-nBrLsEWm4J2u5LpAPjxADTlq3trDgVZZXHNKabeXZtpq3d3AbN/KGO82R87rdDz5/lYB024rtEf10/q0urNgsA=="],

"@azure/core-auth": ["@azure/[email protected]", "", { "dependencies": { "@azure/abort-controller": "^2.0.0", "@azure/core-util": "^1.11.0", "tslib": "^2.6.2" } }, "sha512-88Djs5vBvGbHQHf5ZZcaoNHo6Y8BKZkt3cw2iuJIQzLEgH4Ox6Tm4hjFhbqOxyYsgIG/eJbFEHpxRIfEEWv5Ow=="],

"@azure/core-lro": ["@azure/[email protected]", "", { "dependencies": { "@azure/abort-controller": "^2.0.0", "@azure/core-util": "^1.2.0", "@azure/logger": "^1.0.0", "tslib": "^2.6.2" } }, "sha512-0YIpccoX8m/k00O7mDDMdJpbr6mf1yWo2dfmxt5A8XVZVVMz2SSKaEbMCeJRvgQ0IaSlqhjT47p4hVIRRy90xw=="],

"@azure/core-rest-pipeline": ["@azure/[email protected]", "", { "dependencies": { "@azure/abort-controller": "^2.0.0", "@azure/core-auth": "^1.8.0", "@azure/core-tracing": "^1.0.1", "@azure/core-util": "^1.11.0", "@azure/logger": "^1.0.0", "@typespec/ts-http-runtime": "^0.3.0", "tslib": "^2.6.2" } }, "sha512-OKHmb3/Kpm06HypvB3g6Q3zJuvyXcpxDpCS1PnU8OV6AJgSFaee/covXBcPbWc6XDDxtEPlbi3EMQ6nUiPaQtw=="],

"@azure/core-tracing": ["@azure/[email protected]", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-+XvmZLLWPe67WXNZo9Oc9CrPj/Tm8QnHR92fFAFdnbzwNdCH1h+7UdpaQgRSBsMY+oW1kHXNUZQLdZ1gHX3ROw=="],

"@azure/core-util": ["@azure/[email protected]", "", { "dependencies": { "@azure/abort-controller": "^2.0.0", "@typespec/ts-http-runtime": "^0.3.0", "tslib": "^2.6.2" } }, "sha512-o0psW8QWQ58fq3i24Q1K2XfS/jYTxr7O1HRcyUE9bV9NttLU+kYOH82Ixj8DGlMTOWgxm1Sss2QAfKK5UkSPxw=="],

"@azure/logger": ["@azure/[email protected]", "", { "dependencies": { "@typespec/ts-http-runtime": "^0.3.0", "tslib": "^2.6.2" } }, "sha512-fCqPIfOcLE+CGqGPd66c8bZpwAji98tZ4JI9i/mlTNTlsIWslCfpg48s/ypyLxZTump5sypjrKn2/kY7q8oAbA=="],

"@google/generative-ai": ["@google/[email protected]", "", {}, "sha512-t4x4g0z/HT2BdBNfK2ua2xA/Az+SDFng4PxWjgiys/qxbh2YcrCI2rZg9/6eBkd4Iz41yjpCCDOWxsMryLJ7TA=="],

"@types/bun": ["@types/[email protected]", "", { "dependencies": { "bun-types": "1.2.20" } }, "sha512-dX3RGzQ8+KgmMw7CsW4xT5ITBSCrSbfHc36SNT31EOUg/LA9JWq0VDdEXDRSe1InVWpd2yLUM1FUF/kEOyTzYA=="],
@@ -31,8 +51,12 @@

"@types/react": ["@types/[email protected]", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-WmdoynAX8Stew/36uTSVMcLJJ1KRh6L3IZRx1PZ7qJtBqT3dYTgyDTx8H1qoRghErydW7xw9mSJ3wS//tCRpFA=="],

"@typespec/ts-http-runtime": ["@typespec/[email protected]", "", { "dependencies": { "http-proxy-agent": "^7.0.0", "https-proxy-agent": "^7.0.0", "tslib": "^2.6.2" } }, "sha512-sOx1PKSuFwnIl7z4RN0Ls7N9AQawmR9r66eI5rFCzLDIs8HTIYrIpH9QjYWoX0lkgGrkLxXhi4QnK7MizPRrIg=="],

"abort-controller": ["[email protected]", "", { "dependencies": { "event-target-shim": "^5.0.0" } }, "sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg=="],

"agent-base": ["[email protected]", "", {}, "sha512-MnA+YT8fwfJPgBx3m60MNqakm30XOkyIoH1y6huTQvC0PwZG7ki8NacLBcrPbNoo8vEZy7Jpuk7+jMO+CUovTQ=="],

"agentkeepalive": ["[email protected]", "", { "dependencies": { "humanize-ms": "^1.2.1" } }, "sha512-kja8j7PjmncONqaTsB8fQ+wE2mSU2DJ9D4XKoJ5PFWIdRMa6SLSN1ff4mOr4jCbfRSsxR4keIiySJU0N9T5hIQ=="],

"asynckit": ["[email protected]", "", {}, "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q=="],
@@ -45,6 +69,8 @@

"csstype": ["[email protected]", "", {}, "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw=="],

"debug": ["[email protected]", "", { "dependencies": { "ms": "^2.1.3" } }, "sha512-KcKCqiftBJcZr++7ykoDIEwSa3XWowTfNPo92BYxjXiyYEVrUQh2aLyhxBCwww+heortUFxEJYcRzosstTEBYQ=="],

"delayed-stream": ["[email protected]", "", {}, "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ=="],

"dunder-proto": ["[email protected]", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.1", "es-errors": "^1.3.0", "gopd": "^1.2.0" } }, "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A=="],
@@ -81,6 +107,10 @@

"hasown": ["[email protected]", "", { "dependencies": { "function-bind": "^1.1.2" } }, "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ=="],

"http-proxy-agent": ["[email protected]", "", { "dependencies": { "agent-base": "^7.1.0", "debug": "^4.3.4" } }, "sha512-T1gkAiYYDWYx3V5Bmyu7HcfcvL7mUrTWiM6yOfa3PIphViJ/gFPbvidQ+veqSOHci/PxBcDabeUNCzpOODJZig=="],

"https-proxy-agent": ["[email protected]", "", { "dependencies": { "agent-base": "^7.1.2", "debug": "4" } }, "sha512-vK9P5/iUfdl95AI+JVyUuIcVtd4ofvtrOr3HNtM2yxC9bnMbEdp3x01OhQNnjb8IJYi38VlTE3mBXwcfvywuSw=="],

"humanize-ms": ["[email protected]", "", { "dependencies": { "ms": "^2.0.0" } }, "sha512-Fl70vYtsAFb/C06PTS9dZBo7ihau+Tu/DNCk/OyHhea07S+aeMWpFFkUaXRa8fI+ScZbEI8dfSxwY7gxZ9SAVQ=="],

"math-intrinsics": ["[email protected]", "", {}, "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g=="],
@@ -99,6 +129,8 @@

"tr46": ["[email protected]", "", {}, "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw=="],

"tslib": ["[email protected]", "", {}, "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w=="],

"typescript": ["[email protected]", "", { "bin": { "tsc": "bin/tsc", "tsserver": "bin/tsserver" } }, "sha512-CWBzXQrc/qOkhidw1OzBTQuYRbfyxDXJMVJ1XNwUHGROVmuaeiEm3OslpZ1RV96d7SKKjZKrSJu3+t/xlw3R9A=="],

"undici-types": ["[email protected]", "", {}, "sha512-JlCMO+ehdEIKqlFxk6IfVoAUVmgz7cU7zD/h9XZ0qzeosSHmUJVOzSQvvYSYWXkFXC+IfLKSIffhv0sVZup6pA=="],
13 changes: 13 additions & 0 deletions change_config.yaml
@@ -0,0 +1,13 @@
# LlamaCpp models - auto-detect path
CONFIG_PATH=$(bun -e "import envPaths from 'env-paths'; import path from 'path'; const paths = envPaths('uwu', {suffix: ''}); console.log(path.join(paths.config, 'config.json'));") && mkdir -p "$(dirname "$CONFIG_PATH")" && echo '{"type":"LlamaCpp","model":"tinyllama-1.1b","contextSize":2048,"temperature":0.1,"maxTokens":150,"port":8080}' > "$CONFIG_PATH" && echo "✅ Set to TinyLlama-1.1B"

CONFIG_PATH=$(bun -e "import envPaths from 'env-paths'; import path from 'path'; const paths = envPaths('uwu', {suffix: ''}); console.log(path.join(paths.config, 'config.json'));") && mkdir -p "$(dirname "$CONFIG_PATH")" && echo '{"type":"LlamaCpp","model":"gemma-3-4b","contextSize":2048,"temperature":0.1,"maxTokens":150,"port":8080}' > "$CONFIG_PATH" && echo "✅ Set to Gemma-3-4B"

CONFIG_PATH=$(bun -e "import envPaths from 'env-paths'; import path from 'path'; const paths = envPaths('uwu', {suffix: ''}); console.log(path.join(paths.config, 'config.json'));") && mkdir -p "$(dirname "$CONFIG_PATH")" && echo '{"type":"LlamaCpp","model":"smollm3-3b","contextSize":2048,"temperature":0.1,"maxTokens":150,"port":8080}' > "$CONFIG_PATH" && echo "✅ Set to SmolLM3-3B"

# Other providers - auto-detect path
CONFIG_PATH=$(bun -e "import envPaths from 'env-paths'; import path from 'path'; const paths = envPaths('uwu', {suffix: ''}); console.log(path.join(paths.config, 'config.json'));") && mkdir -p "$(dirname "$CONFIG_PATH")" && echo '{"type":"OpenAI","model":"gpt-4"}' > "$CONFIG_PATH" && echo "✅ Set to OpenAI GPT-4"

CONFIG_PATH=$(bun -e "import envPaths from 'env-paths'; import path from 'path'; const paths = envPaths('uwu', {suffix: ''}); console.log(path.join(paths.config, 'config.json'));") && mkdir -p "$(dirname "$CONFIG_PATH")" && echo '{"type":"Claude","model":"claude-3-5-sonnet-20241022"}' > "$CONFIG_PATH" && echo "✅ Set to Claude 3.5 Sonnet"

CONFIG_PATH=$(bun -e "import envPaths from 'env-paths'; import path from 'path'; const paths = envPaths('uwu', {suffix: ''}); console.log(path.join(paths.config, 'config.json'));") && mkdir -p "$(dirname "$CONFIG_PATH")" && echo '{"type":"Gemini","model":"gemini-1.5-pro"}' > "$CONFIG_PATH" && echo "✅ Set to Gemini 1.5 Pro"
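The one-liners above repeat the same path lookup and differ only in the JSON payload. A hypothetical helper (not part of the repo, and using a fixed XDG-style path instead of the `env-paths` lookup) could factor that out:

```shell
# Hypothetical helper: write any provider config JSON to a fixed path.
# The real scripts resolve the path per platform via bun + env-paths.
uwu_set_config() {
  CONFIG_PATH="${XDG_CONFIG_HOME:-$HOME/.config}/uwu/config.json"
  mkdir -p "$(dirname "$CONFIG_PATH")"
  printf '%s\n' "$1" > "$CONFIG_PATH"
  echo "Wrote $CONFIG_PATH"
}

uwu_set_config '{"type":"Gemini","model":"gemini-1.5-pro"}'
```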