GHSA-RP42-5VXX-QPWR
Vulnerability from github – Published: 2026-04-16 21:37 – Updated: 2026-04-16 21:37

Summary
basic-ftp@5.2.2 is vulnerable to denial of service through unbounded memory growth while processing directory listings from a remote FTP server. A malicious or compromised server can send an extremely large or never-ending listing response to Client.list(), causing the client process to consume memory until it becomes unstable or crashes.
Details
The issue is in the package's default directory listing flow.
Client.list() reaches dist/Client.js, where the full listing response is downloaded into a StringWriter before parsing:
File: dist/Client.js:516-527
```js
async _requestListWithCommand(command) {
    const buffer = new StringWriter_1.StringWriter();
    await (0, transfer_1.downloadTo)(buffer, {
        ftp: this.ftp,
        tracker: this._progressTracker,
        command,
        remotePath: "",
        type: "list"
    });
    const text = buffer.getText(this.ftp.encoding);
    this.ftp.log(text);
    return this.parseList(text);
}
```
The vulnerable sink is StringWriter, which grows an in-memory Buffer with no limit:
File: dist/StringWriter.js:5-20
```js
class StringWriter extends stream_1.Writable {
    constructor() {
        super(...arguments);
        this.buf = Buffer.alloc(0);
    }
    _write(chunk, _, callback) {
        if (chunk instanceof Buffer) {
            this.buf = Buffer.concat([this.buf, chunk]);
            callback(null);
        }
        else {
            callback(new Error("StringWriter expects chunks of type 'Buffer'."));
        }
    }
    getText(encoding) {
        return this.buf.toString(encoding);
    }
}
```
The critical operation is:
```js
this.buf = Buffer.concat([this.buf, chunk]);
```
There is no maximum size check, no truncation, and no streaming parser. Because the remote FTP server controls the listing response, it can force the client to keep allocating memory until the process is terminated.
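Beyond the unbounded total, the per-chunk `Buffer.concat` pattern also reallocates and copies the entire accumulated buffer on every write, so the cumulative copying work grows quadratically with the number of chunks. A minimal illustration (not from the advisory; the 64 KiB chunk size is an assumption standing in for a typical socket read):

```js
// Each Buffer.concat([buf, chunk]) allocates a fresh buffer and copies
// every byte buffered so far, so total bytes copied grow quadratically
// even though the buffered total grows only linearly.
const CHUNK = 64 * 1024; // 64 KiB per chunk (illustrative)
let buf = Buffer.alloc(0);
let bytesCopied = 0;

for (let i = 0; i < 100; i++) {
    const chunk = Buffer.alloc(CHUNK, 0x41);
    bytesCopied += buf.length + chunk.length; // work Buffer.concat performs
    buf = Buffer.concat([buf, chunk]);
}

console.log(buf.length);   // 6553600: 100 chunks of 64 KiB buffered
console.log(bytesCopied);  // 330956800: ~50x the buffered total was copied
```

After 100 chunks, roughly fifty times the buffered total has been copied; the ratio keeps growing with the chunk count, compounding the memory problem with CPU cost.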
How it happens:

1. An application connects to an attacker-controlled or compromised FTP server.
2. The application calls `client.list()`.
3. The server returns an extremely large or unbounded directory listing.
4. `basic-ftp` buffers the full response in `StringWriter`.
5. Memory grows without bound due to repeated `Buffer.concat(...)` calls.
PoC
The following PoC exercises the vulnerable buffering primitive directly:
```js
const { StringWriter } = require("basic-ftp/dist/StringWriter.js");

function mb(n) {
    return Math.round(n / 1024 / 1024) + "MB";
}

const writer = new StringWriter();
let wrote = 0;

for (let i = 0; i < 32; i++) {
    const chunk = Buffer.alloc(4 * 1024 * 1024, 0x41);
    writer.write(chunk);
    wrote += chunk.length;

    if ((i + 1) % 8 === 0) {
        const m = process.memoryUsage();
        console.log("written", mb(wrote), "rss", mb(m.rss), "heap", mb(m.heapUsed), "buf", mb(m.arrayBuffers));
    }
}

console.log("final text len", writer.getText("utf8").length);
```
Observed output:
```text
written 32MB rss 116MB heap 4MB buf 64MB
written 64MB rss 296MB heap 4MB buf 240MB
written 96MB rss 340MB heap 3MB buf 284MB
written 128MB rss 436MB heap 3MB buf 376MB
final text len 134217728
```
This demonstrates sustained memory growth in the same code path used to buffer directory listing data.
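The `buf` column in the output corresponds to `process.memoryUsage().arrayBuffers`, the metric under which Node accounts `Buffer`-backed memory. A small sketch (illustrative, not part of the PoC) showing that metric reacting to a single allocation:

```js
// Buffer allocations are counted under process.memoryUsage().arrayBuffers,
// the figure the PoC reports as "buf". Allocations larger than Node's
// internal pool get a dedicated ArrayBuffer, so an 8 MiB Buffer moves the
// metric by at least that much.
const before = process.memoryUsage().arrayBuffers;
const held = Buffer.alloc(8 * 1024 * 1024, 0x41); // kept referenced so it stays live
const after = process.memoryUsage().arrayBuffers;
console.log(after - before >= held.length); // expected: true
```

Watching this metric (rather than `heapUsed`, which stays flat in the PoC output) is how the unbounded growth shows up in practice.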
Supporting files saved alongside this report:
- `poc.js`
- `poc_output.txt`
Impact
This is a denial-of-service vulnerability affecting applications that use basic-ftp to list directories from remote FTP servers.
- Vulnerability class: Memory exhaustion / Denial of Service
- Attack precondition: The victim connects to a malicious or compromised FTP server and performs `Client.list()`
- Impacted users: Any application or service using `basic-ftp@5.2.2` against untrusted FTP endpoints
- Security effect: The attacker can cause excessive memory consumption, process instability, and potential process termination
Recommended remediation:

- Upgrade to `basic-ftp` 5.3.0 or later, the fixed version listed in this advisory.
- Enforce a maximum listing size.
- Abort transfers that exceed the configured limit.
- Prefer incremental or streaming parsing over full-response buffering.
Example defensive check:
```js
if (this.buf.length + chunk.length > MAX_LISTING_BYTES) {
    callback(new Error("FTP listing exceeds maximum allowed size."));
    return;
}
this.buf = Buffer.concat([this.buf, chunk]);
```
{
"affected": [
{
"database_specific": {
"last_known_affected_version_range": "\u003c= 5.2.2"
},
"package": {
"ecosystem": "npm",
"name": "basic-ftp"
},
"ranges": [
{
"events": [
{
"introduced": "0"
},
{
"fixed": "5.3.0"
}
],
"type": "ECOSYSTEM"
}
]
}
],
"aliases": [],
"database_specific": {
"cwe_ids": [
"CWE-400",
"CWE-770"
],
"github_reviewed": true,
"github_reviewed_at": "2026-04-16T21:37:48Z",
"nvd_published_at": null,
"severity": "HIGH"
},
"details": "### Summary\n`basic-ftp@5.2.2` is vulnerable to denial of service through unbounded memory growth while processing directory listings from a remote FTP server. A malicious or compromised server can send an extremely large or never-ending listing response to `Client.list()`, causing the client process to consume memory until it becomes unstable or crashes.\n\n### Details\nThe issue is in the package\u0027s default directory listing flow.\n\n`Client.list()` reaches `dist/Client.js`, where the full listing response is downloaded into a `StringWriter` before parsing:\n\nFile: `dist/Client.js:516-527`\n\n```js\nasync _requestListWithCommand(command) {\n const buffer = new StringWriter_1.StringWriter();\n await (0, transfer_1.downloadTo)(buffer, {\n ftp: this.ftp,\n tracker: this._progressTracker,\n command,\n remotePath: \"\",\n type: \"list\"\n });\n const text = buffer.getText(this.ftp.encoding);\n this.ftp.log(text);\n return this.parseList(text);\n}\n```\n\nThe vulnerable sink is `StringWriter`, which grows an in-memory `Buffer` with no limit:\n\nFile: `dist/StringWriter.js:5-20`\n\n```js\nclass StringWriter extends stream_1.Writable {\n constructor() {\n super(...arguments);\n this.buf = Buffer.alloc(0);\n }\n _write(chunk, _, callback) {\n if (chunk instanceof Buffer) {\n this.buf = Buffer.concat([this.buf, chunk]);\n callback(null);\n }\n else {\n callback(new Error(\"StringWriter expects chunks of type \u0027Buffer\u0027.\"));\n }\n }\n getText(encoding) {\n return this.buf.toString(encoding);\n }\n}\n```\n\nThe critical operation is:\n\n```js\nthis.buf = Buffer.concat([this.buf, chunk]);\n```\n\nThere is no maximum size check, no truncation, and no streaming parser. Because the remote FTP server controls the listing response, it can force the client to keep allocating memory until the process is terminated.\n\nHow it happens:\n\n1. An application connects to an attacker-controlled or compromised FTP server.\n2. 
The application calls `client.list()`.\n3. The server returns an extremely large or unbounded directory listing.\n4. `basic-ftp` buffers the full response in `StringWriter`.\n5. Memory grows without bound due to repeated `Buffer.concat(...)` calls.\n\n### PoC\nThe following PoC exercises the vulnerable buffering primitive directly:\n\n```js\nconst { StringWriter } = require(\"basic-ftp/dist/StringWriter.js\");\n\nfunction mb(n) {\n return Math.round(n / 1024 / 1024) + \"MB\";\n}\n\nconst writer = new StringWriter();\nlet wrote = 0;\n\nfor (let i = 0; i \u003c 32; i++) {\n const chunk = Buffer.alloc(4 * 1024 * 1024, 0x41);\n writer.write(chunk);\n wrote += chunk.length;\n\n if ((i + 1) % 8 === 0) {\n const m = process.memoryUsage();\n console.log(\"written\", mb(wrote), \"rss\", mb(m.rss), \"heap\", mb(m.heapUsed), \"buf\", mb(m.arrayBuffers));\n }\n}\n\nconsole.log(\"final text len\", writer.getText(\"utf8\").length);\n```\n\nObserved output:\n\n```text\nwritten 32MB rss 116MB heap 4MB buf 64MB\nwritten 64MB rss 296MB heap 4MB buf 240MB\nwritten 96MB rss 340MB heap 3MB buf 284MB\nwritten 128MB rss 436MB heap 3MB buf 376MB\nfinal text len 134217728\n```\n\nThis demonstrates sustained memory growth in the same code path used to buffer directory listing data.\n\nSupporting files saved alongside this report:\n\n- `poc.js`\n- `poc_output.txt`\n\n### Impact\nThis is a denial-of-service vulnerability affecting applications that use `basic-ftp` to list directories from remote FTP servers.\n\n- Vulnerability class: Memory exhaustion / Denial of Service\n- Attack precondition: The victim connects to a malicious or compromised FTP server and performs `Client.list()`\n- Impacted users: Any application or service using `basic-ftp@5.2.2` against untrusted FTP endpoints\n- Security effect: The attacker can cause excessive memory consumption, process instability, and potential process termination\n\nRecommended remediation:\n\n1. Enforce a maximum listing size.\n2. 
Abort transfers that exceed the configured limit.\n3. Prefer incremental or streaming parsing over full-response buffering.\n\nExample defensive check:\n\n```js\nif (this.buf.length + chunk.length \u003e MAX_LISTING_BYTES) {\n callback(new Error(\"FTP listing exceeds maximum allowed size.\"));\n return;\n}\nthis.buf = Buffer.concat([this.buf, chunk]);\n```",
"id": "GHSA-rp42-5vxx-qpwr",
"modified": "2026-04-16T21:37:48Z",
"published": "2026-04-16T21:37:48Z",
"references": [
{
"type": "WEB",
"url": "https://github.com/patrickjuchli/basic-ftp/security/advisories/GHSA-rp42-5vxx-qpwr"
},
{
"type": "PACKAGE",
"url": "https://github.com/patrickjuchli/basic-ftp"
}
],
"schema_version": "1.4.0",
"severity": [
{
"score": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H",
"type": "CVSS_V3"
}
],
"summary": "basic-ftp vulnerable to denial of service via unbounded memory consumption in Client.list()"
}
Sightings
| Author | Source | Type | Date |
|---|---|---|---|
Nomenclature
- Seen: The vulnerability was mentioned, discussed, or observed by the user.
- Confirmed: The vulnerability has been validated from an analyst's perspective.
- Published Proof of Concept: A public proof of concept is available for this vulnerability.
- Exploited: The vulnerability was observed as exploited by the user who reported the sighting.
- Patched: The vulnerability was observed as successfully patched by the user who reported the sighting.
- Not exploited: The vulnerability was not observed as exploited by the user who reported the sighting.
- Not confirmed: The user expressed doubt about the validity of the vulnerability.
- Not patched: The vulnerability was not observed as successfully patched by the user who reported the sighting.