GHSA-2M67-WJPJ-XHG9
Vulnerability from github – Published: 2026-04-04 04:17 – Updated: 2026-04-08 22:42
Summary
Jackson Core 3.x does not consistently enforce StreamReadConstraints.maxDocumentLength. Oversized JSON documents can be accepted without a StreamConstraintsException in multiple parser entry points, which allows configured size limits to be bypassed and weakens denial-of-service protections.
Details
Three code paths where maxDocumentLength is not fully enforced:
1. Blocking parsers skip validation of the final in-memory buffer
Blocking parsers validate only previously processed buffers, not the final in-memory buffer:
- ReaderBasedJsonParser.java:255
- UTF8StreamJsonParser.java:208
Relevant code:
```java
_currInputProcessed += bufSize;
_streamReadConstraints.validateDocumentLength(_currInputProcessed);
```
This means the check occurs only when a completed buffer is rolled over. If an oversized document is fully contained in the final buffer, parsing can complete without any document-length exception.
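The accounting gap can be sketched as a standalone simulation (the class, buffer size, and method names below are illustrative assumptions, not jackson-core source): the running total is validated only on buffer rollover, so a document that fits entirely in the final buffer is never checked against the limit.

```java
// Simulates rollover-only document-length accounting. Not library code.
public class RolloverAccounting {
    static final int BUFFER_SIZE = 8000;   // assumed I/O buffer size
    static final long MAX_DOC_LEN = 10L;   // configured maxDocumentLength

    /** Returns true if the limit check ever fires while "parsing" docLen bytes. */
    static boolean limitEnforced(long docLen) {
        long processed = 0;                // bytes in previously completed buffers
        long remaining = docLen;
        while (remaining > BUFFER_SIZE) {
            // Buffer rollover: only here is the running total validated.
            processed += BUFFER_SIZE;
            if (processed > MAX_DOC_LEN) {
                return true;               // would throw StreamConstraintsException
            }
            remaining -= BUFFER_SIZE;
        }
        // Final (partial) buffer: parsed to completion, no validation call.
        return false;
    }

    public static void main(String[] args) {
        // 13-byte document, 10-byte limit: fits in one buffer, check never fires.
        System.out.println("13-byte doc caught: " + limitEnforced(13));
        // 9000-byte document: rollover occurs, so the check fires.
        System.out.println("9000-byte doc caught: " + limitEnforced(9000));
    }
}
```

The simulation shows why the bypass is size-dependent: only documents larger than one internal buffer ever trigger the rollover path where validation lives.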
2. Async parsers skip validation of the final chunk on end-of-input
Async parsers validate previously processed chunks, but do not validate the final chunk on end-of-input:
- NonBlockingByteArrayJsonParser.java:49
- NonBlockingByteBufferJsonParser.java:57
- NonBlockingUtf8JsonParserBase.java:75
Relevant code:
```java
_currInputProcessed += _origBufferLen;
_streamReadConstraints.validateDocumentLength(_currInputProcessed);

public void endOfInput() {
    _endOfInput = true;
}
```
endOfInput() marks EOF but does not perform a final validateDocumentLength(...) call, so an oversized last chunk is accepted.
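One plausible shape of a correction, modeled as a standalone sketch of the feeder's byte accounting (all names here are hypothetical; this is not the upstream patch), is to fold the final chunk into the running total and validate once more at end-of-input:

```java
// Models async-feeder accounting with a final validation at EOF. Illustrative only.
public class FinalChunkCheck {
    static class SimulatedFeeder {
        final long maxDocLen;
        long processed;      // totals from chunks already rolled over
        long pendingChunk;   // bytes of the current, not-yet-counted chunk

        SimulatedFeeder(long maxDocLen) { this.maxDocLen = maxDocLen; }

        void feedInput(byte[] chunk) {
            // Count the previous chunk, as the vulnerable code does on re-feed.
            processed += pendingChunk;
            if (processed > maxDocLen) {
                throw new IllegalStateException("document length exceeds " + maxDocLen);
            }
            pendingChunk = chunk.length;
        }

        void endOfInput() {
            // The missing step: count the final chunk and validate once more.
            processed += pendingChunk;
            pendingChunk = 0;
            if (processed > maxDocLen) {
                throw new IllegalStateException("document length exceeds " + maxDocLen);
            }
        }
    }

    public static void main(String[] args) {
        SimulatedFeeder feeder = new SimulatedFeeder(10L);
        feeder.feedInput(new byte[13]);   // single oversized chunk
        try {
            feeder.endOfInput();
            System.out.println("accepted");
        } catch (IllegalStateException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

With the EOF-time check in place, a single oversized chunk is rejected instead of silently accepted.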
3. DataInput parser path does not enforce maxDocumentLength at all
- JsonFactory.java:457
Relevant construction path:
```java
int firstByte = ByteSourceJsonBootstrapper.skipUTF8BOM(input);
return new UTF8DataInputJsonParser(readCtxt, ioCtxt,
        readCtxt.getStreamReadFeatures(_streamReadFeatures),
        readCtxt.getFormatReadFeatures(_formatReadFeatures),
        input, can, firstByte);
```
UTF8DataInputJsonParser does not call StreamReadConstraints.validateDocumentLength(...), so maxDocumentLength is effectively disabled for createParser(..., DataInput) users.
Note: This issue appears distinct from the recently published nesting-depth and number-length constraint advisories because it affects document-length enforcement.
PoC
Async path reproducer
```java
import java.nio.charset.StandardCharsets;

import tools.jackson.core.JsonParser;
import tools.jackson.core.ObjectReadContext;
import tools.jackson.core.StreamReadConstraints;
import tools.jackson.core.async.ByteArrayFeeder;
import tools.jackson.core.json.JsonFactory;

public class Poc {
    public static void main(String[] args) throws Exception {
        JsonFactory factory = JsonFactory.builder()
                .streamReadConstraints(StreamReadConstraints.builder()
                        .maxDocumentLength(10L)
                        .build())
                .build();

        byte[] doc = "{\"a\":1,\"b\":2}".getBytes(StandardCharsets.UTF_8);

        try (JsonParser p = factory.createNonBlockingByteArrayParser(ObjectReadContext.empty())) {
            ByteArrayFeeder feeder = (ByteArrayFeeder) p.nonBlockingInputFeeder();
            feeder.feedInput(doc, 0, doc.length);
            feeder.endOfInput();

            while (p.nextToken() != null) { }
        }

        System.out.println("Parsed successfully");
    }
}
```
- Expected result: Parsing should fail because the configured document-length limit is 10, while the input is longer than 10 bytes.
- Actual result: The document is accepted and parsing completes.
Blocking path reproducer
```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

import tools.jackson.core.JsonParser;
import tools.jackson.core.StreamReadConstraints;
import tools.jackson.core.json.JsonFactory;

public class Poc2 {
    public static void main(String[] args) throws Exception {
        JsonFactory factory = JsonFactory.builder()
                .streamReadConstraints(StreamReadConstraints.builder()
                        .maxDocumentLength(10L)
                        .build())
                .build();

        byte[] doc = "{\"a\":1,\"b\":2}".getBytes(StandardCharsets.UTF_8);

        try (JsonParser p = factory.createParser(new ByteArrayInputStream(doc))) {
            while (p.nextToken() != null) { }
        }

        System.out.println("Parsed successfully");
    }
}
```
Impact
Applications that rely on maxDocumentLength as a safety control for untrusted JSON can accept oversized inputs without error. In network-facing services this weakens an explicit denial-of-service protection and can increase CPU and memory consumption by allowing larger-than-configured request bodies to be processed.
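Until an upgrade to a fixed version is possible, one defense-in-depth option is to cap input size at the stream layer, so the limit holds regardless of the parser's internal accounting. A minimal sketch, assuming the application controls the InputStream it passes to createParser (the class name and limit are illustrative, not part of jackson-core):

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Throws once more than maxBytes have been read from the underlying stream.
public class BoundedInputStream extends FilterInputStream {
    private final long maxBytes;
    private long readSoFar;

    public BoundedInputStream(InputStream in, long maxBytes) {
        super(in);
        this.maxBytes = maxBytes;
    }

    private void count(long n) throws IOException {
        if (n > 0) {
            readSoFar += n;
            if (readSoFar > maxBytes) {
                throw new IOException("input exceeds configured limit of " + maxBytes + " bytes");
            }
        }
    }

    @Override public int read() throws IOException {
        int b = super.read();
        if (b != -1) {
            count(1);
        }
        return b;
    }

    @Override public int read(byte[] buf, int off, int len) throws IOException {
        int n = super.read(buf, off, len);
        count(n);
        return n;
    }
}
```

Wrapping the source (for example, `factory.createParser(new BoundedInputStream(rawIn, limit))`) enforces the cap before the parser buffers the data, which is independent of any validateDocumentLength call inside the library.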
{
"affected": [
{
"database_specific": {
"last_known_affected_version_range": "\u003c= 3.1.0"
},
"package": {
"ecosystem": "Maven",
"name": "tools.jackson.core:jackson-core"
},
"ranges": [
{
"events": [
{
"introduced": "3.0.0"
},
{
"fixed": "3.1.1"
}
],
"type": "ECOSYSTEM"
}
]
}
],
"aliases": [],
"database_specific": {
"cwe_ids": [
"CWE-770"
],
"github_reviewed": true,
"github_reviewed_at": "2026-04-04T04:17:07Z",
"nvd_published_at": null,
"severity": "HIGH"
},
"details": "## Summary\n\nJackson Core 3.x does not consistently enforce `StreamReadConstraints.maxDocumentLength`. Oversized JSON documents can be accepted without a `StreamConstraintsException` in multiple parser entry points, which allows configured size limits to be bypassed and weakens denial-of-service protections.\n\n## Details\n\nThree code paths where `maxDocumentLength` is not fully enforced:\n\n### 1. Blocking parsers skip validation of the final in-memory buffer\n\nBlocking parsers validate only previously processed buffers, not the final in-memory buffer:\n\n- `ReaderBasedJsonParser.java:255`\n- `UTF8StreamJsonParser.java:208`\n\nRelevant code:\n\n```java\n_currInputProcessed += bufSize;\n_streamReadConstraints.validateDocumentLength(_currInputProcessed);\n```\n\nThis means the check occurs only when a completed buffer is rolled over. If an oversized document is fully contained in the final buffer, parsing can complete without any document-length exception.\n\n### 2. Async parsers skip validation of the final chunk on end-of-input\n\nAsync parsers validate previously processed chunks, but do not validate the final chunk on end-of-input:\n\n- `NonBlockingByteArrayJsonParser.java:49`\n- `NonBlockingByteBufferJsonParser.java:57`\n- `NonBlockingUtf8JsonParserBase.java:75`\n\nRelevant code:\n\n```java\n_currInputProcessed += _origBufferLen;\n_streamReadConstraints.validateDocumentLength(_currInputProcessed);\n\npublic void endOfInput() {\n _endOfInput = true;\n}\n```\n\n`endOfInput()` marks EOF but does not perform a final `validateDocumentLength(...)` call, so an oversized last chunk is accepted.\n\n### 3. 
DataInput parser path does not enforce `maxDocumentLength` at all\n\n- `JsonFactory.java:457`\n\nRelevant construction path:\n\n```java\nint firstByte = ByteSourceJsonBootstrapper.skipUTF8BOM(input);\nreturn new UTF8DataInputJsonParser(readCtxt, ioCtxt,\n readCtxt.getStreamReadFeatures(_streamReadFeatures),\n readCtxt.getFormatReadFeatures(_formatReadFeatures),\n input, can, firstByte);\n```\n\n`UTF8DataInputJsonParser` does not call `StreamReadConstraints.validateDocumentLength(...)`, so `maxDocumentLength` is effectively disabled for `createParser(..., DataInput)` users.\n\n\u003e **Note:** This issue appears distinct from the recently published nesting-depth and number-length constraint advisories because it affects document-length enforcement.\n\n## PoC\n\n### Async path reproducer\n\n```java\nimport java.nio.charset.StandardCharsets;\nimport tools.jackson.core.JsonParser;\nimport tools.jackson.core.ObjectReadContext;\nimport tools.jackson.core.StreamReadConstraints;\nimport tools.jackson.core.async.ByteArrayFeeder;\nimport tools.jackson.core.json.JsonFactory;\n\npublic class Poc {\n public static void main(String[] args) throws Exception {\n JsonFactory factory = JsonFactory.builder()\n .streamReadConstraints(StreamReadConstraints.builder()\n .maxDocumentLength(10L)\n .build())\n .build();\n\n byte[] doc = \"{\\\"a\\\":1,\\\"b\\\":2}\".getBytes(StandardCharsets.UTF_8);\n\n try (JsonParser p = factory.createNonBlockingByteArrayParser(ObjectReadContext.empty())) {\n ByteArrayFeeder feeder = (ByteArrayFeeder) p.nonBlockingInputFeeder();\n feeder.feedInput(doc, 0, doc.length);\n feeder.endOfInput();\n\n while (p.nextToken() != null) { }\n }\n\n System.out.println(\"Parsed successfully\");\n }\n}\n```\n\n- **Expected result:** Parsing should fail because the configured document-length limit is 10, while the input is longer than 10 bytes.\n- **Actual result:** The document is accepted and parsing completes.\n\n### Blocking path reproducer\n\n```java\nimport 
java.io.ByteArrayInputStream;\nimport java.nio.charset.StandardCharsets;\nimport tools.jackson.core.JsonParser;\nimport tools.jackson.core.StreamReadConstraints;\nimport tools.jackson.core.json.JsonFactory;\n\npublic class Poc2 {\n public static void main(String[] args) throws Exception {\n JsonFactory factory = JsonFactory.builder()\n .streamReadConstraints(StreamReadConstraints.builder()\n .maxDocumentLength(10L)\n .build())\n .build();\n\n byte[] doc = \"{\\\"a\\\":1,\\\"b\\\":2}\".getBytes(StandardCharsets.UTF_8);\n\n try (JsonParser p = factory.createParser(new ByteArrayInputStream(doc))) {\n while (p.nextToken() != null) { }\n }\n\n System.out.println(\"Parsed successfully\");\n }\n}\n```\n\n## Impact\n\nApplications that rely on `maxDocumentLength` as a safety control for untrusted JSON can accept oversized inputs without error. In network-facing services this weakens an explicit denial-of-service protection and can increase CPU and memory consumption by allowing larger-than-configured request bodies to be processed.",
"id": "GHSA-2m67-wjpj-xhg9",
"modified": "2026-04-08T22:42:15Z",
"published": "2026-04-04T04:17:07Z",
"references": [
{
"type": "WEB",
"url": "https://github.com/FasterXML/jackson-core/security/advisories/GHSA-2m67-wjpj-xhg9"
},
{
"type": "WEB",
"url": "https://github.com/FasterXML/jackson-core/commit/74c9ee255d1534c179bc7d3de48941bf39a9079c"
},
{
"type": "PACKAGE",
"url": "https://github.com/FasterXML/jackson-core"
}
],
"schema_version": "1.4.0",
"severity": [
{
"score": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H",
"type": "CVSS_V3"
}
],
"summary": "Jackson Core: Document length constraint bypass in blocking, async, and DataInput parsers"
}