GHSA-2C2J-9GV5-CJ73
Vulnerability from GitHub – Published: 2025-07-21 19:34 – Updated: 2025-07-21 22:21
Summary
When parsing a multi-part form with large files (greater than the default max spool size), `starlette` will block the main thread to roll the file over to disk. This blocks the event loop, which means new connections can't be accepted.
Details
Please see this discussion for details: https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403. In summary, the following `UploadFile` code (copied from here) has a minor bug: instead of just checking `self._in_memory`, we should also check whether the additional bytes will cause a rollover.
```python
    @property
    def _in_memory(self) -> bool:
        # check for SpooledTemporaryFile._rolled
        rolled_to_disk = getattr(self.file, "_rolled", True)
        return not rolled_to_disk

    async def write(self, data: bytes) -> None:
        if self.size is not None:
            self.size += len(data)

        if self._in_memory:
            self.file.write(data)
        else:
            await run_in_threadpool(self.file.write, data)
```
I have already created a PR which fixes the problem: https://github.com/encode/starlette/pull/2962
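The PR above has the real fix; the gist of it can be sketched with a hypothetical helper (built on CPython's private `SpooledTemporaryFile` attributes, so this is an illustration, not the merged implementation) that treats a write as "in memory" only when it cannot trigger a rollover:

```python
from tempfile import SpooledTemporaryFile

def write_stays_in_memory(file: SpooledTemporaryFile, data: bytes) -> bool:
    # Hypothetical helper: the fast, non-blocking path is only safe when the
    # file is still in memory AND this write won't push it past the spool
    # threshold (which would trigger a synchronous rollover to disk).
    if getattr(file, "_rolled", True):
        return False  # already rolled over: writes hit the disk
    max_size = getattr(file, "_max_size", 0)
    return not max_size or file.tell() + len(data) <= max_size
```

A `write()` built on this check would take the direct `self.file.write(data)` path only when the helper returns `True`, and fall back to `run_in_threadpool` otherwise, so the rollover itself happens off the event loop.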
PoC
See the discussion here for steps on how to reproduce.
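The linked discussion has the full reproduction steps against a running app. The core effect can also be demonstrated without a web server; the sketch below (an illustration, not the advisory's PoC) shows that writing past a `SpooledTemporaryFile`'s `max_size` rolls the data over to disk synchronously, stalling every other coroutine on the event loop for the duration of the write:

```python
import asyncio
import time
from tempfile import SpooledTemporaryFile

async def heartbeat(gaps: list) -> None:
    # Measures event-loop responsiveness: a large gap between ticks means
    # something blocked the loop between iterations.
    last = time.monotonic()
    for _ in range(30):
        await asyncio.sleep(0.01)
        now = time.monotonic()
        gaps.append(now - last)
        last = now

async def main():
    gaps = []
    hb = asyncio.create_task(heartbeat(gaps))
    # 1 MiB spool threshold, matching starlette's default max spool size.
    f = SpooledTemporaryFile(max_size=1024 * 1024)
    await asyncio.sleep(0.05)  # let the heartbeat start ticking
    # Crossing the threshold triggers a synchronous rollover to disk; on the
    # buggy code path this happens directly on the event loop.
    f.write(b"\0" * (16 * 1024 * 1024))
    await hb
    rolled = getattr(f, "_rolled", None)
    f.close()
    return rolled, max(gaps)

rolled, worst_gap = asyncio.run(main())
print(f"rolled over to disk: {rolled}, worst heartbeat gap: {worst_gap * 1000:.1f} ms")
```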
Impact
To be honest, very low: not many users will be impacted. Parsing large forms is already CPU intensive, so the additional I/O blocking doesn't slow down `starlette` that much on systems with modern HDDs/SSDs. If someone is running on tape they might see a greater impact.
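Per the affected range in the advisory metadata below (introduced "0", fixed "0.47.2"), every release before 0.47.2 is affected. A quick check of an installed environment (a sketch; it assumes the third-party `packaging` library is available for version comparison):

```python
from importlib.metadata import PackageNotFoundError, version
from packaging.version import Version  # third-party "packaging" library

def is_affected(v: str) -> bool:
    # Affected range from the advisory: introduced "0", fixed "0.47.2".
    return Version(v) < Version("0.47.2")

try:
    installed = version("starlette")
    print(f"starlette {installed} affected: {is_affected(installed)}")
except PackageNotFoundError:
    print("starlette is not installed")
```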
{
"affected": [
{
"package": {
"ecosystem": "PyPI",
"name": "starlette"
},
"ranges": [
{
"events": [
{
"introduced": "0"
},
{
"fixed": "0.47.2"
}
],
"type": "ECOSYSTEM"
}
]
}
],
"aliases": [
"CVE-2025-54121"
],
"database_specific": {
"cwe_ids": [
"CWE-770"
],
"github_reviewed": true,
"github_reviewed_at": "2025-07-21T19:34:23Z",
"nvd_published_at": "2025-07-21T20:15:41Z",
"severity": "MODERATE"
},
"details": "### Summary\nWhen parsing a multi-part form with large files (greater than the [default max spool size](https://github.com/encode/starlette/blob/fa5355442753f794965ae1af0f87f9fec1b9a3de/starlette/formparsers.py#L126)) `starlette` will block the main thread to roll the file over to disk. This blocks the event thread which means we can\u0027t accept new connections.\n\n### Details\nPlease see this discussion for details: https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403. In summary the following UploadFile code (copied from [here](https://github.com/encode/starlette/blob/fa5355442753f794965ae1af0f87f9fec1b9a3de/starlette/datastructures.py#L436C5-L447C14)) has a minor bug. Instead of just checking for `self._in_memory` we should also check if the additional bytes will cause a rollover.\n\n```python\n\n @property\n def _in_memory(self) -\u003e bool:\n # check for SpooledTemporaryFile._rolled\n rolled_to_disk = getattr(self.file, \"_rolled\", True)\n return not rolled_to_disk\n\n async def write(self, data: bytes) -\u003e None:\n if self.size is not None:\n self.size += len(data)\n\n if self._in_memory:\n self.file.write(data)\n else:\n await run_in_threadpool(self.file.write, data)\n```\n\nI have already created a PR which fixes the problem: https://github.com/encode/starlette/pull/2962\n\n\n### PoC\nSee the discussion [here](https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403) for steps on how to reproduce.\n\n### Impact\nTo be honest, very low and not many users will be impacted. Parsing large forms is already CPU intensive so the additional IO block doesn\u0027t slow down `starlette` that much on systems with modern HDDs/SSDs. If someone is running on tape they might see a greater impact.",
"id": "GHSA-2c2j-9gv5-cj73",
"modified": "2025-07-21T22:21:05Z",
"published": "2025-07-21T19:34:23Z",
"references": [
{
"type": "WEB",
"url": "https://github.com/encode/starlette/security/advisories/GHSA-2c2j-9gv5-cj73"
},
{
"type": "ADVISORY",
"url": "https://nvd.nist.gov/vuln/detail/CVE-2025-54121"
},
{
"type": "WEB",
"url": "https://github.com/encode/starlette/commit/9f7ec2eb512fcc3fe90b43cb9dd9e1d08696bec1"
},
{
"type": "PACKAGE",
"url": "https://github.com/encode/starlette"
},
{
"type": "WEB",
"url": "https://github.com/encode/starlette/blob/fa5355442753f794965ae1af0f87f9fec1b9a3de/starlette/datastructures.py#L436C5-L447C14"
},
{
"type": "WEB",
"url": "https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403"
}
],
"schema_version": "1.4.0",
"severity": [
{
"score": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L",
"type": "CVSS_V3"
}
],
"summary": "Starlette has possible denial-of-service vector when parsing large files in multipart forms"
}