GHSA-2c2j-9gv5-cj73
Vulnerability from GitHub
Published: 2025-07-21 19:34
Modified: 2025-07-21 22:21
Summary
Starlette has a possible denial-of-service vector when parsing large files in multipart forms
Details

Summary

When parsing a multipart form with large files (greater than the default max spool size, see https://github.com/encode/starlette/blob/fa5355442753f794965ae1af0f87f9fec1b9a3de/starlette/formparsers.py#L126), starlette rolls the file over to disk on the event loop thread. This blocks the event loop, which means the server cannot accept new connections while the rollover is in progress.
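
For context, the blocking step is the standard library's spool rollover, which happens synchronously inside `write()`. Here is a minimal sketch of that behavior, assuming a 1 MB threshold like starlette's default spool size (`_rolled` is a CPython implementation detail):

```python
from tempfile import SpooledTemporaryFile

# Below max_size, writes go to an in-memory buffer.
f = SpooledTemporaryFile(max_size=1024 * 1024)
f.write(b"x" * 1024)
assert not f._rolled

# The write that crosses max_size copies the buffer to a real file on
# disk, synchronously, inside write() itself; this is the blocking step.
f.write(b"x" * (2 * 1024 * 1024))
assert f._rolled
f.close()
```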

Details

Please see this discussion for details: https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403. In summary, the following `UploadFile` code (copied from https://github.com/encode/starlette/blob/fa5355442753f794965ae1af0f87f9fec1b9a3de/starlette/datastructures.py#L436C5-L447C14) has a minor bug: instead of only checking `self._in_memory`, it should also check whether the additional bytes will cause a rollover.

```python
@property
def _in_memory(self) -> bool:
    # check for SpooledTemporaryFile._rolled
    rolled_to_disk = getattr(self.file, "_rolled", True)
    return not rolled_to_disk

async def write(self, data: bytes) -> None:
    if self.size is not None:
        self.size += len(data)

    if self._in_memory:
        # Bug: if this write pushes the spool past its max size,
        # SpooledTemporaryFile rolls over to disk synchronously here,
        # blocking the event loop.
        self.file.write(data)
    else:
        await run_in_threadpool(self.file.write, data)
```

I have already created a PR which fixes the problem: https://github.com/encode/starlette/pull/2962
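
The shape of the fix is to route a write that would trigger the rollover through the threadpool as well. A minimal sketch of that idea (not the exact patched code, see the PR for the real change; `_will_roll` is a hypothetical helper, and `_max_size` is a CPython `SpooledTemporaryFile` implementation detail):

```python
async def write(self, data: bytes) -> None:
    new_data_len = len(data)
    if self.size is not None:
        self.size += new_data_len

    if self._in_memory and not self._will_roll(new_data_len):
        self.file.write(data)
    else:
        # Already rolled to disk, or this write would trigger the rollover:
        # run the potentially blocking write off the event loop.
        await run_in_threadpool(self.file.write, data)

def _will_roll(self, size_to_add: int) -> bool:
    # Hypothetical helper (not starlette's actual code): True if writing
    # size_to_add more bytes would push the spool past its threshold.
    max_size = getattr(self.file, "_max_size", 0)
    return bool(max_size) and self.file.tell() + size_to_add > max_size
```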

PoC

See the discussion at https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403 for steps on how to reproduce.
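
As an illustration only (this is not the advisory's PoC; the actual steps are in the linked discussion), a minimal endpoint of the affected shape looks like this, assuming `starlette` and `python-multipart` are installed:

```python
from starlette.applications import Starlette
from starlette.responses import PlainTextResponse
from starlette.routing import Route


async def upload(request):
    # On affected versions, parsing the form calls UploadFile.write() on
    # the event loop; the write that crosses the spool threshold rolls
    # the file to disk synchronously, stalling all other connections.
    form = await request.form()
    upload_file = form["file"]
    return PlainTextResponse(f"received {upload_file.filename}")


app = Starlette(routes=[Route("/upload", upload, methods=["POST"])])
```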

Impact

To be honest, the impact is very low and not many users will be affected. Parsing large forms is already CPU intensive, so the additional I/O block doesn't slow starlette down much on systems with modern HDDs/SSDs. If someone is running on tape, they might see a greater impact.



```json
{
  "affected": [
    {
      "package": {
        "ecosystem": "PyPI",
        "name": "starlette"
      },
      "ranges": [
        {
          "events": [
            {
              "introduced": "0"
            },
            {
              "fixed": "0.47.2"
            }
          ],
          "type": "ECOSYSTEM"
        }
      ]
    }
  ],
  "aliases": [
    "CVE-2025-54121"
  ],
  "database_specific": {
    "cwe_ids": [
      "CWE-770"
    ],
    "github_reviewed": true,
    "github_reviewed_at": "2025-07-21T19:34:23Z",
    "nvd_published_at": "2025-07-21T20:15:41Z",
    "severity": "MODERATE"
  },
  "details": "### Summary\nWhen parsing a multi-part form with large files (greater than the [default max spool size](https://github.com/encode/starlette/blob/fa5355442753f794965ae1af0f87f9fec1b9a3de/starlette/formparsers.py#L126)) `starlette` will block the main thread to roll the file over to disk. This blocks the event thread which means we can\u0027t accept new connections.\n\n### Details\nPlease see this discussion for details: https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403. In summary the following UploadFile code (copied from [here](https://github.com/encode/starlette/blob/fa5355442753f794965ae1af0f87f9fec1b9a3de/starlette/datastructures.py#L436C5-L447C14)) has a minor bug. Instead of just checking for `self._in_memory` we should also check if the additional bytes will cause a rollover.\n\n```python\n\n    @property\n    def _in_memory(self) -\u003e bool:\n        # check for SpooledTemporaryFile._rolled\n        rolled_to_disk = getattr(self.file, \"_rolled\", True)\n        return not rolled_to_disk\n\n    async def write(self, data: bytes) -\u003e None:\n        if self.size is not None:\n            self.size += len(data)\n\n        if self._in_memory:\n            self.file.write(data)\n        else:\n            await run_in_threadpool(self.file.write, data)\n```\n\nI have already created a PR which fixes the problem: https://github.com/encode/starlette/pull/2962\n\n\n### PoC\nSee the discussion [here](https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403) for steps on how to reproduce.\n\n### Impact\nTo be honest, very low and not many users will be impacted. Parsing large forms is already CPU intensive so the additional IO block doesn\u0027t slow down `starlette` that much on systems with modern HDDs/SSDs. If someone is running on tape they might see a greater impact.",
  "id": "GHSA-2c2j-9gv5-cj73",
  "modified": "2025-07-21T22:21:05Z",
  "published": "2025-07-21T19:34:23Z",
  "references": [
    {
      "type": "WEB",
      "url": "https://github.com/encode/starlette/security/advisories/GHSA-2c2j-9gv5-cj73"
    },
    {
      "type": "ADVISORY",
      "url": "https://nvd.nist.gov/vuln/detail/CVE-2025-54121"
    },
    {
      "type": "WEB",
      "url": "https://github.com/encode/starlette/commit/9f7ec2eb512fcc3fe90b43cb9dd9e1d08696bec1"
    },
    {
      "type": "PACKAGE",
      "url": "https://github.com/encode/starlette"
    },
    {
      "type": "WEB",
      "url": "https://github.com/encode/starlette/blob/fa5355442753f794965ae1af0f87f9fec1b9a3de/starlette/datastructures.py#L436C5-L447C14"
    },
    {
      "type": "WEB",
      "url": "https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403"
    }
  ],
  "schema_version": "1.4.0",
  "severity": [
    {
      "score": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:L",
      "type": "CVSS_V3"
    }
  ],
  "summary": "Starlette has possible denial-of-service vector when parsing large files in multipart forms"
}
```

