GHSA-j3wr-m6xh-64hg
Vulnerability from GitHub

Published: 2025-03-20 12:32
Modified: 2025-03-21 17:40

Summary
LlamaIndex Improper Handling of Exceptional Conditions vulnerability

Details

A vulnerability in the LangChainLLM class of the run-llama/llama_index repository, version v0.12.5, allows a Denial of Service (DoS) attack. The stream_complete method runs the underlying LLM on a separate thread and retrieves the result via the get_response_gen method of the StreamingGeneratorCallbackHandler class. If that thread terminates abnormally before _llm.predict executes, no exception handling covers this case, and get_response_gen loops indefinitely. The condition can be triggered by providing an input of an incorrect type, which kills the worker thread while the consuming process keeps running forever.
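The failure mode can be illustrated with a small, self-contained sketch. This is not the actual llama_index source: the handler class, worker function, and names below are hypothetical stand-ins that reproduce the pattern described above, where a consumer generator busy-waits on a token queue and only exits once the producer thread signals completion. If the producer raises before ever reaching the LLM call, that signal never arrives and the consumer spins forever (running this snippet will hang by design).

import queue
import threading
from typing import Generator

class StreamingHandlerSketch:
    """Simplified, illustrative stand-in for a streaming callback handler."""

    def __init__(self) -> None:
        self._token_queue: "queue.Queue[str]" = queue.Queue()
        self._done = threading.Event()

    def on_token(self, token: str) -> None:
        self._token_queue.put(token)

    def on_end(self) -> None:
        self._done.set()

    def get_response_gen(self) -> Generator[str, None, None]:
        # Vulnerable pattern: the loop only exits once _done is set.
        # If the producer thread dies before calling on_end(), this spins forever.
        while True:
            if not self._token_queue.empty():
                yield self._token_queue.get_nowait()
            elif self._done.is_set():
                break

def worker(handler: StreamingHandlerSketch, prompt) -> None:
    # A badly typed input raises here, before any token is produced and
    # before on_end() is ever called -- the exception dies with the thread.
    for token in prompt.split():          # AttributeError if prompt is not a str
        handler.on_token(token)
    handler.on_end()

handler = StreamingHandlerSketch()
threading.Thread(target=worker, args=(handler, 12345), daemon=True).start()
for tok in handler.get_response_gen():    # never terminates: the worker crashed silently
    print(tok)

Version 0.12.6 (see the referenced commit) addresses this exceptional condition. A common mitigation for this kind of producer/consumer pattern is to propagate the worker's exception to the consumer or to wait on the queue with a bounded timeout so a dead producer can be detected instead of busy-waiting forever.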

{
  "affected": [
    {
      "package": {
        "ecosystem": "PyPI",
        "name": "llama_index"
      },
      "ranges": [
        {
          "events": [
            {
              "introduced": "0"
            },
            {
              "fixed": "0.12.6"
            }
          ],
          "type": "ECOSYSTEM"
        }
      ]
    }
  ],
  "aliases": [
    "CVE-2024-12704"
  ],
  "database_specific": {
    "cwe_ids": [
      "CWE-755"
    ],
    "github_reviewed": true,
    "github_reviewed_at": "2025-03-21T17:40:52Z",
    "nvd_published_at": "2025-03-20T10:15:29Z",
    "severity": "HIGH"
  },
  "details": "A vulnerability in the LangChainLLM class of the run-llama/llama_index repository, version v0.12.5, allows for a Denial of Service (DoS) attack. The stream_complete method executes the llm using a thread and retrieves the result via the get_response_gen method of the StreamingGeneratorCallbackHandler class. If the thread terminates abnormally before the _llm.predict is executed, there is no exception handling for this case, leading to an infinite loop in the get_response_gen function. This can be triggered by providing an input of an incorrect type, causing the thread to terminate and the process to continue running indefinitely.",
  "id": "GHSA-j3wr-m6xh-64hg",
  "modified": "2025-03-21T17:40:52Z",
  "published": "2025-03-20T12:32:43Z",
  "references": [
    {
      "type": "ADVISORY",
      "url": "https://nvd.nist.gov/vuln/detail/CVE-2024-12704"
    },
    {
      "type": "WEB",
      "url": "https://github.com/run-llama/llama_index/commit/d1ecfb77578d089cbe66728f18f635c09aa32a05"
    },
    {
      "type": "PACKAGE",
      "url": "https://github.com/run-llama/llama_index"
    },
    {
      "type": "WEB",
      "url": "https://huntr.com/bounties/a0b638fd-21c6-4ba7-b381-6ab98472a02a"
    }
  ],
  "schema_version": "1.4.0",
  "severity": [
    {
      "score": "CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H",
      "type": "CVSS_V3"
    }
  ],
  "summary": "LlamaIndex Improper Handling of Exceptional Conditions vulnerability"
}
