Multiple critical vulnerabilities have been discovered in vLLM, affecting users who rely on it for large language model inference and serving. These vulnerabilities include remote code execution (RCE), server-side request forgery (SSRF), and denial-of-service (DoS) risks. Users are advised to upgrade to vLLM version 0.18.0 or later to mitigate these risks.
These vulnerabilities range from medium to critical, potentially allowing denial of service or remote code execution.
What is vLLM?
vLLM is an open-source, high-throughput inference and serving engine for large language models, widely used to deploy LLMs behind OpenAI-compatible APIs.
CVE-2026-22773: vLLM DoS Vulnerability in Idefics3 Vision Models
Medium severity: DoS by crashing the engine.
EPSS score of 0.021 indicates a low probability of exploitation.
A denial-of-service vulnerability exists in vLLM's Idefics3 vision model implementation. Sending a specially crafted 1x1 pixel image can cause a tensor dimension mismatch, leading to an unhandled runtime error and server termination.
How to fix CVE-2026-22773 in vLLM
Patch within 24 hours:
1. Upgrade vLLM to version 0.12.0 or later:
pip install --upgrade vllm
Workaround: implement input validation that checks image dimensions and handles runtime errors during image processing.
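The workaround above can be sketched as a pre-flight check in front of the server. This is a minimal illustration, not vLLM's own validation; the minimum-size threshold is an assumption chosen because patch-based vision preprocessors need images large enough to yield at least one patch.

```python
# Illustrative minimum dimensions (assumed, not taken from vLLM itself):
# a degenerate input such as a 1x1 image can trigger a tensor dimension
# mismatch in patch-based vision preprocessing.
MIN_WIDTH = 28
MIN_HEIGHT = 28

def validate_image_dimensions(width: int, height: int) -> None:
    """Reject degenerate images (e.g. 1x1 pixels) before they reach the model."""
    if width < MIN_WIDTH or height < MIN_HEIGHT:
        raise ValueError(
            f"image too small ({width}x{height}); "
            f"minimum accepted is {MIN_WIDTH}x{MIN_HEIGHT}"
        )
```

Calling this on decoded image dimensions before enqueueing the request turns the crash into a clean 4xx-style rejection.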
NextGuard automatically flags CVE-2026-22773 if vLLM appears in any of your monitored projects — no manual lookup required.
CVE-2026-24779: vLLM SSRF Vulnerability through MediaConnector
High severity: SSRF leading to internal network access.
EPSS score of 0.016 indicates a low probability of exploitation.
A Server-Side Request Forgery (SSRF) vulnerability exists in vLLM's `MediaConnector` class. Discrepancies in URL parsing allow attackers to bypass hostname restrictions and coerce the vLLM server into making arbitrary requests to internal network resources.
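As a defense-in-depth measure alongside the upgrade, media URLs can be filtered before the server fetches them. The sketch below is an assumed external filter, not the patched `MediaConnector` logic: it rejects non-HTTP(S) schemes and literal private, loopback, link-local, and reserved IPs. A production filter should also resolve hostnames and re-validate the resolved addresses, since DNS can point a public name at an internal IP.

```python
import ipaddress
from urllib.parse import urlsplit

ALLOWED_SCHEMES = {"http", "https"}

def is_safe_media_url(url: str) -> bool:
    """Conservatively reject URLs that could target internal network resources."""
    parts = urlsplit(url)
    if parts.scheme not in ALLOWED_SCHEMES:
        return False
    host = parts.hostname
    if not host:
        return False
    try:
        addr = ipaddress.ip_address(host)
    except ValueError:
        # Not a literal IP: the hostname must be resolved and the resolved
        # addresses re-checked before fetching.
        return True
    return not (
        addr.is_private or addr.is_loopback
        or addr.is_link_local or addr.is_reserved
    )
```

Note that link-local filtering also blocks the classic cloud metadata endpoint (169.254.169.254), a common SSRF target.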
How to fix CVE-2026-24779 in vLLM
Patch within 24 hours:
1. Upgrade vLLM to version 0.14.1 or later:
pip install --upgrade vllm
NextGuard automatically flags CVE-2026-24779 if vLLM appears in any of your monitored projects — no manual lookup required.
CVE-2026-22778: vLLM RCE Vulnerability in Video Processing
Critical severity: Remote code execution.
EPSS score of 0.084 indicates a moderate probability of exploitation.
A chain of vulnerabilities in vLLM allows for Remote Code Execution (RCE). A PIL error message leaks memory addresses, bypassing ASLR, and a heap overflow in the JPEG2000 decoder in OpenCV/FFmpeg allows hijacking code execution.
How to fix CVE-2026-22778 in vLLM
Patch immediately:
1. Upgrade vLLM to version 0.14.1 or later:
pip install --upgrade vllm
Workaround: do not serve video models, or disable the `video_url` content part in the API.
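The `video_url` workaround can be enforced outside vLLM itself, for example in a proxy or middleware in front of the server. The helper below is a hypothetical sketch that strips `video_url` content parts from OpenAI-style chat messages so crafted videos never reach the decoder chain.

```python
def strip_video_parts(messages: list[dict]) -> list[dict]:
    """Drop `video_url` content parts from OpenAI-style chat messages.

    Intended as a request filter in a proxy/middleware layer; string
    content and other part types pass through unchanged.
    """
    cleaned = []
    for msg in messages:
        content = msg.get("content")
        if isinstance(content, list):
            msg = {**msg, "content": [
                part for part in content
                if part.get("type") != "video_url"
            ]}
        cleaned.append(msg)
    return cleaned
```

Rejecting such requests outright (instead of silently dropping the part) is equally valid, and makes the block visible to clients.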
NextGuard automatically flags CVE-2026-22778 if vLLM appears in any of your monitored projects — no manual lookup required.
CVE-2026-27893: vLLM Hardcoded Trust Override Enables RCE
High severity: RCE despite explicit user opt-out.
EPSS score of 0.033 indicates a low probability of exploitation.
Two model implementation files in vLLM hardcode `trust_remote_code=True`, bypassing the user's explicit `--trust-remote-code=False` security opt-out. This enables remote code execution via malicious model repositories.
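After upgrading, it is cheap to verify that no installed model code still hardcodes the override. The audit below is an assumed quick check (not an official vLLM tool) that you can point at an installed tree, e.g. the `site-packages/vllm` directory.

```python
from pathlib import Path

def find_hardcoded_trust(root: str) -> list[str]:
    """List Python files under `root` that hardcode trust_remote_code=True.

    A coarse textual scan: spaces are stripped so `trust_remote_code = True`
    also matches. Useful to confirm the --trust-remote-code=False opt-out
    is no longer overridden anywhere in the installed package.
    """
    hits = []
    for path in Path(root).rglob("*.py"):
        try:
            text = path.read_text(encoding="utf-8", errors="ignore")
        except OSError:
            continue
        if "trust_remote_code=True" in text.replace(" ", ""):
            hits.append(str(path))
    return sorted(hits)
```

An empty result after the 0.18.0 upgrade is the expected outcome; any hit warrants manual review.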
How to fix CVE-2026-27893 in vLLM
Patch immediately:
1. Upgrade vLLM to version 0.18.0 or later:
pip install --upgrade vllm
NextGuard automatically flags CVE-2026-27893 if vLLM appears in any of your monitored projects — no manual lookup required.
CVE-2026-34760: vLLM Audio Downmix Implementation Differences
Medium severity: Inconsistent audio processing.
EPSS score is not available for this CVE.
A discrepancy exists in vLLM's audio downmixing implementation. Librosa defaults to using numpy.mean for mono downmixing, while the ITU-R BS.775-4 standard specifies a weighted downmixing algorithm, leading to inconsistencies between audio heard by humans and audio processed by AI models.
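The discrepancy is easy to see side by side. The comparison below contrasts an unweighted channel average (what `librosa.to_mono` effectively does) with an ITU-style weighted downmix for 5.1 audio; the exact gains and normalization here are illustrative assumptions following the ITU-R BS.775 pattern, not vLLM's implementation.

```python
import numpy as np

def mean_downmix(channels: np.ndarray) -> np.ndarray:
    """Unweighted channel average, as in numpy.mean-based mono downmixing."""
    return channels.mean(axis=0)

def itu_style_downmix(channels: np.ndarray) -> np.ndarray:
    """Illustrative weighted downmix for 5.1 (L, R, C, LFE, Ls, Rs) audio.

    Center and surround channels are scaled by -3 dB and the LFE channel
    is discarded, following the ITU-R BS.775 pattern.
    """
    l, r, c, _lfe, ls, rs = channels
    k = 1 / np.sqrt(2)  # -3 dB
    left = l + k * c + k * ls
    right = r + k * c + k * rs
    return (left + right) / 2
```

For a tone carried only on the center channel, the unweighted average attenuates it to 1/6 of its level while the weighted downmix keeps it near -3 dB, which is the kind of gap an attacker could exploit to make a model "hear" different audio than a human listener.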
How to fix CVE-2026-34760 in vLLM
Patch within 7 days:
1. Upgrade vLLM to version 0.18.0 or later:
pip install --upgrade vllm
NextGuard automatically flags CVE-2026-34760 if vLLM appears in any of your monitored projects — no manual lookup required.
Stay ahead of Python vulnerabilities
Proactively detect and remediate vulnerabilities in your Python projects. Start monitoring your Python dependencies today.
Frequently asked questions
Multiple vulnerabilities have been identified in vLLM, highlighting the importance of keeping your dependencies up to date. Regularly review Python vulnerability advisories and apply security patches to protect your systems. Prioritize upgrading to vLLM version 0.18.0 or later to address these issues.