2026-04-06 16:47:02
GitHub_M
PUBLISHED
LiteLLM is a proxy server (AI Gateway) to call LLM APIs in OpenAI (or native) format. Prior to 1.83.0, when JWT authentication is enabled (enable_jwt_auth: true), the OIDC userinfo cache uses token[:20] as the cache key. Because a JWT begins with its base64url-encoded header, tokens produced with the same signing algorithm share an identical first 20 characters. This configuration option is not enabled by default, so most instances are not affected. An unauthenticated attacker can craft a token whose first 20 characters match a legitimate user's cached token; on a cache hit, the attacker inherits that user's identity and permissions. This affects deployments with JWT/OIDC authentication enabled. Fixed in v1.83.0.
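A minimal sketch of why the 20-character keys collide, assuming standard JWT encoding (the helper function and claim values below are illustrative, not LiteLLM's actual cache code, which simply slices the raw bearer token):

```python
import base64
import json

def cache_key(header: dict, payload: dict) -> str:
    """Build an unsigned JWT-shaped string and return its first 20
    characters, mimicking the flawed token[:20] cache key."""
    b64 = lambda obj: base64.urlsafe_b64encode(
        json.dumps(obj, separators=(",", ":")).encode()
    ).rstrip(b"=").decode()
    return (b64(header) + "." + b64(payload))[:20]

# Both tokens use the same signing algorithm, so they share the same
# base64url-encoded header -- which alone is longer than 20 characters.
header = {"alg": "RS256", "typ": "JWT"}
legit_key = cache_key(header, {"sub": "alice"})
attacker_key = cache_key(header, {"sub": "mallory"})

print(legit_key == attacker_key)  # True: the cache keys collide
```

The header {"alg":"RS256","typ":"JWT"} encodes to 36 base64url characters, so any slice of the first 20 characters never reaches the payload and is identical for every token signed with that algorithm.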