Error · 5 reports
Fix NotFoundError in LiteLLM
✅ Solution
The "NotFoundError" in LiteLLM often arises from an incorrectly specified model name or an issue with the provider configuration. Double-check that the `model` parameter matches exactly what's expected by the provider (including any provider prefixes like "bedrock/"). Verify your provider's authentication and configuration are set correctly using `litellm.api_key` or appropriate environment variables as described in the LiteLLM documentation for the specific provider.
Related Issues
Real GitHub issues where developers encountered this error:
- I can't get even a simple connection to a locally running vllm to work (Jan 8, 2026)
- [Bug]: vertex-ai model call fails (Jan 7, 2026)
- [Bedrock] Application Inference Profile ARNs fail with "Unknown provider" unless using bedrock/converse/ route explicitly (Dec 19, 2025)
- [Bug]: vLLM Batch/Files API returns 404 - LiteLLM attempts to call non-existent upstream /v1/files endpoint (Dec 18, 2025)
- [Bug]: Files API does not work with vLLM provider (Dec 15, 2025)
Timeline
First reported: Dec 15, 2025
Last reported: Jan 8, 2026