Mistral Client
AI & LLMs
Python client library for the Mistral AI platform
Release History
v2.0.0b1: This release was generated by the Speakeasy CLI, updating the Python SDK to version 2.0.0b1 based on the provided OpenAPI specification.
v1.12.4 (Breaking): Version 1.12.4 introduces numerous breaking changes across the Python SDK, primarily affecting the request and response schemas for beta conversation, agent, chat, FIM, classifier, OCR, and audio transcription endpoints.
v1.12.3 (Breaking, 12 features): This release introduces numerous breaking changes to request and response schemas across fine-tuning, model management, batch jobs, and beta agent endpoints. It also adds several new fields related to version messages and search/ordering capabilities in beta agent and batch job endpoints.
v1.12.2: This release updates the Python SDK to version 1.12.2, generated based on OpenAPI Doc 1.0.0 and Speakeasy CLI 1.685.0 (2.794.1).
v1.12.1: This release primarily reflects an update based on OpenAPI Doc 1.0.0 and Speakeasy CLI 1.685.0 (2.794.1), resulting in Python library version v1.12.1.
v1.12.0: This release updates the Python SDK to version v1.12.0, generated using the latest Speakeasy tooling.
v1.11.1 (3 features): This release introduces new functionality for managing agent versions via mistral.beta.agents and adds a new prompt parameter to the mistral.ocr.process function.
v1.10.1 (Breaking, 8 features): This release introduces significant breaking changes to the structure of inputs and outputs for several beta conversation endpoints. It also adds metadata support to various endpoints and output fields to batch job operations.
v1.10.0 (Breaking, 11 features): Mistral AI SDK v1.10.0 introduces several new features, such as metadata support and agent deletion, but requires significant updates due to numerous breaking changes in API response structures and request parameters across various endpoints.
v1.9.11: This release updates the Python SDK to version 1.9.11, generated based on OpenAPI Doc 1.0.0 and Speakeasy CLI 1.606.10.
v1.9.10: This release (v1.9.10) was generated based on an updated OpenAPI specification and Speakeasy CLI version 1.568.2 (2.634.2).
v1.9.9: This release appears to be an automated generation update based on the OpenAPI specification and Speakeasy CLI version 1.568.2 (2.634.2), resulting in Python library version 1.9.9.
v1.9.8: This release updates the Python SDK to version 1.9.8, generated based on the latest OpenAPI specification and Speakeasy CLI version 1.568.2.
v1.9.7: This release was automatically generated based on an OpenAPI specification update using Speakeasy CLI version 1.568.2 (2.634.2). The Python SDK has been updated to version 1.9.7.
v1.9.6: This release (v1.9.6) was generated based on an updated OpenAPI specification and Speakeasy CLI version 1.568.2 (2.634.2).
v1.9.3: This release (v1.9.3) was generated based on an updated OpenAPI specification and Speakeasy CLI version 1.568.2 (2.634.2).
v1.9.2: This release generated new Python SDK artifacts (v1.9.2) based on updated OpenAPI specifications and Speakeasy CLI tools.
v1.9.1: This release (v1.9.1) was automatically generated based on the OpenAPI specification and Speakeasy CLI version 1.568.2 (2.634.2).
v1.8.2: This release appears to be an automated generation update based on OpenAPI documentation and Speakeasy CLI version 1.517.3 (2.548.6), resulting in Python SDK version v1.8.2.
v1.8.1: This release appears to be an automated generation update based on OpenAPI documentation and Speakeasy CLI version 1.517.3 (2.548.6), resulting in Python SDK version v1.8.1.
v1.8.0: This release was generated automatically based on an updated OpenAPI specification and Speakeasy CLI version 1.517.3. It corresponds to Python SDK version 1.8.0.
v1.7.1: This release was automatically generated based on an updated OpenAPI specification and Speakeasy CLI version 1.517.3 (2.548.6), resulting in Python SDK version 1.7.1.
v1.7.0: This release was automatically generated based on an updated OpenAPI specification and Speakeasy CLI version 1.517.3 (2.548.6). The Python SDK has been updated to version 1.7.0.
v1.6.0: This release was automatically generated based on an updated OpenAPI specification and Speakeasy CLI version 1.517.3 (2.548.6).
v1.5.2: This release was generated automatically based on an updated OpenAPI specification and Speakeasy CLI version 1.477.0 (2.497.0).
v1.5.2-rc.1: This release appears to be an automated generation update based on OpenAPI documentation and Speakeasy CLI version 1.517.3 (2.548.6), resulting in Python library version 1.5.2-rc.1.
v1.5.1: This release was automatically generated based on an updated OpenAPI specification and Speakeasy CLI version 1.477.0 (2.497.0).
v1.5.0: This release was automatically generated by Speakeasy CLI version 1.476.2 (2.495.1) based on an OpenAPI document, resulting in Python SDK version 1.5.0.
v1.4.0 (Breaking, 1 feature): This release introduces a breaking change by renaming the `bytes` field to `size_bytes` across several file-related output classes and adds a new `prediction` argument.
v1.3.1: This release was generated by the Speakeasy CLI, updating the Python SDK to version 1.3.1 based on the latest OpenAPI specification.
v1.3.0: This release appears to be an automated generation update based on OpenAPI documentation and the Speakeasy CLI, resulting in Python library version v1.3.0.
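The v1.4.0 rename of `bytes` to `size_bytes` can be absorbed with a small compatibility shim while code is migrated. This is a hedged sketch, not part of the SDK itself: it reads whichever attribute the installed version's file-output objects expose.

```python
def file_size(file_obj):
    """Return a file object's size in bytes across SDK versions.

    mistralai v1.4.0 renamed the `bytes` field to `size_bytes` on several
    file-related output classes; this helper checks the new name first and
    falls back to the old one, so callers work on either side of the change.
    """
    size = getattr(file_obj, "size_bytes", None)
    if size is None:
        size = getattr(file_obj, "bytes", None)
    return size
```

Once all environments are on v1.4.0 or later, the fallback branch can be dropped and call sites can use `size_bytes` directly.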
Common Errors
ModuleNotFoundError (1 report): A "ModuleNotFoundError" in mistral-client usually means the package has not been installed in the Python environment you are running. Fix this by installing the package with `pip install mistralai` and, if you use virtual environments, making sure the correct one is activated. If the problem persists, double-check the spelling of the import statement and confirm the package is discoverable on your Python path.
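A quick way to confirm which interpreter is running and whether `mistralai` is visible to it is the standard-library `importlib` check below. This is a generic diagnostic sketch, not part of the SDK:

```python
import importlib.util
import sys


def diagnose_import(package: str = "mistralai") -> str:
    """Report whether `package` is importable by the current interpreter.

    ModuleNotFoundError is almost always an environment mismatch: the package
    was installed into one interpreter but the script runs under another.
    Printing sys.executable alongside the result makes the mismatch visible.
    """
    if importlib.util.find_spec(package) is not None:
        return f"'{package}' is installed for {sys.executable}"
    return (
        f"'{package}' is NOT installed for {sys.executable}; "
        f"run: {sys.executable} -m pip install {package}"
    )


print(diagnose_import())
```

Using `python -m pip install mistralai` (rather than a bare `pip`) guarantees the install targets the same interpreter that will do the importing.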
SDKError (1 report): The reported "SDKError" issue in mistral-client arises when the error object, which may contain unhashable types such as lists or dictionaries, is used as a key in a dictionary or as a member of a set, both of which require hashable objects. To fix this, either convert the relevant parts of the error object into hashable types (e.g., tuples instead of lists) or, if the error object itself is being used as a key, extract a unique hashable identifier (such as an error code or message string) to use as the key instead. If you key on a simple identifier, make sure the code handles potential collisions.
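One way to apply this fix is to derive a tuple key from stable scalar fields of the error instead of hashing the error object itself. The sketch below is illustrative: the attribute names `status_code` and `message` are assumptions about the error's shape, not a guaranteed SDK contract, which is why `getattr` with defaults is used.

```python
def error_key(err) -> tuple:
    """Build a hashable dict/set key from an SDK error object.

    Error objects may carry unhashable payloads (lists, dicts), so we extract
    stable scalar fields rather than hashing the object. The attribute names
    used here are illustrative and read defensively via getattr.
    """
    return (
        type(err).__name__,
        getattr(err, "status_code", None),
        str(getattr(err, "message", "")),
    )


def tally_errors(errors) -> dict:
    """Count occurrences of distinct errors without hashing the objects."""
    counts = {}
    for err in errors:
        key = error_key(err)
        counts[key] = counts.get(key, 0) + 1
    return counts
```

Two errors with the same class, status code, and message collapse to one key here; include more fields in the tuple if that collision behavior is too coarse for your use case.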
Related AI & LLMs Packages
AutoGPT: AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
Ollama: Get up and running with OpenAI gpt-oss, DeepSeek-R1, Gemma 3 and other models.
LangChain: 🦜🔗 The platform for reliable agents.
ComfyUI: The most powerful and modular diffusion model GUI, API, and backend with a graph/nodes interface.
llama.cpp: LLM inference in C/C++.
GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.