v0.32.0
📦 huggingface-hub
✨ 7 features · 🐛 1 fix · 🔧 7 symbols
Summary
This release introduces powerful new capabilities for LLM interaction via the Model Context Protocol (MCP) Client and Tiny Agents CLI, alongside support for new inference providers and enhanced dataclass validation with the @strict decorator.
Migration Steps
- To use the new MCP features, install with the optional dependency: `pip install -U huggingface_hub[mcp]`.
✨ New Features
- Introduction of `MCPClient` in `huggingface_hub`, enabling LLMs to interact with external tools via the Model Context Protocol (MCP).
- `MCPClient` extends `InferenceClient` and supports connecting LLMs to local and remote tool servers.
- Addition of a higher-level `Agent` class (Tiny Agents) that simplifies creating conversational agents by managing the chat loop and state, built on top of `MCPClient`.
- New CLI command `tiny-agents run` for executing Agents directly from the command line.
- Support for feature extraction (embeddings) inference with the Nebius provider.
- Introduction of Nscale as an official inference provider.
- New `@strict` decorator for dataclasses to provide robust validation capabilities during initialization and assignment, supporting custom validators and class-wise validation.
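The MCP features above can be sketched as follows. This is a hedged example: `MCPClient`, `add_mcp_server`, and the tool-calling loop are named in the release, but the exact signatures, the stdio server command, and the model id used here are assumptions. Running it requires `pip install -U "huggingface_hub[mcp]"`, a Hugging Face token, and a reachable MCP server, so the client calls are kept inside an unexecuted coroutine.

```python
import asyncio

# Hedged sketch of the new MCP client. Exact signatures, the server
# command, and the model id are assumptions; the coroutine is defined
# but not run because it needs the optional [mcp] extra, an HF token,
# and a live tool server.
async def chat_with_tools(prompt: str) -> None:
    from huggingface_hub import MCPClient  # optional [mcp] extra

    client = MCPClient(model="Qwen/Qwen2.5-72B-Instruct")  # assumed model id
    # Attach a local (stdio) tool server; command/args are placeholders.
    await client.add_mcp_server(
        type="stdio", command="npx", args=["@playwright/mcp@latest"]
    )
    # Stream one assistant turn, letting the model call the attached tools.
    async for chunk in client.process_single_turn_with_tools(
        [{"role": "user", "content": prompt}]
    ):
        print(chunk)

# The input itself is plain OpenAI-style chat messages:
messages = [{"role": "user", "content": "What tools do you have access to?"}]
```

Per the release, the higher-level `Agent` class wraps exactly this kind of loop, managing the chat loop and state on top of `MCPClient`.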
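To make the `@strict` bullet concrete without depending on the library, here is a minimal pure-Python analogue of strict-style dataclass validation: declared field types are checked both at initialization and on later assignment. The `checked` decorator and `TrainingConfig` class are hypothetical illustrations, not the library's API; the real `@strict` decorator additionally supports custom validators and class-wise validation.

```python
from dataclasses import dataclass, fields

def checked(cls):
    # Illustrative analogue of strict-style validation: enforce declared
    # field types in __init__ and on every later attribute assignment.
    cls = dataclass(cls)
    types = {f.name: f.type for f in fields(cls)}
    base_setattr = cls.__setattr__

    def __setattr__(self, name, value):
        expected = types.get(name)
        if isinstance(expected, type) and not isinstance(value, expected):
            raise TypeError(
                f"{name!r} must be {expected.__name__}, "
                f"got {type(value).__name__}"
            )
        base_setattr(self, name, value)

    cls.__setattr__ = __setattr__
    return cls

@checked
class TrainingConfig:
    name: str
    batch_size: int

cfg = TrainingConfig(name="demo", batch_size=8)  # passes validation
cfg.batch_size = 16                              # re-validated on assignment
```

Assigning a wrong type (e.g. `cfg.batch_size = "32"`) raises `TypeError`, which is the behavior the release describes for `@strict`-decorated dataclasses.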
🐛 Bug Fixes
- Fixed compatibility issues with structured outputs across providers by ensuring `InferenceClient` adheres to the OpenAI API spec for structured output in chat completion.
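The fix above means `response_format` payloads now follow the OpenAI chat-completion spec across providers. A hedged sketch: the payload shape below follows that spec, while the `InferenceClient` call is kept in an unexecuted helper because it needs network access and a token; the provider choice, schema name, and schema fields are placeholders.

```python
import json

# OpenAI-spec structured-output request: a JSON-schema response format.
# Schema name and fields are placeholders.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "weather_report",
        "schema": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "temperature_c": {"type": "number"},
            },
            "required": ["city", "temperature_c"],
        },
        "strict": True,
    },
}

def ask_structured(prompt: str):
    # Unexecuted helper: requires network + HF token. Per the release,
    # `InferenceClient` chat completion now honors this payload across
    # providers (exact call details are an assumption here).
    from huggingface_hub import InferenceClient

    client = InferenceClient(provider="nebius")  # assumed provider choice
    out = client.chat_completion(
        messages=[{"role": "user", "content": prompt}],
        response_format=response_format,
    )
    return json.loads(out.choices[0].message.content)

# With a strict schema, the reply content is guaranteed-parseable JSON:
sample_reply = '{"city": "Paris", "temperature_c": 21.5}'
parsed = json.loads(sample_reply)
```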