MemGPT

Letta is the platform for building stateful agents: open AI with advanced memory that can learn and self-improve over time.

Latest: 0.16.5 · 100 releases · 5 breaking changes · 16 common errors · View on GitHub

Release History

0.16.5
Feb 24, 2026

This release bumps the version to 0.16.5. It appears to be a maintenance or chore release.

0.16.4 · 1 fix
Jan 29, 2026

This release primarily consists of maintenance updates, including updating GitHub templates and bumping the version from 0.16.2 to 0.16.4.

0.16.2
Jan 12, 2026

This release primarily focuses on documentation updates, including corrections to the README and contributing guides, and bumps the version to 0.16.2.

0.16.1 · 1 fix
Dec 18, 2025

This patch release (v0.16.1) primarily fixes an issue with the provider name configuration for openai-proxy in LLMConfig.

0.16.0 · 1 fix
Dec 15, 2025

This release bumps the version to v0.16.0 and includes an update to the readme and architecture-specific OTEL installation logic.

0.15.1
Nov 26, 2025

This release bumps the version to 0.15.1.

0.15.0 · 1 feature
Nov 25, 2025

Version 0.15.0 introduces context window support for grok-4 models. This release focuses on expanding model capabilities.

0.14.0
Nov 14, 2025

This release bumps the version number to 0.14.0.

0.13.0 · 1 feature
Oct 24, 2025

Version 0.13.0 introduces Haiku 4.5 as a new reasoning model and includes documentation cleanup.

0.12.1 · Breaking · 2 fixes · 11 features
Oct 9, 2025

This release introduces the major `letta_v1_agent` architecture, offering broader provider compatibility and simpler control flow, alongside new features like Human-in-the-Loop and Parallel Tool Calling. It also deprecates and renames several older API endpoints.

0.12.0
Oct 9, 2025

No release notes provided.

0.11.7 · 2 fixes · 17 features
Sep 4, 2025

This release introduces significant Human-in-the-Loop (HITL) support for tool execution and upgrades the Agent File schema to v2. It also brings substantial improvements to archival memory search capabilities and enhances model support for GPT-5, DeepSeek, and Anthropic.

0.11.6
Aug 27, 2025

No release notes provided.

0.11.5 · 1 feature
Aug 26, 2025

This release introduces background mode support for message streaming, enhancing asynchronous operations.

0.11.4 · 2 features
Aug 20, 2025

This release deprecates legacy paths for azure and together, introduces an asyncio shield for stream timeouts, and adds step metrics recording.

0.11.3 · 1 feature
Aug 12, 2025

Version 0.11.3 primarily involves refactoring by moving dictconfig out of the getlogger function.

0.11.2 · 2 fixes · 1 feature
Aug 8, 2025

This release introduces a new max_steps parameter for agent export and fixes two bugs related to the Ollama provider, specifically concerning the embeddings endpoint URL and model type returns.

0.11.1 · Breaking · 1 fix · 2 features
Aug 8, 2025

This release introduces support for new LLM models like Claude Opus 4.1 and GPT-5, and enhances built-in tools by improving memory reliability and paginating the file grep tool.

0.11.0 · Breaking · 1 fix · 2 features
Aug 7, 2025

This release fully removes legacy clients, introduces Signoz tracing integration, optimizes Jinja templating performance, and raises the minimum required Python version to 3.11.

0.10.0 · Breaking · 5 fixes · 8 features
Aug 1, 2025

This release introduces the `LettaPing` message for stable long streaming connections and adds support for OAuth MCP providers. It also defaults agents to the new `memgpt_v2_agent` architecture and includes various performance improvements and bug fixes.

0.9.1 · 1 fix · 1 feature
Jul 27, 2025

This release focuses on bug fixes for the filesystem feature and introduces asynchronous rendering for jinja templates in core routes, alongside adding an agent tag reverse index.

0.9.0 · 3 features
Jul 24, 2025

Version 0.9.0 introduces the Letta Filesystem for enhanced document management and context control, along with new options for document parsing via Mistral OCR.

0.8.17
Jul 21, 2025

This release bumps the version to v0.8.17, primarily consisting of maintenance chores.

0.8.16
Jul 21, 2025

This release bumps the version to v0.8.16, primarily consisting of maintenance chores.

0.8.15
Jul 15, 2025

This release bumps the version to v0.8.15, primarily consisting of maintenance chores.

0.8.14
Jul 14, 2025

This release bumps the version number to 0.8.14.

0.8.13
Jul 10, 2025

This release bumps the version number from 0.8.12 to 0.8.13.

0.8.12
Jul 9, 2025

This release bumps the version to 0.8.12 and adds tracing capabilities to the polling mechanism.

0.8.11
Jul 8, 2025

This release bumps the internal letta version to 0.8.11.

0.8.10
Jul 7, 2025

This release (0.8.10) is a maintenance update, likely containing minor fixes or internal changes, as indicated by the 'chore' tag.

0.8.9 · 1 fix · 4 features
Jul 3, 2025

This release introduces support for different summarization providers, agent loop cancellation, and MCP custom headers, alongside various bug fixes.

0.8.8 · 3 fixes · 2 features
Jun 29, 2025

This release introduces new Feedback APIs for steps and includes several bug fixes, such as improvements to Anthropic streaming and agent timezone updates.

0.8.7 · 1 fix · 3 features
Jun 27, 2025

This release introduces new react and workflow agents, adds step feedback, and includes fixes for batch agents. Some existing tools have also been deprecated.

0.8.6 · 1 fix · 1 feature
Jun 25, 2025

This release introduces custom timezone support and fixes a bug in the bedrock integration.

0.8.5
Jun 19, 2025

No release notes provided.

0.8.4 · 2 fixes · 2 features
Jun 15, 2025

This release introduces streamable HTTP support and authenticated MCP support, alongside fixes for OpenAI o-series models and Anthropic streaming.

0.8.3 · 2 fixes
Jun 7, 2025

This release includes fixes for issues found in the stdio and sse MCP connectors.

0.8.2
Jun 6, 2025

No release notes provided.

0.8.1
Jun 6, 2025

No release notes provided.

0.8.0 · 2 fixes
Jun 4, 2025

This release includes fixes for Ollama integration and token counting logic, alongside the official deprecation of legacy clients.

0.7.29 · 2 features
Jun 1, 2025

Version 0.7.29 introduces new configuration options for batch size and lookback settings.

0.7.28 · 1 feature
May 28, 2025

Version 0.7.28 introduces performance improvements. This release primarily focuses on internal optimizations.

0.7.27 · 1 fix
May 25, 2025

This release bumps the version to 0.7.27 and incorporates various bugfixes.

0.7.26 · 1 fix
May 23, 2025

This release primarily focuses on an asynchronous fix for batch sandbox creation.

0.7.25
May 23, 2025

This release bumps the version number from 0.7.24 to 0.7.25.

0.7.24 · 1 fix
May 23, 2025

This minor release primarily addresses a bug fix related to database initialization.

0.7.23
May 23, 2025

This release bumps the version number from 0.7.22 to 0.7.23 as a routine maintenance chore.

0.7.22
May 23, 2025

This release bumps the version number from 0.7.21 to 0.7.22.

0.7.21 · 1 feature
May 21, 2025

Version 0.7.21 introduces a minor enhancement by passing the MCP environment variable to the STDIO MCP client.

0.7.20 · 1 feature
May 17, 2025

This release introduces Node.js support for node-based MCPs. It primarily focuses on adding new functionality rather than fixing existing issues or introducing breaking changes.

0.7.19
May 16, 2025

This release primarily involves bumping the version number to 0.7.19.

0.7.18 · 1 fix
May 16, 2025

Version 0.7.18 addresses an asynchronous size bug and bumps the version number.

0.7.17
May 16, 2025

This release primarily consists of maintenance chores, including updating the version to v0.7.17 and adjusting CI configurations for integration tests.

0.7.16 · 2 fixes
May 15, 2025

Version 0.7.16 primarily addresses two specific bugs concerning Anthropic streaming and SleepTimeMultiAgent tool usage.

0.7.15 · 1 fix
May 13, 2025

Version 0.7.15 primarily addresses a bug in the Docker Compose configuration related to the external database setup.

0.7.14
May 13, 2025

This release (0.7.14) is a maintenance chore release, likely containing minor internal updates or dependency bumps.

0.7.13
May 10, 2025

This release (0.7.13) is a maintenance chore release, likely containing minor internal updates or dependency bumps.

0.7.12
May 8, 2025

This release primarily bumps the version number to 0.7.12.

0.7.11
May 7, 2025

This release bumps the version to v0.7.11.

0.7.10 · 2 fixes
May 6, 2025

This is a bugfix release addressing issues found in Gemini Vertex integration and summarization capabilities.

0.7.9
May 2, 2025

This release primarily bumps the version number to 0.7.9.

0.7.8 · 1 fix
May 1, 2025

This release primarily contains a bug fix ensuring in-context messages are correctly trimmed to the cutoff. The version has been bumped to 0.7.8.

0.7.7
Apr 30, 2025

No release notes provided.

0.7.6 · 2 fixes · 1 feature
Apr 29, 2025

This release includes fixes for Anthropic streaming and Gemini model retries, along with new support for custom tool execution environments.

0.7.5 · 2 fixes · 1 feature
Apr 25, 2025

This release introduces new API endpoints for resource counting and addresses critical bugs related to agent deletion race conditions and Azure context window handling.

0.7.4 · 2 fixes · 1 feature
Apr 24, 2025

This release focuses on stability and correctness, fixing issues related to sleeptime input locking and LLM payload parameter passing, alongside an enhancement for local tool execution environments.

0.7.3
Apr 24, 2025

This release (0.7.3) is a maintenance chore release, likely containing minor internal updates or dependency bumps.

0.7.2
Apr 23, 2025

This release primarily consists of version bumps (0.7.1 to 0.7.2) managed through chore updates.

0.7.1 · 1 fix · 2 features
Apr 23, 2025

This release introduces support for reasoning tokens with Gemini Flash and fixes an issue with message filtering in sqlite3 conversation searches. It also includes the addition of a database compose file.

0.7.0
Apr 21, 2025

This release primarily consists of maintenance chores, including updating the issue template and releasing version 0.7.0.

0.6.54
Apr 19, 2025

This release (0.6.54) is a maintenance chore update, likely containing minor fixes or internal changes.

0.6.53
Apr 13, 2025

No release notes provided.

0.6.52
Apr 12, 2025

This release primarily consists of documentation updates and a version bump to 0.6.52.

0.6.51
Apr 11, 2025

This release (0.6.51) is a maintenance chore release, likely containing minor internal updates or dependency bumps.

0.6.50 · 2 fixes
Apr 9, 2025

This release includes a fix for setting sequence IDs in sqlite3 using event listeners and addresses issues found in summarization for Google models.

0.6.49
Apr 8, 2025

This release (0.6.49) is a maintenance chore release, likely containing minor internal updates or dependency bumps.

0.6.48 · 1 fix
Mar 31, 2025

This release includes a bug fix addressing incorrect parsing of system messages when using the Anthropic provider.

0.6.47
Mar 31, 2025

This release (0.6.47) appears to be a maintenance or chore release, likely containing minor internal updates or dependency bumps.

0.6.46 · 1 feature
Mar 30, 2025

Version 0.6.46 introduces an enhancement to conversation search functionality to include agent messages. This release also marks the first contribution from a new community member.

0.6.45
Mar 27, 2025

This release (0.6.45) is a maintenance chore release, likely containing minor updates or dependency bumps.

0.6.44 · Breaking · 7 fixes · 5 features
Mar 25, 2025

This release introduces logging for bedrock, updates to use AzureOpenAI for listings/completions, and addresses several API inconsistencies and bug fixes, including respecting configured embedding models.

0.6.43 · 2 fixes
Mar 18, 2025

Version 0.6.43 primarily focuses on bug fixes, including ensuring the parts list is never empty in a dictionary conversion and cleaning up optional dependencies.

0.6.42 · 1 fix · 1 feature
Mar 17, 2025

This release focuses on improving stability by fixing uvicorn worker issues and enhancing the logging system.

0.6.41 · 1 feature
Mar 14, 2025

This release integrates the OpenTelemetry collector directly into the letta image for enhanced observability.

0.6.40 · 2 features
Mar 14, 2025

This release includes fixes for archival statistics and Uvicorn environment variables, alongside updates to testing infrastructure.

0.6.39 · 3 fixes · 1 feature
Mar 13, 2025

This release introduces MCP support and includes several bug fixes, notably addressing issues with directory loading and Claude empty chunk warnings.

0.6.37 · 1 fix · 1 feature
Mar 6, 2025

This release includes various fixes and removes the length limitation for agent names.

0.6.36
Mar 5, 2025

This release bumps the version to 0.6.36.

0.6.35
Mar 3, 2025

This release bumps the version number from 0.6.34 to 0.6.35.

0.6.34 · 1 feature
Feb 27, 2025

This minor release bumps the version and increases the maximum token limit available in the system.

0.6.33 · 2 fixes · 2 features
Feb 26, 2025

This release introduces new type resolution capabilities and partial support for Claude 3 Sonnet, alongside fixes for Pydantic schema generation and argument handling.

0.6.32
Feb 22, 2025

This release primarily consists of dependency updates and a version bump to 0.6.31.

0.6.31
Feb 22, 2025

No release notes provided.

0.6.30 · 2 fixes
Feb 21, 2025

This release bumps the version to 0.6.30 and includes fixes related to making OpenTelemetry dependencies required and ensuring SQLite compatibility by changing jsonb to json.

0.6.29 · 1 feature
Feb 21, 2025

Version 0.6.29 introduces user identity features and updates the header for identity retrieval requests.

0.6.28 · 1 fix · 2 features
Feb 20, 2025

This release introduces support for the gpt-4o-mini context length mapping and integrates new providers, bedrock and deepseek, alongside various bug fixes.

0.6.27 · 1 fix · 1 feature
Feb 15, 2025

This release fixes an issue with VLLM usage and enhances the reliability of Vertex AI models by setting the connection mode to ANY.

0.6.26 · 2 features
Feb 14, 2025

This release introduces an example notebook for tool rules and applies a patch to the Google Vertex integration, bumping the overall version.

0.6.25 · 1 feature
Feb 13, 2025

This release introduces support for Vertex AI integration. The full details are available in the comparison link.

0.6.24 · 3 fixes · 1 feature
Feb 12, 2025

This release focuses on bug fixes across error handling, Google embeddings, and CLI operations, alongside improvements to the Anthropic integration.

Common Errors

ModuleNotFoundError · 6 reports

The "ModuleNotFoundError" in MemGPT typically indicates a missing Python package required by the application. To fix it, identify the missing module (e.g., asyncpg, mcp) and install it using pip: `pip install <missing_module>`. Ensure you're installing packages within the correct environment (e.g., venv) to avoid conflicts.
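The check-then-install advice above can be sketched in a few lines. `check_dependencies` is an illustrative helper, not part of MemGPT's API; `asyncpg` and `mcp` are simply the module names mentioned in the error reports:

```python
import importlib.util

def check_dependencies(modules):
    """Return the modules that are not importable, with an install hint.

    Uses find_spec so nothing is actually imported. The pip hint assumes
    the PyPI package name matches the module name, which is usually but
    not always true.
    """
    missing = [m for m in modules if importlib.util.find_spec(m) is None]
    for name in missing:
        print(f"Missing module '{name}': try `pip install {name}` "
              f"inside your active virtual environment")
    return missing

# Example: check for the packages commonly reported missing
check_dependencies(["asyncpg", "mcp"])
```

Running this inside the same environment that launches MemGPT (rather than the system interpreter) is the key point; a module installed globally but missing from the venv produces exactly this error.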

BadRequestError · 3 reports

BadRequestError in MemGPT usually arises from a malformed API request to the LLM provider, such as exceeding token limits, incorrectly formatting the request body, or passing invalid arguments to a function call. To fix this, inspect the request being sent to the LLM, ensure it adheres to the provider's API specification (including token limits and data types), and validate that tool call arguments are correctly formatted and complete before submission. If using a proxy, confirm it is configured correctly and forwards requests without modifications that cause errors.
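A minimal pre-flight validation sketch, assuming OpenAI-style message lists. `validate_request`, the 8192-token limit, and the 4-characters-per-token estimate are all illustrative assumptions, not MemGPT internals; production code should use the provider's actual tokenizer:

```python
def validate_request(messages, max_context_tokens=8192):
    """Collect problems with a chat request before sending it."""
    errors = []
    for i, msg in enumerate(messages):
        if "role" not in msg or "content" not in msg:
            errors.append(f"message {i} is missing 'role' or 'content'")
        elif not isinstance(msg["content"], str):
            errors.append(f"message {i} 'content' must be a string")
    # Rough token estimate: ~4 characters per token (heuristic only)
    est_tokens = sum(
        len(m["content"]) for m in messages if isinstance(m.get("content"), str)
    ) // 4
    if est_tokens > max_context_tokens:
        errors.append(
            f"estimated {est_tokens} tokens exceeds limit {max_context_tokens}"
        )
    return errors

problems = validate_request([{"role": "user", "content": "hello"}])
```

Catching these problems client-side turns an opaque provider-side 400 into a specific, actionable message.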

NameError · 2 reports

NameError usually arises when a variable or function is used without being defined in the current scope. To fix this, ensure the variable or function is defined (imported or declared) before its usage. Alternatively, check for typos or incorrect capitalization in the variable or function name and correct them to match the definition.
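Both failure modes, and the fix, fit in a few lines (purely illustrative code, not from MemGPT):

```python
def broken():
    # Typo: 'respnse' was never defined, so calling this raises NameError
    return respnse  # noqa: F821

def fixed():
    response = "ok"  # define the name before use, spelled consistently
    return response

try:
    broken()
except NameError as exc:
    print(f"caught: {exc}")

print(fixed())
```

In MemGPT the same pattern shows up in custom tool source code: any helper or module a tool references must be defined or imported inside the tool's own scope.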

HandleNotFoundError · 2 reports

The "HandleNotFoundError" in MemGPT often arises when the requested resource (like a specific LLM model) isn't available in the current provider's supported options or hasn't been correctly configured. To resolve this, verify that the model name in your `config.json` or agent creation parameters is supported by your chosen provider (e.g., OpenAI, Azure), and that you've properly configured required API keys or authentication details for that provider. Update your configuration with a valid model name and working credentials, or switch to a provider that supports your desired model.
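The verification step can be sketched as a lookup against a provider's supported models. The `SUPPORTED` mapping, the model names, and `resolve_handle` are illustrative assumptions, not Letta's actual catalog or API:

```python
# Hypothetical provider -> supported-models catalog
SUPPORTED = {
    "openai": {"gpt-4o", "gpt-4o-mini"},
    "anthropic": {"claude-3-5-sonnet"},
}

def resolve_handle(provider, model):
    """Validate a provider/model pair before agent creation."""
    models = SUPPORTED.get(provider)
    if models is None:
        raise ValueError(f"unknown provider '{provider}'")
    if model not in models:
        raise ValueError(
            f"model '{model}' not supported by '{provider}'; "
            f"choose one of {sorted(models)}"
        )
    return f"{provider}/{model}"

handle = resolve_handle("openai", "gpt-4o")
```

Failing fast with the list of valid alternatives is usually more helpful than letting the handle lookup fail deep inside agent creation.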

AttributeError · 2 reports

AttributeError usually indicates that you're trying to access a non-existent attribute or method of an object. This often happens due to typos in attribute names, incorrect object types, or outdated library versions. To fix it, double-check the attribute name and object type, ensure the library is up to date, and verify the attribute exists in the object's class definition or parent classes.
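A common debugging pattern is to check what the object actually exposes before calling, and list candidates to spot the typo. The `Agent` class here is a stand-in for illustration, not a MemGPT class:

```python
class Agent:
    def send(self, text):
        return f"sent: {text}"

agent = Agent()

# A typo'd call would raise:
#   agent.snd("hi")  ->  AttributeError: 'Agent' object has no attribute 'snd'

if hasattr(agent, "send"):
    print(agent.send("hi"))
else:
    # Listing public attributes often reveals the intended name
    candidates = [n for n in dir(agent) if not n.startswith("_")]
    print(f"no such method; available: {candidates}")
```

When the attribute genuinely used to exist, comparing your installed version against the changelog (as in the release history above) is the fastest way to confirm an API rename.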

UniqueViolationError · 2 reports

UniqueViolationError usually arises when attempting to insert data with a primary key or unique constraint that already exists in the database. To fix it, either update the existing record instead of inserting a new one if the data represents an update, or ensure your application logic prevents duplicate insertions by checking for existing records before creating new ones. Consider using `upsert` functionality provided by your database ORM if updating behavior is intended; otherwise, implement a `get_or_create` or similar pattern.
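The upsert pattern can be shown with sqlite3's `ON CONFLICT` clause (supported since SQLite 3.24); the `agents` table schema is illustrative, not MemGPT's actual database layout:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE agents (name TEXT PRIMARY KEY, model TEXT)")

def upsert_agent(conn, name, model):
    """Insert a row, or update the existing one instead of violating the key."""
    conn.execute(
        "INSERT INTO agents (name, model) VALUES (?, ?) "
        "ON CONFLICT(name) DO UPDATE SET model = excluded.model",
        (name, model),
    )

upsert_agent(conn, "helper", "gpt-4o")
upsert_agent(conn, "helper", "claude-3-5-sonnet")  # updates instead of erroring
row = conn.execute(
    "SELECT model FROM agents WHERE name = ?", ("helper",)
).fetchone()
print(row[0])
```

A plain `INSERT` on the second call would raise the unique-constraint error; the `DO UPDATE` clause converts the duplicate insert into the intended update in a single statement.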
