CrewAI
AI & LLMs: Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
Release History
1.8.0 (3 fixes, 7 features): This release introduces a production-ready Flows and Crews architecture, featuring native async agent-to-agent communication and human-in-the-loop feedback mechanisms.
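For orientation, here is a minimal sketch of the core Crews API that these releases build on (Agent, Task, Crew, and kickoff); it is illustrative rather than tied to any 1.8.0-specific feature, and it assumes the LLM and API key are configured via environment variables.

```python
from crewai import Agent, Task, Crew

# A single-agent, single-task crew; the model and credentials are assumed
# to come from the environment (e.g. OPENAI_API_KEY).
researcher = Agent(
    role="Researcher",
    goal="Summarize recent developments on a given topic",
    backstory="A concise analyst who checks sources before writing.",
)

summary_task = Task(
    description="Summarize the latest developments in {topic}.",
    expected_output="A short bullet-point summary.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[summary_task])
result = crew.kickoff(inputs={"topic": "AI agent frameworks"})
print(result)
```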
1.7.2 (1 fix): This release focuses on resolving connection issues and updating the API reference documentation.
1.7.1 (6 fixes, 5 features): This release introduces a --no-commit flag for the bump command, implements JSON schema for tool serialization, and provides several bug fixes related to async task execution and platform compatibility.
1.7.0 (7 fixes, 10 features): This release introduces comprehensive native async support across the entire framework, including flows, agents, tools, and memory. It also includes critical bug fixes for OpenTelemetry integration, token store deadlocks, and Anthropic tool support.
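As a rough illustration of running a crew asynchronously, the sketch below uses Crew.kickoff_async, which predates this release; the wider native-async surface described above (flows, tools, memory) is not shown here.

```python
import asyncio
from crewai import Agent, Task, Crew

async def main() -> None:
    agent = Agent(
        role="Summarizer",
        goal="Summarize a topic in two sentences",
        backstory="A terse technical writer.",
    )
    task = Task(
        description="Summarize {topic} in two sentences.",
        expected_output="Exactly two sentences.",
        agent=agent,
    )
    crew = Crew(agents=[agent], tasks=[task])
    # kickoff_async awaits the crew run instead of blocking the event loop.
    result = await crew.kickoff_async(inputs={"topic": "native async support"})
    print(result.raw)

asyncio.run(main())
```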
1.6.1 (5 fixes, 1 feature): This release focuses on stability improvements across the LLM and RagTool modules, including fixes for async execution and configuration resetting.
1.6.0 (5 fixes, 5 features): This release introduces streaming support for flows and crews, adds Gemini 3 Pro Preview support, and includes several bug fixes for RAG tools and OpenAI parameter handling.
0.203.2 (1 fix): This is a hotfix release updating the crewAI version from 0.203.1 to 0.203.2.
1.5.0 (3 fixes, 5 features): This release introduces LLM call hooks for CrewAgentExecutor, expands data exposure in Task and Agent outputs, and improves tracing instrumentation and documentation.
1.4.1 (2 fixes): This release focuses on bug fixes for agent iteration handling and LLM provider routing logic.
1.4.0 (Breaking; 7 fixes, 6 features): This release introduces first-class MCP support and LLM message interceptor hooks, while refactoring embedding configurations and fixing critical flow state and caching issues.
1.3.0 (Breaking; 2 fixes, 3 features): This release focuses on refactoring flow handling and tool documentation, specifically renaming embedding parameters and enhancing the Qdrant and Firecrawl tools.
1.2.1 (2 features): This release introduces Datadog integration support and enables apps and MCPs within liteagent, alongside updated documentation for platform tool environment variables.
1.2.0 (Breaking; 4 fixes): This release focuses on maintenance by updating default LLM models, improving logging, and refining flow visualization, while also removing the unused aisuite dependency.
1.1.0 (3 fixes, 3 features): This release introduces support for multiple LLM providers in InternalInstructor, adds a mypy plugin base, and improves typing for CrewBase alongside several bug fixes.
1.0.0 (7 fixes, 3 features): CrewAI reaches version 1.0.0, introducing enhanced guardrail handling for Agents and credential injection for tools, alongside several stability fixes for Flows and Docker environments.
1.0.0b3 (3 fixes, 6 features): This release focuses on enhancing LLM completion classes (Bedrock, Gemini, Anthropic) with better parameter support and refactoring the project module to a metaclass for improved typing.
1.0.0b2 (3 fixes, 3 features): This release focuses on enhancing the OpenAICompletion class, improving event bus thread safety, and fixing critical bugs related to input availability and JWT decoding.
1.0.0b1 (4 fixes, 3 features): This release introduces Bedrock LLM integration and enhances OpenAICompletion parameters while improving event bus thread safety. It also includes several bug fixes related to JWT decoding, input availability, and task handling.
0.203.1 (2 fixes, 1 feature): This release focuses on core improvements, including a fix for tool repository credential injection and enhanced JWT validation with a 10-second leeway.
1.0.0a4 (6 fixes, 4 features): This release focuses on enhancing Agent event handling and local development triggers, alongside significant internal cleanup including logging improvements and Docker path resolution.
0.203.0 (3 fixes, 6 features): This release focuses on enhancing the Agent class with better knowledge and guardrail handling, improving CI reliability through scheduled cache rebuilds, and significantly expanding documentation for tracing, HITL, and AMP.
1.0.0a3 (5 fixes, 3 features): Release 1.0.0a3 focuses on improving the CI/CD pipeline, specifically enhancing the PyPI publishing process and adding base development tooling.
1.0.0a2 (2 fixes, 4 features): This alpha release introduces tracing support, Braintrust integration, and improved schema parsing for complex JSON properties. It also adds mandatory environment variable validation for BrightData and various documentation updates.
1.0.0a1 (Breaking; 1 fix, 3 features): This alpha release transitions the project to a monorepo structure, updates the required Python version to 3.13, and introduces new configuration attributes to the Agent class.
0.201.1 (Breaking; 3 features): This release renames the Watson embedding provider to Watsonx, standardizes environment variable prefixes, and adds ChromaDB support for Watsonx and VoyageAI.
0.201.0 (Breaking; 5 fixes, 6 features): This release upgrades the core engine to Pydantic v2, introduces a new CLI wrapper for 'uv', and enhances embedder flexibility with batch processing and custom provider support.
0.193.2 (1 fix): This release updates the pyproject templates to ensure the correct versioning is applied.
0.193.1 (2 fixes): This release includes a series of minor bug fixes and improvements to the project's linting configuration.
0.193.0 (Breaking; 4 fixes, 8 features): This release introduces thread-safe context management and unified RAG storage while fixing critical initialization bugs in the OpenAI adapter and optimizing Mem0 metadata storage.
0.186.1 (1 fix): This release updates CrewAI to version 0.186.1, addressing a bug where version detection could silently fail and updating CLI dependencies.
0.186.0 (Breaking; 3 fixes, 8 features): This release introduces RAG configuration enhancements, support for Qdrant and ChromaDB generic clients, and flow resumability. It also includes significant internal refactoring, typing modernization, and the deprecation of Task.max_retries.
0.177.0 (Breaking; 4 fixes, 5 features): This release focuses on core improvements including RAG package parity, centralized event handling under crewai.events, and critical bug fixes for mutable arguments and Pydantic warnings.
0.175.0 (Breaking; 5 fixes, 9 features): This release introduces a new RAG configuration system, Qdrant support, and improved Flow resumability while deprecating Task.max_retries. It also resolves import issues by updating the minimum OpenAI version requirement.
0.165.1 (Breaking; 4 fixes, 4 features): This release introduces Mem0 agent-linked memory and ExternalMemory metadata enhancements while removing the deprecated AgentOps integration. It also includes critical fixes for XMLSearchTool and Chroma storage handling.
0.165.0 (Breaking; 4 fixes, 4 features): This release introduces enhanced memory capabilities with Mem0 and ExternalMemory, stabilizes CI through telemetry mocking, and removes the deprecated AgentOps integration. It also includes critical fixes for XMLSearchTool and Chroma storage handling.
0.159.0 (4 fixes, 6 features): This release introduces an enterprise configuration CLI command and partial flow resumability, alongside performance improvements for LLM message formatting and expanded multi-language documentation.
0.157.0 (Breaking; 2 fixes, 7 features): This release introduces a new CLI config command group, initial tracing capabilities, and LangDB integration while removing the deprecated User Memory system. It also includes performance optimizations and improved CLI error reporting.
0.152.0 (Breaking; 3 fixes, 3 features): This release refactors RAG components into a top-level module, replaces the signup command with a login command, and enhances the Flow class with custom naming support.
0.150.0 (Breaking; 6 fixes, 6 features): This release introduces Bedrock agent toolkits and the SerperScrapeWebsiteTool while upgrading Mem0 Storage to v2. It also includes critical fixes for Chroma initialization and Ollama message handling, and removes legacy SQLite workarounds.
0.148.0 (5 fixes, 5 features): This release introduces comprehensive Agent evaluation tools and neatlogs integration, alongside critical fixes for agent knowledge handling and Task parameter configurations.
0.141.0 (3 features): This release introduces crew context tracking for LLM guardrails and expands documentation for Agent repository usage and the kickoff method.
0.140.0 (5 fixes, 6 features): This release introduces LLM call tracking, memory usage monitoring via MemoryEvents, and improved training support for 7B parameter models, alongside various CLI and RAG storage fixes.
0.134.0 (6 fixes, 5 features): This release introduces official MCP Tools support within CrewBase and adds Oxylabs Web Scraping tools. It also includes critical fixes for Pydantic 2.7.x compatibility and SSL errors during LLM data retrieval.
0.130.0 (5 fixes, 5 features): This release introduces LiteAgent with Guardrail integration, async tool execution, and support for Python 3.13. It also includes several core fixes for usage metrics, telemetry, and CLI multi-org actions.
0.126.0 (2 fixes, 8 features): This release introduces Python 3.13 support, enhances MCP integration with streamable-http transport, and improves tool management and transparency. It also includes significant documentation restructuring and fixes for async execution and knowledge sources.
0.121.1: No release notes provided.
0.121.0 (4 fixes, 5 features): This release introduces new attributes to the Task and Agent classes, including reasoning and automatic date injection, while fixing tool encoding errors and improving documentation for MCP and StagehandTool.
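A hedged sketch of how the new Task and Agent attributes might be used; the parameter names reasoning, inject_date, and date_format are assumptions based on the summary above, not verified against the 0.121.0 API.

```python
from crewai import Agent, Task

planner = Agent(
    role="Planner",
    goal="Plan the work before executing it",
    backstory="A methodical planner.",
    reasoning=True,  # assumed name: the agent drafts a plan before acting
)

status_report = Task(
    description="Write a dated status report for {project}.",
    expected_output="A short report that references today's date.",
    agent=planner,
    inject_date=True,        # assumed name: inject the current date into the prompt
    date_format="%Y-%m-%d",  # assumed name: format used for the injected date
)
```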
0.120.1 (1 fix): This release includes a bug fix for interpolation logic to correctly handle hyphens.
0.120.0 (3 fixes, 5 features): This release introduces the ability to load Agents from repositories and direct knowledge initialization, while fixing critical race conditions in FilteredStream and agent reset logic.
0.119.0 (5 fixes, 3 features): This release focuses on stability with fixes for memory crashes and telemetry, while introducing knowledge retrieval prompt re-writing and parent flow identification for Crew and LiteAgent.
0.118.0 (Breaking; 3 fixes, 1 feature): This release renames TaskGuardrail to LLMGuardrail, introduces no-code Guardrail creation, and fixes several core issues including template handling and logging overrides.
0.117.1 (1 fix): This release focuses on dependency maintenance by upgrading crewai-tools and liteLLM, alongside a fix for Mem0 OSS.
0.117.0 (9 fixes, 5 features): This release introduces support for new LLMs including Gemini 2.5 Pro and GPT-4.1, adds a result_as_answer parameter to tools, and provides significant bug fixes for memory management and async flows.
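As a sketch of the result_as_answer flag mentioned above, a custom tool can be marked so that its raw output is returned as the agent's final answer instead of being rewritten by the LLM; treat the exact wiring as illustrative.

```python
from crewai import Agent
from crewai.tools import BaseTool

class LookupTool(BaseTool):
    name: str = "lookup"
    description: str = "Returns a canned answer for a query."

    def _run(self, query: str) -> str:
        return f"Canned answer for: {query}"

# result_as_answer makes the tool's raw output the final answer,
# skipping a closing LLM rewrite of the result.
lookup = LookupTool(result_as_answer=True)

agent = Agent(
    role="Lookup assistant",
    goal="Answer questions using the lookup tool",
    backstory="Prefers exact tool output over paraphrase.",
    tools=[lookup],
)
```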
0.114.0 (6 fixes, 9 features): This release introduces Agents as atomic units with direct kickoff capabilities, adds support for custom LLMs and Opik observability, and improves core performance and serialization.
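A minimal sketch of kicking off an Agent directly, without wrapping it in a Crew, as introduced here; the return value is printed as-is rather than assuming a particular output type.

```python
from crewai import Agent

assistant = Agent(
    role="Assistant",
    goal="Answer questions directly without a crew",
    backstory="A standalone helper.",
)

# Since 0.114.0 an Agent can be kicked off on its own.
result = assistant.kickoff("In one sentence, what is an autonomous agent?")
print(result)
```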
0.108.0 (3 fixes, 5 features): This release focuses on enhancing the LLM event system and streaming responses, while improving logging visualization and fixing Mistral model compatibility.
0.105.0 (5 fixes, 6 features): This release introduces Flow state export, event emitters for observability, and support for Python 3.10 and o3-mini models. It also includes significant bug fixes for memory management, async flows, and template variables.
0.102.0 (7 fixes, 7 features): This release introduces advanced knowledge management features, including multi-tab Excel support and a new Qdrant tool, while fixing critical stability issues in agent cloning, memory handling, and task callbacks.
0.100.0 (3 fixes, 3 features): This release introduces SageMaker as an LLM provider and integrates Composio documentation, while addressing general LLM connection stability and training safety.
0.98.0 (Breaking; 9 fixes, 6 features): This release introduces Conversation Crew v1, flow state persistence via the @persist decorator, and several new integrations including SambaNova, NVIDIA NIM, and VoyageAI. It also includes critical fixes for tool input handling and Pydantic model nesting.
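A minimal sketch of flow state persistence with the @persist decorator mentioned above, assuming the decorator is importable from crewai.flow.persistence and that class-level use saves the flow's state after each step (by default to a local SQLite store).

```python
from pydantic import BaseModel
from crewai.flow.flow import Flow, listen, start
from crewai.flow.persistence import persist  # assumed import path

class CounterState(BaseModel):
    count: int = 0

@persist()  # class-level: state is saved after each step so the flow can resume
class CounterFlow(Flow[CounterState]):
    @start()
    def bump(self):
        self.state.count += 1

    @listen(bump)
    def report(self):
        return f"count is now {self.state.count}"

flow = CounterFlow()
print(flow.kickoff())
```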