Changelog

Cline tools


Updates related to the tools component of Cline.

7 releases · 27 features · 42 bug fixes

All TOOLS Features

  • Added support for skills and an optional modelId in subagent configuration (v3.67.0)
  • Introduced AgentConfigLoader for loading agent configurations from files (v3.67.0)
  • Enabled Responses API support for the OpenAI native provider (v3.67.0)
  • Added a /q command to quit the Command Line Interface (CLI) (v3.67.0)
  • Added a dynamic flag to adjust banner cache duration (v3.67.0)
  • Introduced the native `use_subagents` tool to replace legacy subagents (v3.58.0)
  • Enabled bundling of `endpoints.json` so packaged distributions can ship required endpoints (v3.58.0)
  • Added support for parallel tool calling with Amazon Bedrock (v3.58.0)
  • Introduced an experimental "double-check completion" feature to verify work before marking tasks complete (v3.58.0)
  • Added new task controls/flags to the CLI, including a custom `--thinking` token budget and `--max-consecutive-mistakes` for yolo runs (v3.58.0)
  • Added new UI options (including connection/test buttons) and support for syncing deletion of remotely configured MCP servers via remote config (v3.58.0)
  • Enabled 1M-context model options for Claude Opus 4.6 in Vertex / Claude Code (v3.58.0)
  • Added support for ZAI/GLM: GLM-5 (v3.58.0)
  • Introduced Cline CLI 2.0, installable via `npm install -g cline` (v3.57.0)
  • Enabled access to the Anthropic Opus 4.6 model (v3.57.0)
  • Enabled access to the Minimax-2.1 and Kimi-k2.5 models for a limited-time free promotion (v3.57.0)
  • Enabled access to the Codex-5.3 model for ChatGPT subscribers (v3.57.0)
  • Added support for the gpt-5.2-codex OpenAI model (v3.50.0)
  • Introduced a new create-pull-request skill for automated PR generation (v3.50.0)
  • Enabled phased rollout of Responses API usage rather than enabling it by default for every supported model (v3.49.1)
  • Added experimental support for Background Edits, allowing file editing without opening the diff view (v3.47.0)
  • Updated the free model to MiniMax M2.1, replacing MiniMax M2 (v3.47.0)
  • Added support for Azure-based identity authentication in the OpenAI Compatible provider and Azure OpenAI (v3.47.0)
  • Added the `supportsReasoning` property to Baseten models (v3.47.0)
  • Added support for the new GLM 4.7 model (v3.46.0)
  • Introduced the Apply Patch tool for GPT-5+ models, replacing previous diff edit tools (v3.46.0)
  • Enhanced background terminal execution with command tracking, log-file output, zombie-process prevention (10-minute timeout), and clickable log paths in the UI (v3.46.0)
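File-based agent configuration, as in the AgentConfigLoader item above, could look roughly like the following sketch. The config shape and function names here are assumptions for illustration, not Cline's actual API:

```typescript
// Hypothetical sketch: load a subagent configuration from a JSON file,
// with an optional `modelId` and a list of enabled skills.
import { readFileSync } from "node:fs";

interface AgentConfig {
  name: string;
  modelId?: string; // optional: fall back to a default model when absent
  skills: string[]; // skills enabled for this subagent
}

function parseAgentConfig(json: string): AgentConfig {
  const raw = JSON.parse(json);
  if (typeof raw.name !== "string") {
    throw new Error("agent config requires a 'name' string");
  }
  return {
    name: raw.name,
    modelId: typeof raw.modelId === "string" ? raw.modelId : undefined,
    skills: Array.isArray(raw.skills) ? raw.skills.map(String) : [],
  };
}

function loadAgentConfig(path: string): AgentConfig {
  return parseAgentConfig(readFileSync(path, "utf8"));
}
```

Keeping `modelId` optional lets a single config file work across model changes, with the runtime supplying a default when the field is absent.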

All TOOLS Bug Fixes

  • Fixed a crash related to reasoning deltas when processing usage-only stream chunks (v3.67.0)
  • Corrected an issue where OpenAI tool ID transformation was incorrectly restricted to the native provider (v3.67.0)
  • Fixed an authentication check failure when operating in ACP mode (v3.67.0)
  • Resolved an issue where the CLI yolo setting was incorrectly persisted to disk (v3.67.0)
  • Fixed the inline focus-chain slider to operate correctly within its feature row (v3.67.0)
  • Ensured compatibility with Gemini 3.1 Pro (v3.67.0)
  • Corrected authentication issues when using the Cline CLI with the ACP flag (v3.67.0)
  • Fixed the CLI to correctly handle stdin redirection in CI/headless environments (v3.58.0)
  • Fixed the CLI to preserve OAuth callback paths during auth redirects (v3.58.0)
  • Fixed OAuth callback reliability in VS Code Web by generating callback URLs via `vscode.env.asExternalUri` (v3.58.0)
  • Fixed the terminal to surface command exit codes in results and improved long-running `execute_command` timeout behavior (v3.58.0)
  • Fixed UI rendering by adding a loading indicator and correcting `api_req_started` rendering (v3.58.0)
  • Prevented duplicate streamed text rows after task completion during streaming (v3.58.0)
  • Fixed the API to preserve the selected Vercel model when model metadata is missing (v3.58.0)
  • Ensured telemetry flushes on shutdown and routed PostHog networking through proxy-aware shared fetch (v3.58.0)
  • Increased the Windows E2E test timeout to reduce flakiness in CI (v3.58.0)
  • Fixed an issue where the read file tool failed to read large files (v3.57.0)
  • Resolved a crash when entering decimal input in OpenAI Compatible price fields (#8129) (v3.57.0)
  • Corrected build-complete handlers when updating the API configuration (v3.57.0)
  • Fixed a provider missing from the model list (v3.57.0)
  • Fixed an issue where the Favorite Icon/Star was clipped in the task history view (v3.57.0)
  • Fixed an issue where remotely configured providers were not being selected correctly (v3.50.0)
  • Resolved a bug where act_mode_respond could result in consecutive, unnecessary calls (v3.50.0)
  • Fixed incorrect tool call IDs when switching between different model formats (v3.50.0)
  • Made workflow slash command search case-insensitive (v3.49.1)
  • Fixed incorrect model display in the ModelPickerModal when using LiteLLM (v3.49.1)
  • Fixed LiteLLM model fetching when using the default base URL (v3.49.1)
  • Fixed a crash that occurred when OpenAI-compatible APIs sent usage chunks with empty or null choices arrays at the end of streaming (v3.49.1)
  • Fixed an incorrect model ID for the Kat Coder Pro Free model (v3.49.1)
  • Fixed an issue where expired tokens were used in authenticated requests (v3.47.0)
  • Fixed the diff view to correctly exclude binary files that lack file extensions (v3.47.0)
  • Fixed an issue where file endings and trailing newlines were not preserved (v3.47.0)
  • Fixed rate-limiting issues specific to the Cerebras provider (v3.47.0)
  • Fixed Auto Compact functionality for the Claude Code provider (v3.47.0)
  • Made the Workspace and Favorites history filters operate independently (v3.47.0)
  • Fixed remote MCP server connection failures, specifically handling 404 responses (v3.47.0)
  • Disabled native tool calling for the DeepSeek 3.2 speciale model (v3.47.0)
  • Fixed the Baseten model selector interface (v3.47.0)
  • Fixed duplicate error messages during streaming for the Diff Edit tool when Parallel Tool Calling was disabled (v3.46.0)
  • Corrected typos in the Gemini system prompt overrides (v3.46.0)
  • Resolved issues with model picker favorites ordering, star toggle functionality, and keyboard navigation for the OpenRouter and Vercel AI Gateway providers (v3.46.0)
  • Fixed an issue where remote configuration values were not being fetched from the cache (v3.46.0)
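The case-insensitive slash-command search fix (v3.49.1) boils down to normalizing both sides of the match before comparing. A minimal sketch, with illustrative command names rather than Cline's real workflow list:

```typescript
// Sketch of case-insensitive slash-command matching: lowercase both the
// query and each command name so "/NEW" still finds "/newtask".
const workflowCommands = ["/newtask", "/smol", "/reportbug"];

function searchCommands(commands: string[], query: string): string[] {
  const q = query.toLowerCase();
  return commands.filter((cmd) => cmd.toLowerCase().includes(q));
}
```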

Releases with TOOLS Changes

v3.67.0 (5 features, 7 fixes)
Feb 24, 2026

This release introduces significant configuration flexibility by adding support for skills and file-based agent configurations via the new AgentConfigLoader. Key fixes address stability issues, including a reasoning delta crash and compatibility problems with Gemini 3.1 Pro. Users will also benefit from reduced response latency due to websocket preconnection.

v3.58.0 (8 features, 9 fixes)
Feb 12, 2026

This release introduces significant new capabilities, including native subagent support, parallel tool calling for Amazon Bedrock, and an experimental double-check feature for task verification. Several critical bugs were fixed across the CLI, VS Code Web, and Terminal, alongside UX improvements like moving reasoning effort into model settings.
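The experimental double-check feature mentioned above amounts to a verify-then-rework loop: run a verification pass before marking a task complete, and retry on failure. The function shapes below are illustrative assumptions, not Cline's internals:

```typescript
// Sketch of a "double-check completion" pattern: verify the result before
// accepting it, and rework up to maxRetries times if verification fails.
interface TaskResult {
  output: string;
}

async function completeWithDoubleCheck(
  attempt: () => Promise<TaskResult>,
  verify: (r: TaskResult) => Promise<boolean>,
  maxRetries = 2,
): Promise<TaskResult> {
  let result = await attempt();
  for (let i = 0; i < maxRetries && !(await verify(result)); i++) {
    result = await attempt(); // rework, then re-verify on the next loop check
  }
  return result;
}
```

Bounding the retries keeps a failing verifier from looping forever, which matters when each attempt is a model call.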

v3.57.0 (4 features, 5 fixes)
Feb 5, 2026

This release introduces Cline CLI 2.0 and expands model availability, adding Anthropic Opus 4.6, Minimax-2.1, Kimi-k2.5, and Codex-5.3. Several critical bugs were fixed, including issues with reading large files and crashes in price input fields, alongside a UX change making skills permanently enabled.

v3.50.0 (2 features, 3 fixes)
Jan 14, 2026

This release introduces significant new capabilities, including support for the gpt-5.2-codex model and a new create-pull-request skill. Several critical bugs related to provider selection and tool call handling have also been resolved.

v3.49.1 (1 feature, 5 fixes)
Jan 13, 2026

This release focuses primarily on stability and compatibility fixes, addressing issues with model selection and streaming API responses, and improving search for workflow slash commands. It also introduces a phased rollout of Responses API usage.

v3.47.0 (4 features, 9 fixes)
Jan 6, 2026

This release introduces experimental Background Edits for non-blocking file modifications and updates the free model to MiniMax M2.1. Several stability fixes address issues with token usage, provider rate limiting (Cerebras), and connection handling for remote MCP servers.

v3.46.0 (3 features, 4 fixes)
Dec 22, 2025

This release introduces the new GLM 4.7 model and the Apply Patch tool for GPT-5+ models. We have significantly enhanced background terminal execution capabilities and resolved several bugs related to model picker navigation and error message duplication.
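The zombie-process prevention described for background terminal execution amounts to capping background work with a hard timeout (10 minutes in the release notes). A minimal sketch, with a shortened timeout for demonstration and an illustrative kill callback:

```typescript
// Sketch of a timeout guard for background work: if the promise does not
// settle within `ms` milliseconds, invoke a cleanup callback (e.g. kill the
// child process) and reject, so orphaned work cannot linger indefinitely.
function withTimeout<T>(
  work: Promise<T>,
  ms: number,
  onTimeout: () => void,
): Promise<T> {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => {
      onTimeout(); // e.g. terminate the background process group
      reject(new Error(`timed out after ${ms}ms`));
    }, ms);
    work.then(
      (v) => { clearTimeout(timer); resolve(v); },
      (e) => { clearTimeout(timer); reject(e); },
    );
  });
}
```

Clearing the timer on both success and failure paths ensures the cleanup callback only fires for genuinely stuck work.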

Documentation

Read the tools documentation