Changelog

Cline terminal

Component

Updates related to the terminal component of Cline.

11 releases · 18 features · 28 bug fixes

All TERMINAL Features

  • Added support for skills and an optional modelId in subagent configuration (v3.67.0)
  • Introduced AgentConfigLoader for loading agent configurations from files (v3.67.0)
  • Enabled Responses API support for the OpenAI native provider (v3.67.0)
  • Added a /q command to quit the CLI (v3.67.0)
  • Added a dynamic flag to adjust banner cache duration (v3.67.0)
  • Added the /skills slash command to the CLI for viewing and managing installed skills (v3.65.0)
  • Introduced the native `use_subagents` tool to replace legacy subagents (v3.58.0)
  • Enabled bundling of `endpoints.json` so packaged distributions can ship required endpoints (v3.58.0)
  • Added support for parallel tool calling with Amazon Bedrock (v3.58.0)
  • Introduced an experimental "double-check completion" feature to verify work before marking tasks complete (v3.58.0)
  • Added new task controls/flags to the CLI, including a custom `--thinking` token budget and `--max-consecutive-mistakes` for yolo runs (v3.58.0)
  • Added new UI options (including connection/test buttons) and support for syncing deletion of remotely configured MCP servers via remote config (v3.58.0)
  • Enabled 1M-context model options for Claude Opus 4.6 in Vertex / Claude Code (v3.58.0)
  • Added support for ZAI/GLM: GLM-5 (v3.58.0)
  • Enabled a phased rollout of Responses API usage instead of defaulting for every supported model (v3.49.1)
  • Added support for the new GLM 4.7 model (v3.46.0)
  • Introduced the Apply Patch tool for GPT-5+ models, replacing previous diff edit tools (v3.46.0)
  • Enhanced background terminal execution with command tracking, log-file output, zombie-process prevention (10-minute timeout), and clickable log paths in the UI (v3.46.0)
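The background-execution item above combines three mechanics: redirecting a command's output to a log file, tracking the process handle, and killing the process if it outlives a hard timeout. The following is a minimal sketch of that pattern, not Cline's actual implementation; the function name `run_background` and the log-path handling are illustrative.

```python
import subprocess
import threading
from pathlib import Path

ZOMBIE_TIMEOUT_SECS = 10 * 60  # mirrors the 10-minute cap described above

def run_background(command: str, log_path: Path,
                   timeout: float = ZOMBIE_TIMEOUT_SECS) -> subprocess.Popen:
    """Start a shell command in the background, stream its output to a log
    file, and kill it if it outlives the timeout (zombie prevention)."""
    log_file = open(log_path, "w")
    proc = subprocess.Popen(command, shell=True,
                            stdout=log_file, stderr=subprocess.STDOUT)

    def watchdog() -> None:
        try:
            proc.wait(timeout=timeout)
        except subprocess.TimeoutExpired:
            proc.kill()  # reap the runaway process instead of leaking it
        finally:
            log_file.close()

    # Daemon thread so a stuck watchdog never blocks interpreter shutdown.
    threading.Thread(target=watchdog, daemon=True).start()
    return proc
```

Because the child writes directly to the log file descriptor, output survives even if the watchdog later kills the process, which is what makes the log path safe to surface as a clickable link.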

All TERMINAL Bug Fixes

  • Fixed a crash in reasoning-delta handling when processing usage-only stream chunks (v3.67.0)
  • Corrected an issue where OpenAI tool-ID transformation was incorrectly restricted to the native provider (v3.67.0)
  • Fixed an authentication check failure when operating in ACP mode (v3.67.0)
  • Resolved an issue where the CLI yolo setting was incorrectly persisted to disk (v3.67.0)
  • Fixed the inline focus-chain slider to operate correctly within its feature row (v3.67.0)
  • Ensured compatibility with Gemini 3.1 Pro (v3.67.0)
  • Corrected authentication issues when using the Cline CLI with the ACP flag (v3.67.0)
  • Fixed aggressive context compaction caused by accidental clicks on the context-window progress bar that silently set a very low auto-condense threshold (v3.65.0)
  • Fixed an infinite retry loop that occurred when write_to_file failed due to a missing content parameter (v3.65.0)
  • Fixed the default Claude model selection (v3.65.0)
  • Fixed the CLI to correctly handle stdin redirection in CI/headless environments (v3.58.0)
  • Fixed the CLI to preserve OAuth callback paths during auth redirects (v3.58.0)
  • Fixed OAuth callback reliability in VS Code Web by generating callback URLs via `vscode.env.asExternalUri` (v3.58.0)
  • Fixed the terminal to surface command exit codes in results and improved timeout behavior for long-running `execute_command` calls (v3.58.0)
  • Fixed UI rendering by adding a loading indicator and correcting `api_req_started` rendering (v3.58.0)
  • Prevented duplicate streamed text rows after task completion during streaming (v3.58.0)
  • Fixed the API to preserve the selected Vercel model when model metadata is missing (v3.58.0)
  • Ensured telemetry flushes on shutdown and routed PostHog networking through the proxy-aware shared fetch (v3.58.0)
  • Increased the Windows E2E test timeout to reduce flakiness in CI (v3.58.0)
  • Fixed workflow slash-command search to be case-insensitive (v3.49.1)
  • Fixed incorrect model display in the ModelPickerModal when using LiteLLM (v3.49.1)
  • Fixed LiteLLM model fetching when using the default base URL (v3.49.1)
  • Fixed a crash that occurred when OpenAI-compatible APIs sent usage chunks with empty or null choices arrays at the end of streaming (v3.49.1)
  • Fixed the incorrect model ID for the Kat Coder Pro Free model (v3.49.1)
  • Fixed duplicate error messages during streaming for the Diff Edit tool when parallel tool calling was disabled (v3.46.0)
  • Corrected typos in the Gemini system prompt overrides (v3.46.0)
  • Resolved issues with model-picker favorites ordering, star-toggle functionality, and keyboard navigation for the OpenRouter and Vercel AI Gateway providers (v3.46.0)
  • Fixed an issue where remote configuration values were not being fetched from the cache (v3.46.0)
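Two of the streaming crashes above (the v3.67.0 reasoning-delta crash and the v3.49.1 empty/null `choices` crash) stem from the same pitfall: OpenAI-compatible APIs may end a stream with a usage-only chunk that carries no `choices` entries. The defensive pattern is to check for that case before indexing. A minimal sketch under that assumption, with `extract_delta_text` as an illustrative name rather than Cline's actual function:

```python
from typing import Optional

def extract_delta_text(chunk: dict) -> Optional[str]:
    """Return the streamed text delta from an OpenAI-style chunk,
    or None for usage-only chunks whose `choices` is empty or null."""
    choices = chunk.get("choices")
    if not choices:  # None or [] -> usage-only chunk, nothing to emit
        return None
    delta = choices[0].get("delta") or {}
    return delta.get("content")
```

Indexing `choices[0]` unconditionally is exactly what raises on the final usage chunk, so the early `return None` is the whole fix.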

Releases with TERMINAL Changes

v3.67.0 · 5 features · 7 fixes
Feb 24, 2026

This release introduces significant configuration flexibility by adding support for skills and file-based agent configurations via the new AgentConfigLoader. Key fixes address stability issues, including a reasoning-delta crash and compatibility problems with Gemini 3.1 Pro. Users will also benefit from reduced response latency due to WebSocket preconnection.

v3.65.0 · 1 feature · 3 fixes
Feb 18, 2026

This release introduces the new /skills slash command in the CLI, allowing users to easily view and manage their installed skills. Several critical bugs have been resolved, including issues related to context window management, file writing retries, and the default model selection.

cli-build-024bb65
Feb 12, 2026

This release is an automated CLI build from a specific commit (024bb65). No specific user-facing features, bug fixes, or improvements were detailed in these release notes beyond the build artifact itself.

v3.58.0 · 8 features · 9 fixes
Feb 12, 2026

This release introduces significant new capabilities, including native subagent support, parallel tool calling for Amazon Bedrock, and an experimental double-check completion feature for task verification. Several critical bugs were fixed across the CLI, VS Code Web, and the terminal, alongside UX improvements like moving reasoning effort into model settings.

cli-build-6278382
Feb 8, 2026

This release appears to be a routine build update from a specific commit (6278382). No user-facing features or bug fixes were explicitly detailed in these notes, suggesting it might be a maintenance or internal build.

cli-build-a03642b
Feb 8, 2026

This release is an automated CLI build from a specific commit (a03642b). No user-facing features, bug fixes, or improvements were explicitly detailed in these release notes.

cli-build-b2bfc3d
Feb 8, 2026

This release is an automated CLI build from commit b2bfc3d. No specific user-facing features, fixes, or improvements were detailed in these release notes.

cli-build-6b07e8c
Feb 7, 2026

This release appears to be a routine automated build from a specific commit (6b07e8c). No user-facing features, bug fixes, or improvements were explicitly detailed in these notes.

cli-build-4d455ea
Feb 7, 2026

This release is a direct build from commit 4d455ea. No specific user-facing features, fixes, or improvements were detailed in these release notes, only the installation instructions for this specific build.

v3.49.1 · 1 feature · 5 fixes
Jan 13, 2026

This release focuses primarily on stability and compatibility fixes, addressing issues with model selection, streaming API responses, and improving the search functionality for workflow slash commands. It also introduces a phased rollout for the Responses API usage.

v3.46.0 · 3 features · 4 fixes
Dec 22, 2025

This release introduces the new GLM 4.7 model and the Apply Patch tool for GPT-5+ models. We have significantly enhanced background terminal execution capabilities and resolved several bugs related to model picker navigation and error message duplication.