v1.0.63-jetbrains
Summary
This release significantly expands LLM provider support, adding Tensorix, enabling Gemini via the AI SDK, and supporting Bedrock API key authentication. Key fixes address terminal link resolution and system-message parsing, and improve stability for DeepSeek and Ollama-hosted models. The CLI also gains session import/export and invokable skills.
New Features
- Added Tensorix as a new LLM provider.
- Enabled discovery of .continue/checks/ in the CLI's (cn) review mode.
- Introduced invokable skills and the ability to import any skill via the CLI.
- Enabled export and import functionality for chat sessions via the CLI.
- Added support for Qwen multi-file FIM template for repository-level autocompletion.
- Enabled support for Gemini models through the AI SDK.
- Added support for Bedrock API key authentication.
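The new provider options above are configured like any other model block in config.yaml. A minimal sketch, assuming Continue's standard models schema; the entry name, model ID, and the exact field used for Bedrock key authentication are illustrative assumptions, not a definitive reference:

```yaml
models:
  # Hypothetical entry: Bedrock with API key authentication.
  # The model ID and the key-auth field name are assumptions.
  - name: Claude (Bedrock)
    provider: bedrock
    model: anthropic.claude-3-5-sonnet-20240620-v1:0
    apiKey: ${{ secrets.BEDROCK_API_KEY }}
    env:
      region: us-east-1
```

Gemini models enabled through the AI SDK would be declared the same way, with provider set to gemini and a Gemini API key.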
Bug Fixes
- Fixed terminal links opening incorrect URLs when addresses contained ports.
- Removed the hardcoded Unix $ prompt prefix from the terminal UI.
- Fixed loading of rules into the system message.
- Strengthened the default Apply prompt used for local models.
- Skipped remote URIs when resolving the Model Context Protocol (MCP) server's current working directory (cwd).
- Fixed inclusion of the reasoning_content field for DeepSeek Reasoner models.
- Added a default timeout for terminal command tool execution.
- Fixed incorrect location resolution for the CLI configuration file in the registry client.
- Fixed the system-message tools parser when a tool call was non-terminal.
- Fixed Ollama MCP tool calling for Mistral and Gemma3 models.
- Fixed OpenAI API 400 errors related to reasoning, tool calls, and ID handling.
- Updated the model catalog to reflect retired and newly released Gemini models.
- Expanded tool support to reduce 'Invalid tool name' errors.
- Fixed handling of thinking/assistant messages appearing at the end of chat history.
- Guarded against non-string values for the NO_PROXY environment variable.
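The NO_PROXY guard can be sketched as follows. This is an illustrative helper, not Continue's actual code: environment variables are normally strings, but programmatic injection can supply other types, and the sketch treats any non-string value as "no proxy exclusions":

```typescript
// Illustrative sketch of guarding NO_PROXY parsing against non-string
// values; not Continue's actual implementation.
function parseNoProxy(raw: unknown): string[] {
  // Treat numbers, arrays, undefined, etc. as an empty exclusion list.
  if (typeof raw !== "string" || raw.trim() === "") {
    return [];
  }
  return raw
    .split(",")
    .map((host) => host.trim())
    .filter((host) => host.length > 0);
}
```

Centralizing the type check like this keeps every downstream proxy decision working with a plain string array.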
Improvements
- Hardened system message tools and wired toolOverrides to the system message path.
- Improved SSL certificate troubleshooting guidance documentation.
- Documented all CLI slash commands available in TUI mode.
- Updated default LLM configurations: removed Gemini 2.0 Flash and updated Claude defaults to 4.6.
- Sent the x-api-key header instead of api-key for the Azure-hosted Anthropic service.
- Updated JetBrains and VS Code versions (63 and 34 respectively).
- Improved error handling UX and moved stream retry logic to BaseLLM.
- Preserved indentation when applying code edits to Python files.
- Used the config.yaml name for the default Local Config profile.
- Restricted terminal childProcess.spawn to local-only environments.
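The indentation-preservation improvement for Python edits can be sketched as below. This is a hypothetical helper under the assumption that the edit applier knows the indentation of the code being replaced; Continue's real implementation may differ:

```typescript
// Illustrative sketch (not Continue's actual code): shift a replacement
// block so its outermost line matches the indentation of the code it
// replaces, preserving the block's relative (Python-significant) indents.
function shiftIndent(newLines: string[], targetIndent: string): string[] {
  const nonEmpty = newLines.filter((line) => line.trim().length > 0);
  if (nonEmpty.length === 0) return newLines;
  // Smallest existing indent across the block's non-empty lines.
  const minIndent = Math.min(
    ...nonEmpty.map((line) => line.length - line.trimStart().length)
  );
  return newLines.map((line) =>
    line.trim().length === 0 ? line : targetIndent + line.slice(minIndent)
  );
}
```

Shifting the whole block by one offset, rather than re-indenting each line independently, is what keeps nested Python bodies valid after the edit.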