Tabby Chat Component
Updates related to the chat component of Tabby.
All CHAT Features
- Enabled support for indexing GitLab Merge Requests as context for better code suggestions. (v0.30.0)
- Added support to convert Answer Engine messages into persistent, shareable Pages. (v0.28.0)
- Enabled Doc Query in the Chat Panel, allowing Developer Documentation to be included as context for responses. (v0.28.0)
- Added support for executing shell commands directly within the Chat Panel. (v0.27.0)
- Enabled the use of @changes context within the Chat Panel to include uncommitted changes in prompts. (v0.27.0)
- Introduced a security option to hide the password login interface on the frontend, requiring a URL parameter (`passwordSignIn=true`) to reveal it. (v0.27.0)
- Enabled the Answer Engine to access the repository's commit history for more context. (v0.26.0)
- Added support for displaying user chat history on both the Homepage and the Chat Side Panel. (v0.26.0)
- Exposed the thinking process of the Answer Engine within thread messages. (v0.25.0)
- Enabled the Answer Engine to access the repository's directory file list as needed. (v0.25.0)
- Enabled the use of the "@" symbol to mention a symbol in the Chat Sidebar. (v0.25.0)
- Provided repository-aware default question recommendations in the Answer Engine. (v0.25.0)
- Implemented LDAP Authentication Integration for secure access. (v0.24.0)
- Added notifications for unsuccessful background jobs to keep users informed. (v0.24.0)
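As an illustration of the hidden password-login option above: only the `passwordSignIn=true` parameter comes from the release notes, while the hostname below is hypothetical. The hidden sign-in form can be revealed by appending the parameter to the instance URL, for example:

```
https://tabby.example.com/?passwordSignIn=true
```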
All CHAT Bug Fixes
- Fixed the issue where buttons within code snippets in a chat response would flicker during answer generation. (v0.30.0)
- Resolved the problem encountered when loading a multi-part model from a local source. (v0.30.0)
- Fixed plugin crashes on Windows systems caused by certain escaped characters. (v0.27.1)
- Resolved hanging of the Tabby server while waiting for the registry file to download in offline environments. (v0.27.1)
- Resolved an issue where chat functionality failed when using OpenAI reasoning models like `o3-mini` and `o1-mini`. (v0.27.0)
- Resolved the deserialization issue related to "finish_reason" when receiving chat responses from the LiteLLM Proxy Server. (v0.25.0)
- Fixed a bug that prevented the client code context in historical messages from being added to the prompt. (v0.24.0)
- Resolved an issue causing integration errors when using recent versions of Jan AI. (v0.24.0)
- Resolved an issue where repositories specified in config.toml were not synchronizing correctly. (v0.24.0)
- Resolved an issue that caused model download failures due to changes in the HuggingFace API. (v0.24.0)
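Several of the fixes above touch the server's `config.toml` (OpenAI-compatible chat models, repository synchronization). The sketch below is a minimal, hedged example assuming Tabby's documented `[model.chat.http]` and `[[repositories]]` table layout; the endpoint, API key, and repository values are placeholders, not taken from the release notes:

```toml
# Chat model served through an OpenAI-compatible API,
# e.g. the o3-mini reasoning model covered by the v0.27.0 fix.
[model.chat.http]
kind = "openai/chat"
model_name = "o3-mini"
api_endpoint = "https://api.openai.com/v1"
api_key = "sk-..."

# A repository registered for indexing; entries like this were
# affected by the synchronization issue fixed in v0.24.0.
[[repositories]]
name = "my-project"
git_url = "https://github.com/example/my-project.git"
```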
Releases with CHAT Changes
v0.30.0 (1 feature, 2 fixes): This release introduces the ability to index GitLab Merge Requests as context, significantly improving suggestion relevance. Key improvements include leveraging the Answer Engine for better page generation quality and resolving UI flickering issues in chat responses. Note that the default Docker image now requires an NVIDIA GPU.
v0.28.0 (2 features): This release introduces the powerful "Convert to Page" feature, allowing users to transform Answer Engine responses into organized, shareable Pages. Additionally, support for Doc Query in the Chat Panel has been added, enabling the use of Developer Documentation as context.
v0.27.1 (2 fixes): This patch release focuses on stability and reliability, resolving critical issues that affected Windows users and offline server operations. Users should also review the full release notes for version 0.27.0 for broader context.
v0.27.0 (3 features, 1 fix): This release introduces powerful new capabilities for the Chat Panel, allowing users to execute shell commands and include uncommitted changes in context. Key fixes include resolving chat functionality issues with specific OpenAI reasoning models. Additionally, background job logging has been improved for better stability with large repositories.
v0.26.0 (2 features): This release introduces significant context capabilities by allowing the Answer Engine to access commit history. Users will also notice that their chat history is now visible on the Homepage and Side Panel. Additionally, documentation crawling has been improved when using llms-full.txt.
v0.25.1: This is a patch release focusing on minor refinements rather than major new features or bug fixes. Users should review the full release notes for version 0.25.0 for comprehensive details. The primary visible change is an update to the Answer Engine UI.
v0.25.0 (4 features, 1 fix): This release introduces significant new capabilities for the Answer Engine, including exposing its thinking process and enabling repository file access. Users will also benefit from improved stability with automatic embedding retries and an enhanced user interface experience.
v0.24.0 (2 features, 4 fixes): This release introduces LDAP Authentication Integration and adds notifications for failed background jobs. Several critical bugs were fixed, including issues with historical context in prompts, Jan AI integration, and model downloads from HuggingFace. Performance improvements include limiting history retention and optimizing GitHub PR diff indexing.