v0.4.3
📦 ragas · View on GitHub →
✨ 6 features · 🐛 7 fixes · 🔧 8 symbols
Summary
This release introduces advanced prompt optimization via DSPyOptimizer, adds system prompt support to several LLM wrappers, and fixes bugs related to caching, documentation, and CI configuration.
✨ New Features
- Added DSPyOptimizer with MIPROv2 for advanced prompt optimization.
- Added llms.txt generation for LLM-friendly documentation.
- Added DSPy caching.
- Added system prompt support for InstructorLLM and LiteLLMStructuredLLM.
- Added copy-to-llm button for easy AI tool integration.
- Added remaining quickstart templates.
🐛 Bug Fixes
- Fixed CI to use a PAT for docs-check, matching docs-apply.
- Enabled FactualCorrectness language adaptation.
- Resolved a DiskCacheBackend pickling issue when used with InstructorLLM.
- Lazily initialized DEFAULT_TOKENIZER to avoid network calls during import.
- Added ability to comment on failed tasks.
- Fixed DiscreteMetric llm examples in documentation to match the API.
- Added repository parameter to checkout action to support fork PRs.