v2.1.0
🚀 Features / Enhancements
- Add new System Prompts module (example)
- Add a LangChain Embedding Function and extend the embedding service with a truncation parameter, sketched after this list (general example, example for ChromaDB)
- LangChain: the LLMInterface constructor now also accepts plain dictionaries in addition to Pydantic models
- Add an example of creating a LangChain Agent with Tools (llama-2-70b-chat model) (example)
- Add the ability to inspect a service method's metadata (for instance, to retrieve the underlying endpoint) (example)
- Add support for the latest LangChain / LlamaIndex / Transformers versions
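A minimal sketch of the extended embedding call, assuming the v2 `Client`/`Credentials` entry points; the `TextEmbeddingParameters` schema, the `truncate_input_tokens` field, and the model id are illustrative assumptions, not confirmed API:

```python
from genai import Client, Credentials
from genai.schema import TextEmbeddingParameters  # assumed name/location of the embedding schema

# Credentials are read from the GENAI_KEY / GENAI_API environment variables.
client = Client(credentials=Credentials.from_env())

# The service may yield one response per internal batch of inputs, so iterate over the call.
for response in client.text.embedding.create(
    model_id="sentence-transformers/all-minilm-l6-v2",  # illustrative model id
    inputs=["What is a Generator?", "How do embeddings work?"],
    # New in 2.1.0: ask the service to truncate inputs that exceed the model's
    # context window instead of failing (field name is an assumption).
    parameters=TextEmbeddingParameters(truncate_input_tokens=True),
):
    print(response.results)  # embedding vectors for this batch (exact shape may differ)
```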
🐛 Bug Fixes
- LangChain: generation and streaming now work correctly with a custom prompt_id and data (example); see the streaming sketch after this list
- Improve batching of large payloads (tokenization)
- Improve handling of concurrency limits (text generation / embeddings)
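For context on the LangChain fix above, a minimal streaming sketch; the `LangChainInterface` extension class, the model id string, and the dictionary parameter names are assumptions made for illustration (passing a plain dict also demonstrates the constructor enhancement listed under Features):

```python
from genai import Client, Credentials
from genai.extensions.langchain import LangChainInterface  # assumed extension class name

client = Client(credentials=Credentials.from_env())

llm = LangChainInterface(
    client=client,
    model_id="meta-llama/llama-2-70b-chat",  # illustrative model id
    # New in 2.1.0: a plain dict is accepted here in addition to a Pydantic model.
    parameters={"max_new_tokens": 100, "temperature": 0.7},
)

# The fix above concerns generation/streaming when a custom prompt_id and data are
# supplied; this sketch shows only plain streaming via LangChain's standard interface.
for chunk in llm.stream("Describe the moon in one sentence."):
    print(chunk, end="")
```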
🔧 General Changes
- Schemas are now exported from genai.schema (the old way of importing still works but emits a warning; all examples have been updated); see the import sketch after this list
- Updated documentation:
  - Added a version selector in the left sidebar
  - Added a copy button to code examples
  - Added a Changelog page (a grouped list of commits and the list of API endpoints used)
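A short sketch of the new import path; the class names are illustrative (any schema previously imported from an individual submodule should now be importable from `genai.schema`):

```python
# New in 2.1.0: schemas are re-exported from a single module.
from genai.schema import DecodingMethod, TextGenerationParameters  # illustrative class names

parameters = TextGenerationParameters(
    decoding_method=DecodingMethod.SAMPLE,
    max_new_tokens=50,
)

# Importing the same classes from their old submodule paths still works for now,
# but emits a warning as described above.
```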
⬆️ How to upgrade?
Run `pip install ibm-generative-ai --upgrade`