- Fix circular dependencies in agent/tools
- Migrate from custom JSON to the OpenAI tool-calls format
- Add async streaming (step_stream, complete_stream)
- Simplify prompt system and remove token counting
- Add 5 new API endpoints (/health, /v1/models, /api/memory/*)
- Add 3 new tools (get_torrent_by_index, add_torrent_by_index, set_language)
- Fix all 500 tests and add coverage config (80% threshold)
- Add comprehensive docs (README, pytest guide)

BREAKING: LLM interface changed; memory injection now via get_memory()
"""LLM-related exceptions."""
|
|
|
|
|
|
class LLMError(Exception):
|
|
"""Base exception for LLM-related errors."""
|
|
|
|
pass
|
|
|
|
|
|
class LLMConfigurationError(LLMError):
|
|
"""Raised when LLM is not properly configured."""
|
|
|
|
pass
|
|
|
|
|
|
class LLMAPIError(LLMError):
|
|
"""Raised when LLM API returns an error."""
|
|
|
|
pass
|
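A brief sketch of how callers might use this hierarchy: because both concrete exceptions inherit from `LLMError`, code that doesn't care about the distinction can catch the base class alone. The class definitions are repeated here so the example runs standalone, and `call_llm` is a hypothetical stand-in for an LLM client call, not part of the actual module.

```python
class LLMError(Exception):
    """Base exception for LLM-related errors."""


class LLMConfigurationError(LLMError):
    """Raised when LLM is not properly configured."""


class LLMAPIError(LLMError):
    """Raised when LLM API returns an error."""


def call_llm(prompt, api_key=None):
    """Hypothetical stand-in for an LLM client call (illustration only)."""
    if api_key is None:
        raise LLMConfigurationError("no API key configured")
    if not prompt:
        raise LLMAPIError("empty prompt rejected by the API")
    return "ok"


# Catching LLMError handles both subclasses, so callers can treat
# configuration and API failures uniformly when the distinction
# does not matter at this layer.
try:
    call_llm("hello")  # raises LLMConfigurationError: no api_key given
except LLMError as exc:
    print(f"LLM call failed: {exc}")
```

Keeping `pass`-style leaf classes like this is a common pattern: the type itself carries the meaning, and downstream code can grow more specific handlers later without touching existing `except LLMError` blocks.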