- Feature: Add count API endpoint for datastores and databases
- Feature: Add connections/status API endpoint
- Feature: Add connections/profiles API endpoint
- Feature: Add docker container for easy deployment
- Feature: Add phala deployment example
- Fix: Add missing init() calls in auth server
- Fix: Detect all shutdown signals and close server
- Feature: Support ongoing background syncing of user data
- Feature: Initial sync continues until no more data available
- Feature: Add support for API request usage tracking
- Feature: Support depositing VDA to add credits
- Feature: Support schema URI in the profile prompt endpoint
- Feature: Support new auth-token-based authentication
- Feature: Auth tokens track who is paying (user or app)
- Fix: Timeout detection was always incorrect due to the handler start log message
- Fix: Only log that a sync is starting once it has passed all checks
- Fix: Correctly handle document update conflicts caused when a sync is requested a second time while one is already active
- Fix: Correctly populate handler sync start and end timestamps
- Feature: Support separate handler vs connection timeout config
- Feature: Add `/llm/agent` endpoint that uses a LangChain agent with tools
- Feature: Auto-summarize long emails when adding them into the RAG context window, without using an LLM (fast and cheap)
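The LLM-free email summarization above could be as simple as extractive truncation: keep leading sentences until a size budget is hit. A minimal sketch (the function name and budget are assumptions, not the project's actual implementation):

```python
import re

def summarize_email(body: str, max_chars: int = 500) -> str:
    """Cheap, LLM-free summary: keep leading sentences until the budget is hit."""
    # Collapse whitespace so the budget reflects visible text.
    text = re.sub(r"\s+", " ", body).strip()
    if len(text) <= max_chars:
        return text
    # Split on sentence-ending punctuation followed by a space.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    summary = ""
    for sentence in sentences:
        if len(summary) + len(sentence) + 1 > max_chars:
            break
        summary = f"{summary} {sentence}".strip()
    # Fall back to a hard truncation if even the first sentence exceeds the budget.
    return summary or text[:max_chars]
```

Because it never calls a model, this keeps long emails inside the RAG context window at negligible cost.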
- Feature: Support LLM internet access via `/llm/agent` tools
- Feature: Ensure agent-based prompts always adhere to the context window size
- Feature: Add github, slack, spotify, discord as upcoming connectors
- Feature: Output provider descriptions from the config in the `/providers` endpoint
- Feature: Support create and update API endpoints
- Feature: Support new `personalProfile` endpoint for creating JSON profiles based on user data
- Feature: Include LLM response time in API responses
- Feature: Add support for directly using the AWS Bedrock SDK to access Llama 3.x via inference profiles
- Feature: Support default provider and model in config
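The default provider and model might be expressed in config along these lines (the key names and values here are illustrative assumptions, not the project's actual schema):

```json
{
  "llms": {
    "defaultProvider": "bedrock",
    "defaultModel": "meta.llama3-70b-instruct-v1:0",
    "providers": {
      "bedrock": {
        "description": "AWS Bedrock (Llama 3.x via inference profiles)"
      }
    }
  }
}
```

Per-provider `description` fields like the one above are what the `/providers` endpoint could surface.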
- Feature: Support LLM token limits, flag when no system prompt is available for the LLM, and include LLM info in API responses
- Feature: Remove auto registration of new DIDs
- Feature: Make RAG more configurable via API endpoint parameters
- Fix: CORS was incorrectly configured
- Fix: Improve handling of JSON responses, auto-cleanup when LLM returns invalid JSON
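Auto-cleanup of invalid LLM JSON typically means stripping markdown fences and surrounding prose before reparsing. A best-effort sketch (the function name and cleanup rules are assumptions, not the project's actual code):

```python
import json
import re

def parse_llm_json(raw: str):
    """Best-effort parse of JSON returned by an LLM.

    LLMs often wrap JSON in markdown fences or surround it with prose;
    strip that noise before giving up.
    """
    # Fast path: the response is already valid JSON.
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        pass
    # Drop markdown code fences such as ```json ... ```
    cleaned = re.sub(r"```(?:json)?", "", raw)
    # Keep only the outermost {...} or [...] span.
    match = re.search(r"[\[{].*[\]}]", cleaned, re.DOTALL)
    if not match:
        raise ValueError("no JSON object found in LLM response")
    return json.loads(match.group(0))
```

This recovers responses like ``Here is the result: ```json {...} ``` `` without a retry round-trip to the model.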
- Fix: Escape HTML on client side to avoid JS code injection risks
- Fix: Calendar events were not indexed and available for the personal prompt
- Alpha release (Telegram and Google)