
2024-02-27 (v1.2.0)

  • Feature: Add count API endpoint for datastores and databases
  • Feature: Add connections/status API endpoint
  • Feature: Add connections/profiles API endpoint
  • Feature: Add Docker container for easy deployment
  • Feature: Add Phala deployment example
  • Fix: Add missing init() calls in auth server
  • Fix: Detect all shutdown signals and close server

2024-02-27 (v1.1.0)

  • Feature: Support ongoing background syncing of user data
  • Feature: Initial sync continues until no more data available

2024-02-04 (v1.0.0)

  • Feature: Add support for API request usage tracking
  • Feature: Support for depositing VDA for adding credits
  • Feature: Support schema URI in profile prompt endpoint
  • Feature: Support new auth token based authentication
  • Feature: Support auth token tracking of who is paying (user or app)

2024-12-24 (v0.3.0)

  • Fix: Timeout detection was always incorrect due to handler start log message
  • Fix: Only log that a sync is starting if all checks pass
  • Fix: Correctly handle document update conflicts when a sync is requested while one is already active
  • Fix: Correctly populate handler sync start and end timestamps
  • Feature: Support separate handler vs connection timeout config
  • Feature: Add /llm/agent endpoint that uses langchain agent with tools
  • Feature: Auto-summarize long emails when adding them to the RAG context window, without using an LLM (fast and cheap)
  • Feature: Support LLM internet access via llm/agent tools
  • Feature: Ensure Agent based prompts always adhere to context window size
  • Feature: Add github, slack, spotify, discord as upcoming connectors
  • Feature: Output provider descriptions from the config in the /providers endpoint
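The context-window guarantee above could work along these lines. This is a hedged sketch, not the actual implementation: the function names and the rough 4-characters-per-token estimate are assumptions made for illustration.

```javascript
// Hypothetical sketch: keep RAG items within a token budget using a
// cheap character-based estimate instead of an LLM call.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

function fitToContextWindow(items, maxTokens) {
  const kept = [];
  let used = 0;
  for (const item of items) {
    const cost = estimateTokens(item);
    if (used + cost > maxTokens) break; // stop before exceeding the budget
    kept.push(item);
    used += cost;
  }
  return kept;
}
```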

2024-11-25 (v0.2.0)

  • Feature: Support create and update API endpoints
  • Feature: Support new personalProfile endpoint for creating JSON profiles based on user data
  • Feature: Include LLM response time in API responses
  • Feature: Add support for directly using the AWS Bedrock SDK to access Llama 3.x via inference profiles
  • Feature: Support default provider and model in config
  • Feature: Support LLM token limits, flag when no system prompt is available for the LLM, and include LLM info in API responses
  • Feature: Remove auto-registration of new DIDs
  • Feature: Make RAG more configurable via API endpoint parameters
  • Fix: CORS was incorrectly configured
  • Fix: Improve handling of JSON responses; auto-clean up when the LLM returns invalid JSON
  • Fix: Escape HTML on client side to avoid JS code injection risks
  • Fix: Calendar events were not indexed and available for the personal prompt
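The client-side HTML escaping fix above typically amounts to entity-encoding before insertion into the page. A minimal sketch (the function name is illustrative, not taken from the project):

```javascript
// Hypothetical sketch of the client-side escaping fix: convert HTML
// metacharacters to entities before inserting untrusted text (such as
// LLM output) into the page, so it cannot run as script.
function escapeHtml(text) {
  return text
    .replace(/&/g, '&amp;')   // must run first so entities are not double-escaped
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}
```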

2024-09-30

  • Alpha release (Telegram and Google)