Lunary helps developers of LLM chatbots build, monitor, and improve them.
- 🖲️ Conversation & feedback tracking
- 💵 Analytics (costs, tokens, latency, ...)
- 🔍 Debugging (logs, traces, user tracking, ...)
- ⛩️ Prompt Directory (versioning, team collaboration, ...)
- 🏷️ Create fine-tuning datasets
- 🧪 Automatic topic classification
It is also designed to be:
- 🤖 Usable with any model, not just OpenAI
- 📦 Easy to integrate (2 minutes)
- 🧑‍💻 Self-hostable
Lunary natively supports:
- LangChain (JS & Python)
- OpenAI module
- LiteLLM
- Flowise
Additionally, you can use it with any other LLM by sending events manually.
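As a minimal sketch of sending events manually, the snippet below builds a tracking event and POSTs it over HTTP. The endpoint path (`/v1/runs/ingest`) and the payload field names are assumptions for illustration, not the confirmed API schema — check the documentation for the real ingestion format.

```python
import json
import time
import urllib.request
import uuid

# Cloud API by default; point this at your self-hosted backend if needed
LUNARY_API_URL = "https://api.lunary.ai"

def build_event(event_type, run_id, name, extra=None):
    """Assemble a tracking event dict; field names are illustrative."""
    event = {
        "type": event_type,      # e.g. "llm" for a model call
        "event": name,           # e.g. "start" or "end"
        "runId": run_id,         # unique ID tying start/end events together
        "timestamp": time.time(),
    }
    event.update(extra or {})
    return event

def send_event(event, api_key):
    """POST the event to the (assumed) ingestion endpoint."""
    req = urllib.request.Request(
        f"{LUNARY_API_URL}/v1/runs/ingest",  # assumed path
        data=json.dumps({"events": [event]}).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Build (but don't send) a sample "LLM call started" event
start = build_event("llm", str(uuid.uuid4()), "start",
                    {"input": "Hello", "model": "gpt-4"})
```

Pairing a `start` and an `end` event under the same `runId` is what lets latency and token usage be attributed to a single call.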
Full documentation is available on the website.
We offer a hosted version with a free plan of up to 10k requests / month.
With the hosted version:
- 👷 don't worry about DevOps or managing updates
- 🙋 get priority 1:1 support with our team
- 🇪🇺 your data is stored safely in Europe
To self-host Lunary:
- Clone the repository
- Set up a PostgreSQL instance (version 15 or higher)
- Copy the contents of `packages/backend/.env.example` to `packages/backend/.env` and fill in the missing values
- Copy the contents of `packages/frontend/.env.example` to `packages/frontend/.env` and fill in the missing values
- Run `npm install`
- Run `npm run migrate:db`
- Run `npm run dev`
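For orientation, a filled-in backend `.env` might look something like the fragment below. The variable names here are assumptions for illustration — use the actual keys listed in `packages/backend/.env.example`:

```ini
# PostgreSQL connection string (version 15+); name is an assumed example
DATABASE_URL=postgresql://postgres:password@localhost:5432/lunary
# Secret used to sign auth tokens; replace with a random value
JWT_SECRET=change-me
```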
You can now open the dashboard at http://localhost:8080.
When using our JS or Python SDK, set the environment variable `LUNARY_API_URL` to `http://localhost:3333`. You can set `LUNARY_VERBOSE=True` to see all the events sent by the SDK.
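In Python, these two variables can also be set from code before the SDK is initialized — a minimal sketch using only the two variables named above:

```python
import os

# Point the SDK at the self-hosted backend instead of the cloud API
os.environ["LUNARY_API_URL"] = "http://localhost:3333"
# Log every event the SDK sends (useful when debugging the setup)
os.environ["LUNARY_VERBOSE"] = "True"
```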
Need help or have questions? Chat with us on the website or email us: hello [at] lunary.ai. We're here to help every step of the way.
This project is licensed under the Apache 2.0 License.