Description
Currently, AFFiNE uses the OpenAI provider by default, which limits users to a specific set of models. I propose adding support for additional LLM providers, ideally through integration with openrouter.ai, so that users can choose from a wide variety of language models, including DeepSeek, without being locked into one provider.
Use case
This feature would allow users to select the best-suited LLM for their needs. For example, I want to integrate the DeepSeek API into my AFFiNE self-hosted server, but the hardcoded model validation in the current OpenAI provider source code prevents using my preferred model. Supporting openrouter.ai would offer more flexibility and better meet diverse user requirements.
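For reference, openrouter.ai exposes an OpenAI-compatible API, so in principle the existing OpenAI client could be reused with a configurable base URL and an unrestricted model id instead of a hardcoded allowlist. Below is a minimal sketch of that idea, not AFFiNE's actual provider code; the model id and environment variable name are just examples:

```ts
// Sketch: pointing the OpenAI SDK at OpenRouter's OpenAI-compatible endpoint.
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://openrouter.ai/api/v1', // OpenRouter's OpenAI-compatible API
  apiKey: process.env.OPENROUTER_API_KEY,  // illustrative env var for this sketch
});

async function main() {
  const completion = await client.chat.completions.create({
    model: 'deepseek/deepseek-chat', // any model id available on OpenRouter
    messages: [{ role: 'user', content: 'Summarize this document in one sentence.' }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```

If the provider simply treated the model name as an opaque string passed through to the configured endpoint, users could plug in DeepSeek or any other OpenRouter-hosted model without code changes.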
Anything else?
I really appreciate the solid work done on AFFiNE, especially the innovative Frames feature, which is both practical and creative. It’s clear that a lot of thought and effort has gone into making AFFiNE a powerful tool, and having it as an open source project is an incredible resource for the community. The Frames feature even reminds me somewhat of Prezi, providing a dynamic and engaging way to present content. Supporting additional LLMs would significantly enhance AFFiNE's versatility and overall value, enabling more tailored integrations for different use cases without locking users into a single provider.
Are you willing to submit a PR?
Yes, I'd like to help by submitting a PR!