[BUG]: Can't change the output token limit with novita ai #3046
Labels
investigating — Core team or maintainer will or is currently looking into this issue
possible bug — Bug was reported but is not confirmed or is unable to be replicated.
How are you running AnythingLLM?
Docker (local)
What happened?
When experimenting with DeepSeek, Novita seems to cap the output at 2048 tokens, even though the model should support up to 8196. Novita provides an example API implementation showing the higher limit:
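For reference, here is a minimal sketch of what a direct call with an explicit output token limit might look like. The base URL, model id, and the 8192 value are assumptions based on Novita's OpenAI-compatible API, not taken from AnythingLLM's code:

```python
# Hedged sketch: call Novita's OpenAI-compatible endpoint directly with an
# explicit max_tokens, to compare against what AnythingLLM sends.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.novita.ai/v3/openai",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_NOVITA_API_KEY",
)

response = client.chat.completions.create(
    model="deepseek/deepseek-r1",  # assumed model id; substitute the DeepSeek model you selected
    messages=[{"role": "user", "content": "Explain the issue in detail."}],
    max_tokens=8192,  # the output token limit that does not appear to take effect in AnythingLLM
)

print(response.choices[0].message.content)
```

Calling the endpoint directly this way can help confirm whether the 2048-token cap comes from Novita itself or from the value AnythingLLM passes for the provider.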
Are there known steps to reproduce?
Ask any complex question that is likely to produce a response longer than 2048 tokens.