[BUG]: Embedded Chat Widget - not considering Query mode option. Always working in Chat mode. #3147
Labels
- Docker: Issue related to Docker
- investigating: Core team or maintainer will or is currently looking into this issue
- possible bug: Bug was reported but is not confirmed or is unable to be replicated
How are you running AnythingLLM?
Docker (local)
What happened?
Description
There appears to be a discrepancy between the responses AnythingLLM generates in the Workspace interface and those produced by the embedded chat widget. Specifically, the chat widget seems to disregard pinned documents and possibly the Retrieval-Augmented Generation (RAG) workflow, leading to inconsistent behavior and response content.
Observed Behavior
Frontend (Workspace):
Pinned documents work as expected; responses take the document content into account.
The RAG workflow is followed, producing relevant and coherent answers that align with the intended setup.
Chat Widget:
Responses differ significantly from those generated by the frontend.
Pinned documents and embedded information do not appear to be factored into the responses.
It seems as though the RAG process is being bypassed entirely.
Expected Behavior
Responses should be consistent between the AnythingLLM frontend and the embedded chat widget, since both use the same workspace configuration.
Instead, Chat mode always appears to be selected, irrespective of the Query mode setting at the workspace level or the embed code level.
Are there known steps to reproduce?
1. Upload documents and pin them.
2. Configure the workspace in Query mode so that only the documents are considered. The RAG system works correctly in the workspace.
3. Create a new embedded chat widget from the same workspace, with no changes to the settings. The widget is supposed to follow the workspace's mode setup, but it always operates in Chat mode, irrespective of the Query mode selection.
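For reference, the widget in step 3 was added to the page with a script snippet along these lines. This is a minimal sketch: the embed ID, host, and script URL are placeholders, and the attribute names are assumed from the anythingllm-embed project, so they may differ from the actual setup.

```html
<!-- Hypothetical embed snippet with placeholder values.
     Note: the chat/query mode setting lives server-side in the
     workspace embed configuration, not in these attributes -->
<script
  data-embed-id="00000000-0000-0000-0000-000000000000"
  data-base-api-url="http://localhost:3001/api/embed"
  src="https://example.com/anythingllm-chat-widget.min.js">
</script>
```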
Please advise on this issue.