Every workflow (or almost every one) will depend on an LLM service call. Sometimes these calls will fail due to unforeseen circumstances:
- the key is not valid
- the key has reached a token limit
- the LLM service is down
- the context was too large for the model's window and inference failed
When something like that happens, we want to make sure we handle it:
- in the backend, properly log the error with the message returned from the LLM service
- in the frontend, properly explain to the user what the issue was, without leaking information the LLM error message may contain (best to have a few prepared messages and show one of those)
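A minimal sketch of the idea, assuming the backend is Python and the error categories below are hypothetical names (a real LLM SDK raises its own exception types, which would be mapped to these categories at the call site):

```python
import logging

logger = logging.getLogger("llm")

# Hypothetical error categories; the prepared, user-safe messages never
# echo anything returned by the LLM service.
USER_MESSAGES = {
    "invalid_key": "The AI service rejected our credentials. Please contact support.",
    "token_limit": "The AI service quota has been reached. Please try again later.",
    "service_down": "The AI service is temporarily unavailable. Please try again later.",
    "context_too_large": "Your request was too long for the AI service to process.",
    "unknown": "Something went wrong while contacting the AI service.",
}

def handle_llm_error(category: str, raw_error: str) -> str:
    """Log the raw LLM error server-side; return a prepared message for the UI."""
    # Backend: full detail goes to the logs only.
    logger.error("LLM call failed (%s): %s", category, raw_error)
    # Frontend: the user only ever sees a canned message.
    return USER_MESSAGES.get(category, USER_MESSAGES["unknown"])
```

The key property is that `raw_error` reaches the logs but never the return value, so nothing from the LLM service can leak to the user.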