Properly close chat stream iterator even if response generation fails
Previously, the chat stream iterator wasn't closed when response streaming
for the offline chat model threw an exception. The stream would then hang,
requiring an application restart. Now the application doesn't hang even if
the current response generation fails with an exception.
debanjum committed Aug 23, 2024
1 parent bdb8126 commit 5927ca8
Showing 1 changed file with 1 addition and 1 deletion: src/khoj/processor/conversation/offline/chat_model.py
```diff
@@ -224,7 +224,7 @@ def llm_thread(g, messages: List[ChatMessage], model: Any, max_prompt_size: int
             g.send(response["choices"][0]["delta"].get("content", ""))
     finally:
         state.chat_lock.release()
-    g.close()
+        g.close()


 def send_message_to_model_offline(
```
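The fix moves `g.close()` into the `finally:` block, so the stream is closed whether the producer thread finishes normally or raises. A minimal, self-contained sketch of why that matters — using a hypothetical queue-backed `ThreadedGenerator` in place of khoj's actual class, with illustrative names and a simulated model failure — might look like:

```python
import threading
import queue


class ThreadedGenerator:
    """Hypothetical stand-in for `g`: a queue-backed iterator that a
    producer thread feeds chunks into. khoj's real class may differ."""

    def __init__(self):
        self.queue = queue.Queue()
        self.closed = False

    def __iter__(self):
        return self

    def __next__(self):
        item = self.queue.get()  # blocks until the producer sends or closes
        if item is StopIteration:
            raise StopIteration
        return item

    def send(self, data):
        self.queue.put(data)

    def close(self):
        # The sentinel unblocks any consumer waiting on the queue.
        self.queue.put(StopIteration)
        self.closed = True


def llm_thread(g, chunks, fail_at=None):
    """Producer mirroring the patched shape: `g.close()` sits in
    `finally`, so it runs on success *and* on exception."""
    try:
        for i, chunk in enumerate(chunks):
            if fail_at is not None and i == fail_at:
                raise RuntimeError("simulated model failure")
            g.send(chunk)
    finally:
        g.close()  # before the fix, an exception above skipped this


def run(g):
    try:
        llm_thread(g, ["Hel", "lo"], fail_at=1)
    except RuntimeError:
        pass  # the exception still propagates, after cleanup ran


g = ThreadedGenerator()
t = threading.Thread(target=run, args=(g,))
t.start()
received = list(g)  # drains chunks produced before the failure, then stops
t.join()
```

With `g.close()` outside the `finally:` block, the simulated failure would skip the sentinel and `list(g)` would block on `queue.get()` forever — the hang the commit message describes.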
