Commit a1c8e74

update langchain intro
1 parent 03fb84c commit a1c8e74

File tree: 1 file changed, +29 −1 lines changed

langchain.mdx

Lines changed: 29 additions & 1 deletion
@@ -21,7 +21,14 @@ Create a new Python file named `app.py` in your project directory. This file wil

 In `app.py`, import the necessary packages and define a factory function decorated with [@cl.langchain_factory](/api-reference/langchain/langchain-factory) that returns any LangChain instance. In this tutorial, we are going to use `LLMChain` to keep it simple. Here's the basic structure of the script:

-```python
+<Note>
+If your agent does not have an async implementation, fall back to the sync
+implementation.
+</Note>
+
+<CodeGroup>
+
+```python Async
 import os
 from langchain import PromptTemplate, OpenAI, LLMChain
 import chainlit as cl
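
The `<Note>` above distinguishes chains that expose an async API from chains that only run synchronously. As a rough plain-Python illustration of that fallback idea (no LangChain or Chainlit involved; `run_chain_sync` is a hypothetical stand-in for a sync-only chain), a synchronous callable can still be driven from async code by offloading it to a worker thread:

```python
import asyncio

def run_chain_sync(question: str) -> str:
    # Hypothetical stand-in for a chain that only has a synchronous run path.
    return f"Answer to: {question}"

async def main() -> str:
    # With no native async implementation, offload the sync call to a thread
    # so the event loop is not blocked while the chain runs.
    return await asyncio.to_thread(run_chain_sync, "What is Chainlit?")

result = asyncio.run(main())
```

This is only a sketch of the general pattern; Chainlit's `use_async=False` flag (shown in the Sync variant below in the diff) is the supported way to register a sync factory.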
@@ -40,6 +47,27 @@ def factory():
     return llm_chain
 ```

+```python Sync
+import os
+from langchain import PromptTemplate, OpenAI, LLMChain
+import chainlit as cl
+
+os.environ["OPENAI_API_KEY"] = "YOUR_OPEN_AI_API_KEY"
+
+template = """Question: {question}
+
+Answer: Let's think step by step."""
+
+@cl.langchain_factory(use_async=False)
+def factory():
+    prompt = PromptTemplate(template=template, input_variables=["question"])
+    llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0), verbose=True)
+
+    return llm_chain
+```
+
+</CodeGroup>
+
This function sets up an instance of `LLMChain` with a custom `PromptTemplate`. The `LLMChain` is responsible for generating responses based on the input provided by users.
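
For this simple single-variable template, the substitution `PromptTemplate` performs is equivalent to Python's own `str.format`; a minimal dependency-free sketch of what the prompt looks like after filling in `question`:

```python
# The template from the diff above; {question} is its single input variable.
template = """Question: {question}

Answer: Let's think step by step."""

# PromptTemplate(template=template, input_variables=["question"]) produces
# the same filled-in string as str.format for this basic f-string-style template.
prompt = template.format(question="What is 5 + 7?")
```

The resulting string is what `LLMChain` sends to the `OpenAI` LLM as the completion prompt.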

Behind the scenes, Chainlit takes care of:
