Issues running ChatPromptBuilder in a serialized pipeline #61
Hello! As I mentioned on Discord, this is related to the recent ChatMessage refactoring. If I use Haystack 2.9.0, your code produces the following YAML (a sketch of how such a pipeline can be built and serialized appears at the end of this comment):

components:
  llm:
    init_parameters:
      chat_template: null
      generation_kwargs:
        max_new_tokens: 150
        return_full_text: false
        stop_sequences: []
      huggingface_pipeline_kwargs:
        model: Qwen/Qwen2.5-1.5B-Instruct
        task: text-generation
      streaming_callback: null
      token:
        env_vars:
        - HF_API_TOKEN
        - HF_TOKEN
        strict: false
        type: env_var
    type: haystack.components.generators.chat.hugging_face_local.HuggingFaceLocalChatGenerator
  prompt_builder:
    init_parameters:
      required_variables: null
      template:
      - _content:
        - text: "\n Please create a summary about the following\
          \ topic:\n {{ topic }}\n "
        _meta: {}
        _name: null
        _role: user
      variables: null
    type: haystack.components.builders.chat_prompt_builder.ChatPromptBuilder
connections:
- receiver: llm.messages
  sender: prompt_builder.prompt
max_runs_per_component: 100
metadata: {}

(The template serialization differs.)

This pipeline can be successfully deployed on Hayhooks. The example request body (in the API docs) is:

{
"llm": {
"generation_kwargs": {}
},
"prompt_builder": {
"topic": "",
"template": [
{
"_role": "user",
"_content": [
{
"text": "string"
},
{
"tool_name": "string",
"arguments": {},
"id": "string"
},
{
"result": "string",
"origin": {
"tool_name": "string",
"arguments": {},
"id": "string"
},
"error": true
}
],
"_name": "string",
"_meta": {}
}
],
"template_variables": {}
}
}

Using:

{
"llm": {
"generation_kwargs": {}
},
"prompt_builder": {
"topic": "NLP"
}
}

I get a correct response:

{
"llm": {
"replies": [
{
"_role": "assistant",
"_content": [
{
"text": "Sure! Here's a concise summary of NLP..."
}
],
"_name": null,
"_meta": {
"finish_reason": "length",
"index": 0,
"model": "Qwen/Qwen2.5-1.5B-Instruct",
"usage": {
"completion_tokens": 150,
"prompt_tokens": 44,
"total_tokens": 194
}
}
}
]
}
}

Let me know if this helps...
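For reference, here is a minimal sketch of how a pipeline like the one serialized above can be built and round-tripped. This is an illustration assuming Haystack 2.9.x, not the original code; the component names and model mirror the YAML, and the prompt text is shortened:

```python
# Build the same two-component pipeline and serialize it.
# Minimal sketch assuming Haystack 2.9.x; illustrative, not the exact original code.
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import HuggingFaceLocalChatGenerator
from haystack.dataclasses import ChatMessage

template = [
    ChatMessage.from_user(
        "Please create a summary about the following topic: {{ topic }}"
    )
]

pipe = Pipeline()
pipe.add_component("prompt_builder", ChatPromptBuilder(template=template))
pipe.add_component(
    "llm",
    HuggingFaceLocalChatGenerator(
        model="Qwen/Qwen2.5-1.5B-Instruct",
        generation_kwargs={"max_new_tokens": 150},
    ),
)
pipe.connect("prompt_builder.prompt", "llm.messages")

yaml_str = pipe.dumps()              # template is stored as ChatMessage dicts
restored = Pipeline.loads(yaml_str)  # the 2.9.x YAML round-trips cleanly
```

Serializing and deserializing with the same Haystack version is what keeps the template field in the YAML consistent with what ChatPromptBuilder expects at load time.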
This helps - thank you!
Hello team,
I am hoping you can help me identify what I am doing wrong when making a POST request to my pipeline. I was able to deploy it, but my requests fail. I thought I could simply pass the topic, but I get this error (a minimal working request is sketched at the end of this comment):
ValueError: The ChatPromptBuilder expects a list containing only ChatMessage instances. The provided list contains other types. Please ensure that all elements in the list are ChatMessage instances.
Here is a simple pipeline I built
The serialized version
Sample post command
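For illustration, here is a minimal POST of the kind that works with the 2.9.0 serialization shown in the earlier comment. Only the prompt_builder inputs are sent, so the template stored in the serialized pipeline is used; the pipeline name my_pipeline and the default Hayhooks port 1416 are placeholders, not values from this setup:

```python
# Minimal sketch of a working request. It sends only the topic and omits
# "template", so ChatPromptBuilder falls back to the serialized template.
# "my_pipeline" and port 1416 are assumed placeholders.
import requests

resp = requests.post(
    "http://localhost:1416/my_pipeline",        # Hayhooks exposes POST /{pipeline_name}
    json={"prompt_builder": {"topic": "NLP"}},
)
print(resp.json())
```

If a payload like this still fails, the likely cause, per the comment above, is a version mismatch: the YAML was serialized with a Haystack version whose ChatMessage format differs from the one Hayhooks runs, so the deserialized template no longer contains ChatMessage instances.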