
Fix for qwen model prompt (#514) #29


Merged: 3 commits, Mar 26, 2025.
19 changes: 2 additions & 17 deletions src/gaia/agents/Chaty/prompts.py
@@ -64,6 +64,7 @@ class Prompts:
         "chatglm": "You are ChatGLM3, a large language model trained by Zhipu.AI. Follow the user's instructions carefully. Respond using markdown.",
         "gemma": "You are Gemma, a helpful AI assistant. You provide clear, accurate, and technically-sound responses while maintaining a friendly demeanor.",
         "deepseek": "You are DeepSeek R1, a large language model trained by DeepSeek. You provide clear, accurate, and technically-sound responses while maintaining a friendly demeanor.",
+        "qwen": "You are Qwen, a helpful AI assistant. You provide clear, accurate, and technically-sound responses while maintaining a friendly demeanor.",
         "default": "You are a helpful AI assistant. You provide clear, accurate, and technically-sound responses while maintaining a friendly demeanor.",
         # Add other system messages here...
     }
@@ -259,23 +260,7 @@ def _match_model_name(model: str) -> str:
     @classmethod
     def get_system_prompt(cls, model: str, chat_history: list[str]) -> str:
         """Get the formatted system prompt for the given model and chat history."""
-        model_type = cls._match_model_name(model)
-        format_template = cls.prompt_formats[model_type]
-        system_message = cls.system_messages[model_type]
-
-        # Format system message
-        prompt = format_template["system"].format(system_message=system_message)
-
-        # Format chat history
-        for message in chat_history:
-            if message.startswith("user: "):
-                content = message[6:]  # Remove "user: " prefix
-                prompt += format_template["user"].format(content=content)
-            elif message.startswith("assistant: "):
-                content = message[11:]  # Remove "assistant: " prefix
-                prompt += format_template["assistant"].format(content=content)
-
-        return prompt
+        return cls.format_chat_history(model, chat_history)


 def main():