Fix for qwen model prompt (#514) #29


Merged
merged 3 commits into from
Mar 26, 2025

Conversation

kovtcharov
Collaborator

Fix Qwen model prompt formatting:

  • Add Qwen system message to system_messages dictionary
  • Refactor get_system_prompt to use format_chat_history for consistent formatting
  • Fix empty assistant responses and extra newlines in Qwen chat format
  • Fix the following error message:
```
[2025-03-24 14:30:04] | ERROR | aiohttp.server.log_exception | web_protocol.py:451 | Error handling request from 127.0.0.1
Traceback (most recent call last):
  File "C:\Users\kalin\miniconda3\envs\gaiaenv3\lib\site-packages\aiohttp\web_protocol.py", line 480, in _handle_request
    resp = await request_handler(request)
  File "C:\Users\kalin\miniconda3\envs\gaiaenv3\lib\site-packages\aiohttp\web_app.py", line 569, in _handle
    return await handler(request)
  File "C:\Users\kalin\Work\gaia3\src\gaia\agents\agent.py", line 204, in _on_prompt_received
    response = self.prompt_received(data["prompt"])
  File "C:\Users\kalin\Work\gaia3\src\gaia\agents\Chaty\app.py", line 41, in prompt_received
    response = self.prompt_llm(prompt)
  File "C:\Users\kalin\Work\gaia3\src\gaia\agents\Chaty\app.py", line 31, in prompt_llm
    prompt = Prompts.get_system_prompt(
  File "C:\Users\kalin\Work\gaia3\src\gaia\agents\Chaty\prompts.py", line 264, in get_system_prompt
    system_message = cls.system_messages[model_type]
KeyError: 'qwen'
```

closes #24
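The change described above can be sketched as follows. This is a hypothetical illustration, not the repository's actual code: the names `Prompts`, `system_messages`, `format_chat_history`, and `get_system_prompt` come from the PR description and traceback, but their bodies, signatures, and the system-message wording are assumptions. The sketch shows the three fixes together: a `"qwen"` entry so the dictionary lookup no longer raises `KeyError: 'qwen'`, a `format_chat_history` helper that renders Qwen's ChatML-style turns with single newlines, and a skip of empty assistant turns.

```python
class Prompts:
    # Adding a "qwen" entry fixes the KeyError: 'qwen' raised by the
    # dictionary lookup in get_system_prompt. Message wording is assumed.
    system_messages = {
        "llama": "You are a helpful assistant.",
        "qwen": "You are a helpful assistant.",  # new entry for Qwen models
    }

    @staticmethod
    def format_chat_history(history, model_type):
        """Render chat history in Qwen's ChatML-style format.

        Empty assistant turns are skipped and turns are joined with a
        single newline, avoiding the extra blank lines in the prompt.
        """
        parts = []
        for msg in history:
            content = msg["content"].strip()
            if msg["role"] == "assistant" and not content:
                continue  # drop empty assistant responses
            parts.append(f"<|im_start|>{msg['role']}\n{content}<|im_end|>")
        return "\n".join(parts)

    @classmethod
    def get_system_prompt(cls, model_type, history):
        # Refactored to route through format_chat_history so the system
        # message and the chat turns share one formatting path.
        system_message = cls.system_messages[model_type]
        turns = [{"role": "system", "content": system_message}] + history
        chat = cls.format_chat_history(turns, model_type)
        # Open the assistant turn so the model continues from here.
        return chat + "\n<|im_start|>assistant\n"
```

With this shape, `Prompts.get_system_prompt("qwen", history)` no longer raises, and an empty assistant message in `history` is simply omitted from the rendered prompt.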

@kovtcharov kovtcharov added the bug Something isn't working label Mar 25, 2025
@kovtcharov kovtcharov self-assigned this Mar 25, 2025
@kovtcharov kovtcharov mentioned this pull request Mar 25, 2025
@kovtcharov kovtcharov enabled auto-merge (squash) March 26, 2025 16:23
@kovtcharov kovtcharov merged commit 7d3e77e into main Mar 26, 2025
4 checks passed
@kovtcharov kovtcharov deleted the kalin/qwen branch March 26, 2025 22:59
Labels
bug Something isn't working
Development

Successfully merging this pull request may close these issues.

Cant start Qwen-1.5
3 participants