# Breaking Changes: Model Configuration and Environment Variables Update #453
## Comments
I added the following in `models.json`:

```json
{
  "id": "qwen2.5:latest",
  "name": "Qwen 2.5",
  "provider": "Ollama",
  "providerId": "ollama",
  "enabled": true,
  "toolCallType": "manual",
  "toolCallModel": "phi4"
}
```

But I still don't see it in the UI.
Also, this OpenRouter config did not work:

```json
{
  "id": "openai/o3-mini",
  "name": "O3 Mini",
  "provider": "OpenAI Compatible",
  "providerId": "openai-compatible",
  "enabled": true,
  "toolCallType": "native"
}
```

I do have the following defined in my `.env`:

```
OPENAI_COMPATIBLE_API_KEY=REDACTED
OPENAI_COMPATIBLE_API_BASE_URL="https://openrouter.ai/api/v1"
```
I've fixed the model configuration issue. Please check the updated description and try it out. Let me know if you have any issues.
@miurla Just tried, and now it all works 🎉
I tried a few other providers, and here's the error I am getting:

```
Error in chat: Invalid arguments for tool search: Type validation failed:
Value: {"max_results":5,"query":"DeepSeek R1","search_depth":"basic"}.
Error message: [
  {
    "code": "invalid_type",
    "expected": "array",
    "received": "undefined",
    "path": ["include_domains"],
    "message": "Required"
  },
  {
    "code": "invalid_type",
    "expected": "array",
    "received": "undefined",
    "path": ["exclude_domains"],
    "message": "Required"
  }
]
```

The same query works fine on the other two Gemini models.
This error is related to the model's tool-calling functionality. Gemini models' native tool calling is not yet stable. You can resolve this issue by setting `toolCallType` to `"manual"` in the model configuration, as in the sketch below.
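A minimal sketch of that change, reusing the fields from the configs shared above; the Gemini `id`, `name`, `provider`, and `providerId` values here are assumptions, so substitute the ones from your own entry:

```json
{
  "id": "gemini-2.0-flash",
  "name": "Gemini 2.0 Flash",
  "provider": "Google Generative AI",
  "providerId": "google",
  "enabled": true,
  "toolCallType": "manual"
}
```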
Thanks for the fast response. I did try configuring it as manual, but I still get the same error. Gemini Flash is the best bang for the buck for deep research, so hopefully it works out.
@warlock666 It looks like your error is still related to native tool calling. The cookie storing your model configuration might not have been updated properly. Try the following to reset the configuration:
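A minimal sketch of one way to do the reset from the browser devtools console; the cookie name `selectedModel` is an assumption, so check the Application > Cookies panel for the actual name your deployment uses:

```js
// Run in the devtools console on the open Morphic tab.
// NOTE: the cookie name "selectedModel" is an assumption; inspect
// Application > Cookies in devtools to confirm the real name.
document.cookie =
  'selectedModel=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT'

// Reload so the app re-reads models.json and rewrites the cookie.
location.reload()
```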
This should force the browser to update the stored configuration and resolve the error. The system might be using cached settings that still have `toolCallType` set to `native`.
## Changes Overview

We have made two significant changes that require user action:

1. **Model Configuration Moved to JSON**: model settings now live in `public/config/models.json`
2. **Environment Variable Updates**: the following variables were removed:
   - `NEXT_PUBLIC_OLLAMA_MODEL`
   - `NEXT_PUBLIC_OLLAMA_TOOL_CALL_MODEL`
   - `NEXT_PUBLIC_AZURE_DEPLOYMENT_NAME`
   - `NEXT_PUBLIC_OPENAI_COMPATIBLE_MODEL`

   and this one was renamed:
   - `NEXT_PUBLIC_ENABLE_SAVE_CHAT_HISTORY` → `ENABLE_SAVE_CHAT_HISTORY`
## Required Actions

### 1. Update Model Configuration

**Development Mode**

Edit `public/config/models.json` directly with your model configurations:
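A minimal sketch of what the file can look like, built from the entries shared earlier in this thread; the top-level `models` wrapper shown here is an assumption, so match the structure of the file shipped with the release:

```json
{
  "models": [
    {
      "id": "qwen2.5:latest",
      "name": "Qwen 2.5",
      "provider": "Ollama",
      "providerId": "ollama",
      "enabled": true,
      "toolCallType": "manual",
      "toolCallModel": "phi4"
    }
  ]
}
```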
**Docker Mode**
*Using Prebuilt Image*

When using the prebuilt image (`ghcr.io/miurla/morphic:latest`), create a `models.json` file alongside your `.env.local` and update your `docker-compose.yml` to mount it:
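A minimal sketch of the compose change; the service name and the in-container path `/app/public/config/models.json` are assumptions based on the repo layout, so adjust them to your setup:

```yaml
services:
  morphic:
    image: ghcr.io/miurla/morphic:latest
    env_file: .env.local
    volumes:
      # Mount the local models.json over the copy baked into the image.
      - ./models.json:/app/public/config/models.json
```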
*Building Your Own Image*
When building your own Docker image from source, edit `public/config/models.json` directly with your configurations before building.

### 2. Update Environment Variables

Remove these variables from your `.env.local`:
- `NEXT_PUBLIC_OLLAMA_MODEL`
- `NEXT_PUBLIC_OLLAMA_TOOL_CALL_MODEL`
- `NEXT_PUBLIC_AZURE_DEPLOYMENT_NAME`
- `NEXT_PUBLIC_OPENAI_COMPATIBLE_MODEL`

Rename this variable:

- `NEXT_PUBLIC_ENABLE_SAVE_CHAT_HISTORY` → `ENABLE_SAVE_CHAT_HISTORY`
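For example, if your `.env.local` currently enables chat history, the rename looks like this (the `true` value is illustrative):

```
# Before
NEXT_PUBLIC_ENABLE_SAVE_CHAT_HISTORY=true

# After
ENABLE_SAVE_CHAT_HISTORY=true
```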
## Documentation

For detailed configuration instructions, please refer to the configuration documentation in the repository.
## Migration Guide

1. For development: edit `public/config/models.json` directly with your model settings
2. For Docker: create `models.json` alongside `.env.local` with your model settings
3. Update `.env.local` by removing the deprecated variables

Please report any issues you encounter during the migration.