This repository was archived by the owner on Sep 18, 2025. It is now read-only.

Conversation

@jdlms (Contributor) commented on May 4, 2025

What kind of change does this PR introduce?

[x] Bugfix
[ ] Feature
[ ] Code update (local variables)
[ ] Refactoring (no functional changes, no api changes)
[ ] Build related changes
[ ] CI related changes
[ ] Documentation content changes
[ ] Other... Please describe:

What is the current behavior?

If the user has no config file and has only set their API key as an environment variable, an Anthropic model is always set as the default. Printing the viper config after setting only OPENAI_API_KEY shows this:

 {
  "$schema": "./opencode-schema.json",
  "agents": {
    "coder": {
      "model": "claude-3.7-sonnet"
    },
    "task": {
      "model": "claude-3.7-sonnet"
    },
    "title": {
      "model": "claude-3.7-sonnet"
    }
  },

With an OpenAI API key, this results in the following error, because max_tokens defaults to Claude's value (see #140):

Message:
  POST "https://api.openai.com/v1/chat/completions": 400 Bad Request {
      "message": "max_tokens is too large: 50000. This model supports at most 32768 completion tokens, whereas you provided 50000.",
      "type": "invalid_request_error",
      "param": "max_tokens",
      "code": "invalid_value"
    }

This happens because config.go uses viper.Get(...), which returns an interface{}; for an unset key that value is nil, so comparing it to "" is not reliable.
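
For illustration, a minimal sketch of the failure mode (the key name and structure here are assumptions, not the actual config.go code): with no ANTHROPIC_API_KEY set, viper.Get returns a nil interface{}, and nil != "" evaluates to true.

```go
package main

import (
	"fmt"

	"github.com/spf13/viper"
)

func main() {
	// Hypothetical key name for illustration; ANTHROPIC_API_KEY is not set.
	_ = viper.BindEnv("providers.anthropic.apiKey", "ANTHROPIC_API_KEY")

	// viper.Get returns an interface{}; for an unset key the value is nil.
	if viper.Get("providers.anthropic.apiKey") != "" {
		// nil != "" is true, so this branch runs and Anthropic looks
		// configured even though no key was ever provided.
		fmt.Println("treated as configured -> Claude picked as default")
	}
}
```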

What is the new behavior?

Using viper.GetString, models are now correctly set based on the API keys present in environment variables, and max_tokens is no longer set to 50000 by default.
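
A sketch of the corrected pattern under the same assumptions (the key names, helper, and model defaults are illustrative, not the real config.go symbols): GetString coerces a missing value to "", so the chosen default tracks whichever API key environment variable is actually present.

```go
package main

import (
	"fmt"
	"os"

	"github.com/spf13/viper"
)

// defaultModel is an illustrative helper, not the real config.go logic.
func defaultModel() string {
	_ = viper.BindEnv("providers.anthropic.apiKey", "ANTHROPIC_API_KEY")
	_ = viper.BindEnv("providers.openai.apiKey", "OPENAI_API_KEY")

	// GetString returns "" for unset keys, so these checks are reliable.
	if viper.GetString("providers.anthropic.apiKey") != "" {
		return "claude-3.7-sonnet"
	}
	if viper.GetString("providers.openai.apiKey") != "" {
		return "gpt-4o" // hypothetical OpenAI default for illustration
	}
	return ""
}

func main() {
	os.Setenv("OPENAI_API_KEY", "sk-test") // only the OpenAI key is present
	fmt.Println(defaultModel())            // prints the OpenAI default, not a Claude model
}
```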

@jdlms changed the title from "Config fix correcting loose viper string check, default model now set…" to "fix: correct loose config string check" on May 4, 2025
@rekram1-node (Contributor)

Thank you @jdlms!

@kujtimiihoxha (Collaborator) left a comment


Thanks

@kujtimiihoxha merged commit 88711db into opencode-ai:main on May 5, 2025
@dvic mentioned this pull request on May 6, 2025