
Models that are not accessible are listed... #22

@cavanaug

Description

If a model is turned off via https://github.com/settings/copilot/features it won't work, but it is still shown as an available model.

I disabled gemini-2.0-flash in the UI, but it is still listed as available:

```
$ llm models | sort | grep copilot
Default: github_copilot/gemini-2.5-pro
GitHub Copilot Chat: github_copilot/claude-3.5-sonnet
GitHub Copilot Chat: github_copilot/claude-3.7-sonnet
GitHub Copilot Chat: github_copilot/claude-3.7-sonnet-thought
GitHub Copilot Chat: github_copilot/claude-sonnet-4
GitHub Copilot Chat: github_copilot/gemini-2.0-flash-001
GitHub Copilot Chat: github_copilot/gemini-2.5-pro
GitHub Copilot Chat: github_copilot/gemini-2.5-pro-preview-06-05 (aliases: gemini-2.5-pro)
GitHub Copilot Chat: github_copilot/gpt-3.5-turbo
GitHub Copilot Chat: github_copilot/gpt-3.5-turbo-0613
GitHub Copilot Chat: github_copilot/gpt-4
GitHub Copilot Chat: github_copilot/gpt-4-0125-preview
GitHub Copilot Chat: github_copilot/gpt-4-0613
GitHub Copilot Chat: github_copilot/gpt-4-o-preview
GitHub Copilot Chat: github_copilot/gpt-4.1
GitHub Copilot Chat: github_copilot/gpt-4.1-2025-04-14
GitHub Copilot Chat: github_copilot/gpt-4o
GitHub Copilot Chat: github_copilot/gpt-4o-2024-05-13
GitHub Copilot Chat: github_copilot/gpt-4o-2024-08-06
GitHub Copilot Chat: github_copilot/gpt-4o-2024-11-20
GitHub Copilot Chat: github_copilot/gpt-4o-mini
GitHub Copilot Chat: github_copilot/gpt-4o-mini-2024-07-18
GitHub Copilot Chat: github_copilot/o1
GitHub Copilot Chat: github_copilot/o1-2024-12-17
GitHub Copilot Chat: github_copilot/o3-mini
GitHub Copilot Chat: github_copilot/o3-mini-2025-01-31
GitHub Copilot Chat: github_copilot/o3-mini-paygo
GitHub Copilot Chat: github_copilot/o4-mini
GitHub Copilot Chat: github_copilot/o4-mini-2025-04-16
GitHub Copilot Chat: github_copilot/text-embedding-3-small
GitHub Copilot Chat: github_copilot/text-embedding-3-small-inference
GitHub Copilot Chat: github_copilot/text-embedding-ada-002
```

```
$ llm -m github_copilot/gemini-2.0-flash-001 "tell me a joke"
Error: Error code: 400 - {'error': {'message': 'The requested model is not supported.', 'code': 'model_not_supported', 'param': 'model', 'type': 'invalid_request_error'}}
```
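
A possible fix, sketched below: when the plugin fetches the model catalog from the Copilot API, it could filter out entries the account has disabled before registering them with `llm`. This is a minimal sketch, assuming the models endpoint the plugin queries returns a per-model `model_picker_enabled` boolean; the endpoint URL, field name, and response shape here are assumptions, not confirmed plugin internals:

```python
import json
import urllib.request

# Assumed endpoint and response shape -- not confirmed plugin internals.
COPILOT_MODELS_URL = "https://api.githubcopilot.com/models"


def fetch_enabled_model_ids(token: str) -> list[str]:
    """Return only model IDs the account can actually use.

    Assumes each catalog entry may carry a boolean
    `model_picker_enabled` field; entries without the field are
    kept, so an API change degrades to today's behaviour rather
    than hiding working models.
    """
    req = urllib.request.Request(
        COPILOT_MODELS_URL,
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        catalog = json.load(resp)
    return [
        entry["id"]
        for entry in catalog.get("data", [])
        if entry.get("model_picker_enabled", True)
    ]
```

Failing a reliable flag like that, the plugin could at least catch the 400 `model_not_supported` error and print a message pointing the user at https://github.com/settings/copilot/features.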
