Added base support for requests to aiml #1120

Open · wants to merge 4 commits into main
15 changes: 1 addition & 14 deletions README.md
@@ -44,16 +44,6 @@

---

<div align="center">

<a href="vscode:extension/PythagoraTechnologies.gpt-pilot-vs-code" target="_blank"><img src="https://github.com/Pythagora-io/gpt-pilot/assets/10895136/5792143e-77c7-47dd-ad96-6902be1501cd" alt="Pythagora-io%2Fgpt-pilot | Trendshift" style="width: 185px; height: 55px;" width="185" height="55"/></a>

</div>

GPT Pilot is the core technology for the [Pythagora VS Code extension](https://bit.ly/3IeZxp6) that aims to provide **the first real AI developer companion**. Not just an autocomplete or a helper for PR messages but rather a real AI developer that can write full features, debug them, talk to you about issues, ask for review, etc.

---

📫 If you would like to get updates on future releases or just get in touch, join our [Discord server](https://discord.gg/HaqXugmxr9) or you [can add your email here](http://eepurl.com/iD6Mpo). 📬

---
@@ -97,9 +87,6 @@ If you are interested in our learnings during this project, you can check [our l
- **Python 3.9+**

# 🚦How to start using gpt-pilot?
👉 If you are using VS Code as your IDE, the easiest way to start is by downloading [GPT Pilot VS Code extension](https://bit.ly/3IeZxp6). 👈

Otherwise, you can use the CLI tool.

### If you're new to GPT Pilot:

@@ -112,7 +99,7 @@ After you have Python and (optionally) PostgreSQL installed, follow these steps:
5. `pip install -r requirements.txt` (install the dependencies)
6. `cp example-config.json config.json` (create `config.json` file)
7. Set your key and other settings in `config.json` file:
- LLM Provider (`openai`, `anthropic` or `groq`) key and endpoints (leave `null` for default) (note that Azure and OpenRouter are supported via the `openai` setting)
- LLM Provider (`openai`, `aiml`, `anthropic` or `groq`) key and endpoints (leave `null` for default) (note that Azure and OpenRouter are supported via the `openai` setting)
- Your API key (if `null`, will be read from the environment variables)
- database settings: sqlite is used by default, PostgreSQL should also work
- optionally update `fs.ignore_paths` and add files or folders which shouldn't be tracked by GPT Pilot in workspace, useful to ignore folders created by compilers
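As a quick illustration of how the provider value set in `config.json` (step 7 above) is expected to map onto the enum member introduced in this PR: `LLMProvider` is a `str`-backed enum, so the lookup is by value. A minimal sketch, not part of the change itself:

```python
# Illustrative sketch: the "aiml" string from config.json should resolve to
# the new LLMProvider.AIML member, because LLMProvider is a str-backed Enum.
from core.config import LLMProvider

provider = LLMProvider("aiml")   # value lookup on a str-backed enum
assert provider is LLMProvider.AIML
assert provider.value == "aiml"
```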
1 change: 1 addition & 0 deletions core/config/__init__.py
@@ -70,6 +70,7 @@ class LLMProvider(str, Enum):
    GROQ = "groq"
    LM_STUDIO = "lm-studio"
    AZURE = "azure"
    AIML = "aiml"


class UIAdapter(str, Enum):
24 changes: 24 additions & 0 deletions core/llm/aiml_client.py
@@ -0,0 +1,24 @@
from httpx import Timeout
from openai import AsyncOpenAI

from core.config import LLMProvider
from core.llm.openai_client import OpenAIClient
from core.log import get_logger

log = get_logger(__name__)


class AIMLClient(OpenAIClient):
    # The AIML API is OpenAI-compatible, so we reuse the OpenAI client with a custom base URL.
    provider = LLMProvider.AIML
    stream_options = None

    def _init_client(self):
        self.client = AsyncOpenAI(
            api_key=self.config.api_key,
            base_url=self.config.base_url,
            timeout=Timeout(
                max(self.config.connect_timeout, self.config.read_timeout),
                connect=self.config.connect_timeout,
                read=self.config.read_timeout,
            ),
        )
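Since `AIMLClient` only overrides `_init_client` to point the OpenAI SDK at the AIML endpoint, the same wiring can be sanity-checked outside the project with a few lines. The API key and model id below are placeholders, not values from this PR:

```python
# Standalone sketch (not project code): exercises the same AsyncOpenAI wiring
# that AIMLClient._init_client() sets up, against the AIML endpoint.
import asyncio

from httpx import Timeout
from openai import AsyncOpenAI


async def main() -> None:
    client = AsyncOpenAI(
        api_key="your-aiml-api-key",            # placeholder
        base_url="https://api.aimlapi.com/v1",
        timeout=Timeout(60.0, connect=60.0, read=20.0),
    )
    response = await client.chat.completions.create(
        model="your-model-id",                  # placeholder; pick a model from the AIML docs
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)


asyncio.run(main())
```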
3 changes: 3 additions & 0 deletions core/llm/base.py
@@ -330,13 +330,16 @@ def for_provider(provider: LLMProvider) -> type["BaseLLMClient"]:
        :param provider: Provider to return the client for.
        :return: Client class for the specified provider.
        """
        from .aiml_client import AIMLClient
        from .anthropic_client import AnthropicClient
        from .azure_client import AzureClient
        from .groq_client import GroqClient
        from .openai_client import OpenAIClient

        if provider == LLMProvider.OPENAI:
            return OpenAIClient
        elif provider == LLMProvider.AIML:
            return AIMLClient
        elif provider == LLMProvider.ANTHROPIC:
            return AnthropicClient
        elif provider == LLMProvider.GROQ:
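A minimal sketch of the new dispatch branch above, assuming `for_provider` is exposed as a static method on `BaseLLMClient` (as its return annotation suggests):

```python
# Illustrative only: the new elif branch should hand back the AIMLClient class.
from core.config import LLMProvider
from core.llm.aiml_client import AIMLClient
from core.llm.base import BaseLLMClient

assert BaseLLMClient.for_provider(LLMProvider.AIML) is AIMLClient
```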
9 changes: 8 additions & 1 deletion example-config.json
@@ -10,6 +10,13 @@
"connect_timeout": 60.0,
"read_timeout": 20.0
},
"aiml": {
// Example config for AIML (to access 100+ model - see https://docs.aimlapi.com/api-overview/text-models-llm)
"base_url": "https://api.aimlapi.com/v1",
"api_key": "your-aiml-api-key",
"connect_timeout": 60.0,
"read_timeout": 20.0
},
// Example config for Anthropic (see https://docs.anthropic.com/docs/api-reference)
"anthropic": {
"base_url": "https://api.anthropic.com",
@@ -86,4 +93,4 @@
// Files larger than 50KB will be ignored, even if they otherwise wouldn't be.
"ignore_size_threshold": 50000
}
}
}