When solving a task or creating the agent directly, allow an LLM to be passed as an argument instead of relying on the hard-coded OpenRouter instantiation. If no LLM is passed, fall back to OpenRouter as the default.
agentic-python-coder/coder/src/agentic_python_coder/agent.py
Lines 32 to 44 in ba9c153
def create_coding_agent(
    working_directory: str,
    system_prompt: Optional[str] = None,
    system_prompt_path: Optional[str] = None,
    model: str = None,
    project_prompt: Optional[str] = None,
    with_packages: Optional[List[str]] = None,
    task_content: Optional[str] = None,
    task_basename: Optional[str] = None,
    api_key: Optional[str] = None,
    todo: bool = False,
    verbose: bool = False,
):
This way, we can pass an LLM model ourselves. Example:
import agentic_python_coder as coder
from langchain.chat_models import init_chat_model
llm = init_chat_model("openai:gpt-4o-mini")
agent = coder.create_coding_agent(
working_directory=working_dir,
model=llm, # Custom llm model from langchain
task_content="...",
task_basename=None,
todo=True,
verbose=True,
)
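One way the requested change could look is a small resolver inside `create_coding_agent` that accepts either a model-name string or an already-built chat model object, and only constructs the OpenRouter model when no instance is supplied. This is a minimal sketch of that idea, not the repo's actual code: `FakeChatModel`, `init_openrouter_model`, and `resolve_model` are hypothetical names standing in for the LangChain model class and the existing OpenRouter instantiation.

```python
from typing import Optional, Union


class FakeChatModel:
    """Stand-in for a LangChain chat model (e.g. BaseChatModel) in this sketch."""

    def __init__(self, name: str):
        self.name = name


def init_openrouter_model(name: Optional[str], api_key: Optional[str] = None) -> FakeChatModel:
    # Placeholder for the existing OpenRouter instantiation path.
    return FakeChatModel(name or "default")


def resolve_model(
    model: Union[str, FakeChatModel, None] = None,
    api_key: Optional[str] = None,
) -> FakeChatModel:
    """If `model` is already a chat-model object, use it as-is;
    otherwise fall back to OpenRouter (the current default behavior)."""
    if model is not None and not isinstance(model, str):
        return model  # caller supplied an LLM instance directly
    return init_openrouter_model(model, api_key)


# A caller can now pass either form:
custom = FakeChatModel("openai:gpt-4o-mini")
print(resolve_model(custom).name)          # uses the supplied instance
print(resolve_model("some-model").name)    # name string, built via OpenRouter path
print(resolve_model().name)                # nothing passed: OpenRouter default
```

Keeping the `model` parameter but widening its accepted types would preserve backward compatibility for existing callers that pass a string.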