
Allow custom Langchain LLM object to be passed in create_coding_agent #1

@nb-programmer

Description

When solving a task or creating the agent directly, allow an `llm` object to be passed as an argument instead of always instantiating OpenRouter internally. If none is given, fall back to OpenRouter as the default.

```python
def create_coding_agent(
    working_directory: str,
    system_prompt: Optional[str] = None,
    system_prompt_path: Optional[str] = None,
    model: Optional[str] = None,
    project_prompt: Optional[str] = None,
    with_packages: Optional[List[str]] = None,
    task_content: Optional[str] = None,
    task_basename: Optional[str] = None,
    api_key: Optional[str] = None,
    todo: bool = False,
    verbose: bool = False,
):
```

This way, we can pass an LLM object ourselves. Example:

```python
import agentic_python_coder as coder
from langchain.chat_models import init_chat_model

llm = init_chat_model("openai:gpt-4o-mini")

agent = coder.create_coding_agent(
    working_directory=working_dir,
    model=llm,  # custom LLM object from LangChain
    task_content="...",
    task_basename=None,
    todo=True,
    verbose=True,
)
```
