A local-first AI agent server powered by GitHub Copilot. No API keys. No cloud setup. Just your GitHub account.
macOS/Linux:

```bash
curl -fsSL https://kody-w.github.io/rapp-installer/install.sh | bash
```

Windows (PowerShell — works on factory Windows 11):

```powershell
irm https://raw.githubusercontent.com/kody-w/rapp-installer/main/install.ps1 | iex
```

Auto-installs Python 3.11, Git, and GitHub CLI via winget if missing.
Then:

```bash
gh auth login   # one-time GitHub auth
brainstem       # start the server → localhost:7071
```

The brainstem is a Flask server that connects to GitHub Copilot's API for LLM inference. You define a soul (system prompt) and drop in agents (Python tools the LLM can call). That's it.
```
~/.brainstem/src/rapp_brainstem/
├── brainstem.py       # the server
├── soul.md            # personality (system prompt)
├── agents/            # auto-discovered tools
│   └── hello_agent.py
├── local_storage.py   # local-first storage shim
└── .env               # config (model, paths, port)
```
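The soul is just a markdown system prompt. For instance, `soul.md` might contain something like this (hypothetical content, not the shipped default):

```
You are Atlas, a concise and friendly assistant.

- Prefer short, direct answers.
- When an agent (tool) can answer better than you can, call it.
- Admit uncertainty rather than guessing.
```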
Any `*_agent.py` file in your agents directory gets auto-discovered and registered as a tool:
```python
from basic_agent import BasicAgent

class WeatherAgent(BasicAgent):
    def __init__(self):
        self.name = "Weather"
        self.metadata = {
            "name": self.name,
            "description": "Gets the weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"}
                },
                "required": ["city"]
            }
        }
        super().__init__()

    def perform(self, city="", **kwargs):
        return f"It's sunny in {city}!"
```

The chat UI has a Sources panel — paste any GitHub repo URL with an agents/ folder and the brainstem hot-loads them. Missing pip dependencies are auto-installed.
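The `*_agent.py` auto-discovery convention can be sketched as a small loader. This is a minimal illustration of the idea, not the brainstem's actual code; `load_agents` and its class-name filter are assumptions:

```python
import importlib.util
from pathlib import Path

def load_agents(agents_dir):
    """Import every *_agent.py file and instantiate each *Agent class.

    Hypothetical sketch of the discovery convention; the real brainstem
    loader may differ (e.g. dependency auto-install, repo hot-loading).
    """
    agents = {}
    for path in Path(agents_dir).glob("*_agent.py"):
        # Import the module directly from its file path
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        # Register any class whose name ends in "Agent" (skip the base class)
        for name in dir(module):
            obj = getattr(module, name)
            if isinstance(obj, type) and name.endswith("Agent") and name != "BasicAgent":
                instance = obj()
                agents[instance.name] = instance
    return agents
```

Each registered agent's `metadata` dict is what gets handed to the model as a function-calling schema, and `perform(**kwargs)` is what runs when the model calls the tool.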
RAPP teaches you the Microsoft AI stack one layer at a time. Start with the brainstem locally, then layer up when you're ready.
The survival basics. The brainstem runs the core agent loop — soul, tool-calling, conversation. Your GitHub Copilot subscription is the AI engine.
What you learn: Python agents, function-calling, prompt engineering, local-first development.
Give your brainstem a cloud body. Deploy to Azure so it's always-on with persistent storage, monitoring, and Azure OpenAI.
```bash
# Deploy via script
curl -fsSL https://raw.githubusercontent.com/kody-w/rapp-installer/main/deploy.sh | bash
```

Creates: Function App (Python 3.11), Azure OpenAI (GPT-4o), Storage Account, Application Insights. All Entra ID auth — no API keys.
What you learn: ARM templates, Azure Functions, managed identity, RBAC, Azure OpenAI.
Connect your agent to Teams and M365 Copilot. Import the included Power Platform solution (MSFTAIBASMultiAgentCopilot_*.zip) into Copilot Studio, point it at your Azure Function, and publish.
The same agent logic you tested locally now answers in Microsoft Teams and M365 Copilot across your organization.
What you learn: Copilot Studio, declarative agents, Power Platform solutions, Teams integration, enterprise AI.
All config via `.env` (see `.env.example`):

| Variable | Default | Description |
|---|---|---|
| `GITHUB_TOKEN` | auto-detected via `gh` | GitHub PAT or Copilot token |
| `GITHUB_MODEL` | `gpt-4o` | Model (GitHub Models) |
| `SOUL_PATH` | `./soul.md` | Path to your soul file |
| `AGENTS_PATH` | `./agents` | Path to your agents directory |
| `PORT` | `7071` | Server port |
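Putting the defaults together, a `.env` might look like this (values are the defaults from the table; adjust to taste):

```shell
GITHUB_MODEL=gpt-4o
SOUL_PATH=./soul.md
AGENTS_PATH=./agents
PORT=7071
# GITHUB_TOKEN is auto-detected via gh; set it only to override
```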
| Endpoint | Method | Description |
|---|---|---|
| `/chat` | POST | `{"user_input": "...", "conversation_history": [], "session_id": "..."}` |
| `/health` | GET | Status, model, loaded agents, token state |
| `/login` | POST | Start GitHub device code OAuth flow |
| `/models` | GET | List available models |
| `/repos` | GET | List connected agent repos |
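A `/chat` call can be made with Python's standard library alone. This sketch assumes the brainstem is running locally on the default port; only the payload shape comes from the endpoint table above:

```python
import json
import urllib.request

# Request body shape for POST /chat
payload = {
    "user_input": "What's the weather in Lisbon?",
    "conversation_history": [],   # prior messages, if any
    "session_id": "demo-session",
}

req = urllib.request.Request(
    "http://localhost:7071/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment once the brainstem is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```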
- Python 3.11+
- Git
- GitHub account with Copilot access
Update:

```bash
cd ~/.brainstem/src && git pull
```

Uninstall:

```bash
rm -rf ~/.brainstem ~/.local/bin/brainstem
```