Local coding agent for skills, commands, agents, and multi-domain workflows.

- Multi-provider support: target OpenAI, xAI, Google Gemini, Anthropic, or an OpenAI-compatible gateway from the same CLI.
- Install reusable project-local skills and slash-command presets from local sources or GitHub catalogs.
- Run multiple model-requested tool calls concurrently with a configurable worker limit.
- Discover project-local sub-agent definitions from `.agents/agents/*.md` and route delegated work through `sub_agent`.
- Use the default browser UI for interactive work, or switch to single-prompt runs for headless execution.
- Discover project-local MCP servers from `.agents/mcp.json` and expose their tools to the model.
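The concurrent tool-call dispatch described above can be sketched with a thread pool whose size caps the number of in-flight workers. This is a minimal illustration of the pattern, not pbi-agent's actual implementation; the tool names, arguments, and worker limit here are invented for the example:

```python
from concurrent.futures import ThreadPoolExecutor

def run_tool_calls(calls, worker_limit=4):
    """Run model-requested tool calls concurrently, capped at worker_limit."""
    def dispatch(call):
        name, args = call
        # A real agent would look up and invoke the named tool here.
        return (name, sum(args) if name == "add" else max(args))

    with ThreadPoolExecutor(max_workers=worker_limit) as pool:
        # map() preserves the submission order of the calls in its results.
        return list(pool.map(dispatch, calls))

results = run_tool_calls([("add", [1, 2]), ("max", [5, 3]), ("add", [10, 20])])
```

`max_workers` plays the role of the configurable worker limit: raising it increases parallelism, lowering it throttles how many tool calls execute at once.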
TIP: Running `pbi-agent` without a command defaults to `pbi-agent web`, so a bare invocation launches the browser UI.
Install uv, then install pbi-agent as a uv tool:

```sh
# macOS / Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows (PowerShell)
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

# Install the CLI
uv tool install pbi-agent
```

WARNING: If this is your first `uv tool install`, reload your shell before running `pbi-agent`, or the command may not be on your PATH yet.
```sh
pbi-agent --api-key "$OPENAI_API_KEY"
```

| Section | What you will find |
|---|---|
| Guide | Installation, provider setup, and architecture overview |
| CLI Reference | Commands, flags, and runtime defaults |
| Tools | The built-in function tools available to the agent |
| Environment Variables | PBI_AGENT_* settings and provider-specific key fallbacks |
| Customization | INSTRUCTIONS.md, AGENTS.md, project skills, sub-agents, and MCP discovery |
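As a sketch of what MCP discovery might consume, a `.agents/mcp.json` could follow the config shape common to MCP clients. The server name, command, and arguments below are illustrative assumptions, not pbi-agent's documented schema:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
```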
pbi-agent is built for local, file-based coding workflows that need more than a bare chat loop. It keeps the runtime small, talks to providers through synchronous HTTP REST requests implemented with Python's standard library, and lets each workspace define its own skills, commands, agents, and MCP integrations.
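The stdlib-only HTTP approach can be illustrated with `urllib.request` against an OpenAI-compatible chat completions endpoint. This is a minimal sketch of the pattern, not pbi-agent's actual client; the base URL, model name, and payload shape follow the OpenAI-compatible convention rather than anything pbi-agent documents:

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, messages):
    """Build a POST request for an OpenAI-compatible /chat/completions endpoint."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_chat_request(
    "https://api.openai.com/v1",
    "sk-example",
    "example-model",
    [{"role": "user", "content": "hello"}],
)
# urllib.request.urlopen(req) would then send the request synchronously,
# blocking until the provider responds -- no async runtime required.
```

Because `urlopen` blocks, each provider round trip completes before the agent loop continues, which keeps the runtime small at the cost of request-level parallelism.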