author    Louis Burda <dev@sinitax.com>  2026-02-02 08:10:56 +0100
committer Louis Burda <dev@sinitax.com>  2026-02-02 08:11:17 +0100
commit    d69c5b355c450e2c79b62b8a1a7946f375ac207d (patch)
tree      a20cc4b977e400b2cd08b25f5ea9581156524356 /README.md
parent    43ddca6e4de9ed2b8615dedd9a31ee42881fdcb5 (diff)
Add agentwrap provider and allow tools for single (HEAD, main)
Diffstat (limited to 'README.md')
-rw-r--r--  README.md | 422
1 file changed, 376 insertions(+), 46 deletions(-)
diff --git a/README.md b/README.md
index 6a374a1..7384718 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,129 @@
# multillm
-A unified async interface for multiple LLM providers. Switch between providers with a single line change.
+A unified async interface for multiple LLM providers with **agentic capabilities**. Switch between providers with a single line change.
+
+## Features
+
+- 🔄 **Unified API** - Same interface for OpenAI, Anthropic, Google, and more
+- 🤖 **Agentic by default** - Automatic tool execution and multi-turn conversations
+- 🛠️ **Interactive tools** - AI can ask you questions during execution
+- 📦 **Provider flexibility** - Switch providers without changing code
+- 🎯 **Simple CLI** - Quick testing and experimentation
+- ⚡ **Async-first** - Built on asyncio for performance
+
+## Quick Start
+
+### CLI
+
+```bash
+# Install
+pip install multillm-cli multillm-openai
+
+# Simple query
+multillm -m openai/gpt-4o -p "What is 2+2?"
+
+# With interactive tools
+multillm -m openai/gpt-4o -p "Ask me about my preferences" --use-tools ask_user
+
+# With other tools
+multillm -m openai/gpt-4o -p "What's the weather in Tokyo?" --use-tools get_weather
+```
+
+See [multillm-cli](packages/multillm-cli) for full CLI documentation.
+
+### Python API
+
+```bash
+pip install multillm multillm-openai multillm-anthropic
+```
+
+**Simple query:**
+```python
+import asyncio
+import multillm
+
+async def main():
+    client = multillm.Client()
+
+    # Agentic API - works with any provider
+    async for msg in client.run("agentwrap/openai/gpt-4o", "Hello!"):
+        if msg.type == "text":
+            print(msg.content)
+
+asyncio.run(main())
+```
+
+**With tools:**
+```python
+import asyncio
+import multillm
+
+# Define a tool
+calculate = multillm.Tool(
+    name="calculate",
+    description="Perform a calculation",
+    parameters={
+        "type": "object",
+        "properties": {
+            "expression": {"type": "string"}
+        },
+        "required": ["expression"]
+    },
+    # Demo only: eval() executes arbitrary code; use a real expression
+    # parser for untrusted input.
+    handler=lambda args: {"result": eval(args["expression"])}
+)
+
+async def main():
+    client = multillm.Client()
+
+    # AI can use tools automatically
+    async for msg in client.run(
+        "agentwrap/openai/gpt-4o",
+        "What's 25 * 4?",
+        tools=[calculate]
+    ):
+        if msg.type == "text":
+            print(msg.content)
+        elif msg.type == "tool_use":
+            print(f"Using tool: {msg.tool_name}")
+
+asyncio.run(main())
+```
+
+**Interactive tools:**
+```python
+import asyncio
+import multillm
+
+# Define interactive tool
+ask_user = multillm.Tool(
+    name="ask_user",
+    description="Ask the user a question",
+    parameters={
+        "type": "object",
+        "properties": {
+            "question": {"type": "string"}
+        },
+        "required": ["question"]
+    },
+    handler=lambda args: {
+        "answer": input(f"\n{args['question']}\nYour answer: ")
+    }
+)
+
+async def main():
+    client = multillm.Client()
+
+    # AI can ask you questions!
+    async for msg in client.run(
+        "agentwrap/openai/gpt-4o",
+        "Help me plan a project by asking about my requirements",
+        tools=[ask_user]
+    ):
+        if msg.type == "text":
+            print(msg.content)
+
+asyncio.run(main())
+```
## Packages
@@ -8,89 +131,296 @@ A unified async interface for multiple LLM providers. Switch between providers w
|---------|-------------|
| [multillm](packages/multillm) | Core library with unified client |
| [multillm-cli](packages/multillm-cli) | Command-line interface |
-| [multillm-anthropic](packages/multillm-anthropic) | Anthropic Claude API provider |
-| [multillm-openai](packages/multillm-openai) | OpenAI API provider |
-| [multillm-gemini](packages/multillm-gemini) | Google Gemini API provider |
-| [multillm-openrouter](packages/multillm-openrouter) | OpenRouter API provider |
-| [multillm-claude](packages/multillm-claude) | Claude Agent SDK provider |
+| **Chat Providers** | |
+| [multillm-openai](packages/multillm-openai) | OpenAI GPT models |
+| [multillm-anthropic](packages/multillm-anthropic) | Anthropic Claude chat API |
+| [multillm-gemini](packages/multillm-gemini) | Google Gemini |
+| [multillm-openrouter](packages/multillm-openrouter) | OpenRouter (access to 100+ models) |
+| **Agent Providers** | |
+| [multillm-agentwrap](packages/multillm-agentwrap) | Wrap chat providers with agentic capabilities |
+| [multillm-claude](packages/multillm-claude) | Claude native agent with built-in tools |
-## Quick Start
+## How It Works
-### CLI
+### The Agentic API
+
+All providers use the same **agentic API** powered by `run()`:
+
+```python
+async for msg in client.run(model, prompt, tools=tools):
+    ...  # process each message
+```
+
+**Message types:**
+- `system` - Session started
+- `text` - Text response from AI
+- `tool_use` - AI is calling a tool
+- `tool_result` - Tool execution result
+- `result` - Final result
+
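A minimal dispatch over these message types can be sketched as follows. This is a self-contained illustration: `Msg` is a hypothetical stand-in for the objects `run()` yields, using only the `type`, `content`, and `tool_name` fields shown in this README.

```python
# Sketch: handling each message type from the agentic stream.
# Msg is a stand-in for the real message objects (assumption).
from dataclasses import dataclass

@dataclass
class Msg:
    type: str
    content: str = ""
    tool_name: str = ""

def render(msg: Msg) -> str:
    """Format one streamed message for display."""
    if msg.type == "text":
        return msg.content
    if msg.type == "tool_use":
        return f"[tool: {msg.tool_name}]"
    if msg.type == "tool_result":
        return f"[result: {msg.content}]"
    return f"[{msg.type}]"  # system / result markers

stream = [Msg("system"), Msg("text", "Hi"),
          Msg("tool_use", tool_name="calc"), Msg("tool_result", "4")]
print([render(m) for m in stream])
```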
+### Provider Format
+
+**Chat providers with agentwrap:**
+```python
+"agentwrap/openai/gpt-4o"
+"agentwrap/google/gemini-pro"
+"agentwrap/anthropic/claude-3-5-sonnet-20241022"
+```
+
+**Native agent providers:**
+```python
+"claude/default"
+"claude/claude-sonnet-4-20250514"
+```
+
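The strings above suggest a `wrapper/provider/model` decomposition. The parser below is purely illustrative (an assumption about the format; the library's actual parsing may differ):

```python
# Hypothetical sketch of decomposing a model spec string.
def parse_model(spec: str):
    parts = spec.split("/", 2)
    if parts[0] == "agentwrap":
        wrapper, provider, model = parts       # agentwrap/<provider>/<model>
    else:
        wrapper, (provider, model) = None, parts  # <provider>/<model>
    return wrapper, provider, model

print(parse_model("agentwrap/openai/gpt-4o"))
print(parse_model("claude/default"))
```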
+### What is agentwrap?
+
+`agentwrap` wraps standard chat providers (OpenAI, Google, etc.) with agentic capabilities:
+- ✅ Automatic tool execution
+- ✅ Multi-turn conversations
+- ✅ Tool calling loop
+- ✅ Conversation history management
+
+This means **any chat model** can work like an agent!
+
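The loop agentwrap runs can be sketched like this. Everything here is a simplified stand-in (a fake model function instead of a real API call), showing only the shape of the tool-calling loop the bullets describe:

```python
# Simplified sketch of an agentic tool-calling loop (assumption: the real
# agentwrap provider does this against an actual chat model).
def agent_loop(model_step, tools, prompt, max_turns=10):
    """model_step(history) -> ("text", answer) or ("tool", name, args)."""
    handlers = {t["name"]: t["handler"] for t in tools}
    history = [("user", prompt)]
    for _ in range(max_turns):
        step = model_step(history)
        if step[0] == "text":                 # final answer: stop looping
            return step[1]
        _, name, args = step                  # tool call: execute, feed back
        history.append(("tool_result", handlers[name](args)))
    raise RuntimeError("max_turns exceeded")

# Fake model: requests the tool once, then answers with its result.
def fake_model(history):
    if history[-1][0] == "tool_result":
        return ("text", f"Result is {history[-1][1]['result']}")
    return ("tool", "calc", {"expression": "25 * 4"})

calc = {"name": "calc", "handler": lambda a: {"result": 100}}
print(agent_loop(fake_model, [calc], "What's 25 * 4?"))
```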
+## Interactive Tools
+
+AI models can ask you questions during execution:
+
+**CLI:**
```bash
-pip install multillm-cli multillm-openai multillm-anthropic
-multillm -m openai/gpt-4o -p "What is 2+2?"
+# Chat providers
+multillm -m openai/gpt-4o -p "Ask me about my project" --use-tools ask_user
+
+# Claude agent
+multillm -m claude/default -p "Ask me about my project" \
+    --allowed-tools AskUserQuestion --permission-mode acceptEdits
```
-See [multillm-cli](packages/multillm-cli) for more details.
+**Python:**
+```python
+ask_user_tool = multillm.Tool(
+    name="ask_user",
+    description="Ask the user a question",
+    parameters={"type": "object", "properties": {"question": {"type": "string"}}},
+    handler=lambda args: {"answer": input(f"{args['question']}\nYour answer: ")}
+)
-### Python API
+async for msg in client.run("agentwrap/openai/gpt-4o", "Ask me questions", tools=[ask_user_tool]):
+    if msg.type == "text":
+        print(msg.content)
+```
+
+When the AI calls the tool, you'll see:
+```
+======================================================================
+❓ QUESTION FROM ASSISTANT
+======================================================================
+
+What is your favorite programming language?
+
+Your answer: _
+```
+
+See [CLI documentation](packages/multillm-cli) for more interactive tool examples.
+
+## Configuration
+
+### Environment Variables
```bash
-pip install multillm multillm-openai multillm-anthropic # Install core + providers
+export OPENAI_API_KEY=sk-...
+export ANTHROPIC_API_KEY=sk-ant-...
+export GOOGLE_API_KEY=...
```
+### Programmatic
+
+```python
+client = multillm.Client(config={
+    "openai": {"api_key": "sk-..."},
+    "anthropic": {"api_key": "sk-ant-..."},
+})
+```
+
+### Config Files
+
+Create `~/.config/multillm/providers/<provider>.json`:
+
+```json
+{
+    "api_key": "sk-..."
+}
+```
+
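How these three sources combine can be sketched as a resolution function. The precedence order here (explicit `Client` config, then environment variable, then config file) is an assumption for illustration:

```python
# Sketch of provider config resolution (assumed precedence:
# explicit config > environment variable > config file).
import json
import os
from pathlib import Path

def resolve_api_key(provider, client_config=None, env=os.environ):
    # 1. Config passed directly to the client wins.
    if client_config and client_config.get(provider, {}).get("api_key"):
        return client_config[provider]["api_key"]
    # 2. Fall back to the provider's environment variable.
    env_var = {"openai": "OPENAI_API_KEY",
               "anthropic": "ANTHROPIC_API_KEY",
               "gemini": "GOOGLE_API_KEY"}.get(provider)
    if env_var and env.get(env_var):
        return env[env_var]
    # 3. Finally, look for a per-provider config file.
    path = Path.home() / ".config" / "multillm" / "providers" / f"{provider}.json"
    if path.exists():
        return json.loads(path.read_text()).get("api_key")
    return None

# Explicit config beats the environment:
print(resolve_api_key("openai",
                      client_config={"openai": {"api_key": "sk-explicit"}},
                      env={"OPENAI_API_KEY": "sk-env"}))
```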
+See provider-specific documentation for all options:
+- [OpenAI configuration](packages/multillm-openai)
+- [Anthropic configuration](packages/multillm-anthropic)
+- [Google Gemini configuration](packages/multillm-gemini)
+- [Claude Agent configuration](packages/multillm-claude)
+
+## Examples
+
+### Chat with Different Providers
+
+```python
+import asyncio
+import multillm
+
+async def chat(model: str, prompt: str):
+    client = multillm.Client()
+    async for msg in client.run(model, prompt):
+        if msg.type == "text":
+            print(msg.content)
+
+# All use the same API!
+asyncio.run(chat("agentwrap/openai/gpt-4o", "Hello"))
+asyncio.run(chat("agentwrap/google/gemini-pro", "Hello"))
+asyncio.run(chat("agentwrap/anthropic/claude-3-5-sonnet-20241022", "Hello"))
+asyncio.run(chat("claude/default", "Hello"))
+```
+
+### Custom Tools
+
```python
import asyncio
import multillm
+from datetime import datetime
+
+# Define custom tools
+get_time = multillm.Tool(
+    name="get_current_time",
+    description="Get the current time",
+    parameters={"type": "object", "properties": {}},
+    handler=lambda args: {"time": datetime.now().isoformat()}
+)
+
+weather = multillm.Tool(
+    name="get_weather",
+    description="Get weather for a location",
+    parameters={
+        "type": "object",
+        "properties": {"location": {"type": "string"}},
+        "required": ["location"]
+    },
+    handler=lambda args: {"temp": 72, "condition": "sunny"}  # static demo data
+)
+
async def main():
client = multillm.Client()
-    # Chat completion
-    answer = await client.single("openai/gpt-4o", "What is 2+2?")
-    print(answer)
+    async for msg in client.run(
+        "agentwrap/openai/gpt-4o",
+        "What time is it and what's the weather in Tokyo?",
+        tools=[get_time, weather]
+    ):
+        if msg.type == "text":
+            print(msg.content)
-    # Switch provider with one line change
-    answer = await client.single("anthropic/claude-sonnet-4-20250514", "What is 2+2?")
-    print(answer)
+asyncio.run(main())
+```
-    # Agent with tools
-    answer = await client.single(
-        "claude/default",
-        "What files are in the current directory?",
-        allowed_tools=["Bash"],
-        permission_mode="acceptEdits",
+### Agent Options
+
+```python
+import asyncio
+import multillm
+
+async def main():
+    client = multillm.Client()
+
+    options = multillm.AgentOptions(
+        max_turns=10,  # Max tool execution iterations
+        extra={
+            "temperature": 0.7,
+            "max_tokens": 2000
+        }
    )
-    print(answer)
+
+    async for msg in client.run(
+        "agentwrap/openai/gpt-4o",
+        "Complex task requiring multiple steps",
+        options=options
+    ):
+        if msg.type == "text":
+            print(msg.content)
asyncio.run(main())
```
-## Provider Types
+## Claude Native Agent
-- **Chat providers**: `anthropic`, `openai`, `gemini`, `openrouter` - Standard request/response
-- **Agent providers**: `claude` - Autonomous agents with tool use
+Claude has a native agent provider with built-in tools:
-## Configuration
+```python
+import asyncio
+import multillm
+
+async def main():
+    client = multillm.Client()
+
+    async for msg in client.run(
+        "claude/default",
+        "List Python files in current directory",
+        options=multillm.AgentOptions(
+            allowed_tools=["Bash", "Glob"],
+            permission_mode="acceptEdits",
+            max_turns=5
+        )
+    ):
+        if msg.type == "text":
+            print(msg.content)
-Providers can be configured via:
-1. Direct config passed to `Client(config={...})`
-2. Environment variables (`OPENAI_API_KEY`, etc.)
-3. Config files at `~/.config/multillm/providers/<provider>.json`
+asyncio.run(main())
+```
-See individual provider packages for configuration details:
-- [multillm-openai](packages/multillm-openai)
-- [multillm-anthropic](packages/multillm-anthropic)
-- [multillm-gemini](packages/multillm-gemini)
-- [multillm-openrouter](packages/multillm-openrouter)
-- [multillm-claude](packages/multillm-claude)
+**Built-in tools:** Bash, Read, Write, Edit, Glob, Grep, Task, WebFetch, WebSearch, and more.
+
+See [Claude Agent documentation](packages/multillm-claude) for details.
## Development
-This is a uv workspace. To set up:
+This is a uv workspace:
```bash
+# Install
uv sync
-uv run python examples/chat-api.py
-uv run python examples/chat-agent.py
+
+# Run examples
+uv run python examples/test-agentwrap.py
+uv run python examples/test-interactive-tools.py
+
+# Run CLI
+uv run multillm -m openai/gpt-4o -p "Hello"
```
-## Examples
+## Documentation
+
+- [Getting Started Guide](INTERFACE_CONCEPTS.md) - Understand the API design
+- [CLI Documentation](packages/multillm-cli/README.md) - Command-line usage
+- [Agentwrap Provider](packages/multillm-agentwrap/README.md) - Wrapping chat models
+- [Claude Agent Provider](packages/multillm-claude/README.md) - Native agent capabilities
+- [Interactive Tools Guide](INTERACTIVE_TOOLS_IMPLEMENTATION.md) - Building interactive agents
+- [Migration Guide](MIGRATION_SINGLE_TO_AGENTWRAP.md) - Updating from older versions
+
+## Migration from single()
+
+If you're using the deprecated `single()` API:
+
+**Old:**
+```python
+result = await client.single("openai/gpt-4o", "Hello")
+print(result.text)
+```
+
+**New:**
+```python
+async for msg in client.run("agentwrap/openai/gpt-4o", "Hello"):
+    if msg.type == "text":
+        print(msg.content)
+```
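During migration, a small compatibility shim can reproduce the old single-string return. This is a sketch, not part of the library; it assumes only that streamed messages expose `.type` and `.content` as in the examples above:

```python
# Hypothetical migration shim: collect streamed text into one string,
# mimicking the old single() return value.
import asyncio
from types import SimpleNamespace

async def single_compat(stream):
    """Join the content of all "text" messages from an async stream."""
    parts = []
    async for msg in stream:
        if msg.type == "text":
            parts.append(msg.content)
    return "".join(parts)

# Fake stream standing in for client.run(...):
async def fake_run():
    for m in [SimpleNamespace(type="system", content=""),
              SimpleNamespace(type="text", content="Hel"),
              SimpleNamespace(type="text", content="lo")]:
        yield m

print(asyncio.run(single_compat(fake_run())))
```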
-- [examples/chat-api.py](examples/chat-api.py) - Interactive chat with chat providers
-- [examples/chat-agent.py](examples/chat-agent.py) - Interactive chat with Claude agent
+See [Migration Guide](MIGRATION_SINGLE_TO_AGENTWRAP.md) for details.
## License