# Anthropic (Claude)
Sinaptic® DROID+ supports Anthropic's Claude models natively, including Claude Opus, Sonnet, and Haiku.
## Setup
- Get an API key from console.anthropic.com
- Add it to your environment or `.env` file:

  ```shell
  export ANTHROPIC_API_KEY=sk-ant-...
  ```

- Configure in `droid.yaml`:

  ```yaml
  anthropic:
    base_url: "https://api.anthropic.com"
    api_key: "${ANTHROPIC_API_KEY}"
  ```
## Agent Config
To use a Claude model, set the provider to `anthropic` in your agent config:

```yaml
name: "claude-agent"
model:
  provider: "anthropic"
  name: "claude-sonnet-4-20250514"
  max_tokens: 4096
  temperature: 0.7
```
## Available Models
| Model | Best for | Context |
|---|---|---|
| `claude-opus-4-20250514` | Most capable, complex tasks | 200K |
| `claude-sonnet-4-20250514` | Balanced performance/cost | 200K |
| `claude-haiku-3-5-20241022` | Fast, cost-effective | 200K |
Check Anthropic's docs for the latest models.
## How It Works
Sinaptic® DROID+ translates the OpenAI-compatible request format to Anthropic's Messages API internally. Your clients still use the standard OpenAI SDK — Sinaptic® DROID+ handles the conversion:
```python
# Client code is the same regardless of backend model
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="any")

response = client.chat.completions.create(
    model="claude-agent",  # Agent name, not the model name
    messages=[{"role": "user", "content": "Hello!"}]
)
```
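The main difference the translation has to bridge: OpenAI's chat format carries the system prompt inside the message list, while Anthropic's Messages API takes it as a separate top-level `system` field. A simplified sketch of that conversion (illustrative only, not DROID+ source; `to_anthropic_payload` is a hypothetical name):

```python
def to_anthropic_payload(openai_req: dict) -> dict:
    """Convert an OpenAI-style chat request into an Anthropic Messages
    API payload: lift system messages into the top-level `system` field
    and keep only user/assistant turns in `messages`."""
    system_parts = [m["content"] for m in openai_req["messages"]
                    if m["role"] == "system"]
    payload = {
        "model": openai_req["model"],
        # Anthropic's Messages API requires max_tokens; 4096 here is an
        # arbitrary fallback for illustration.
        "max_tokens": openai_req.get("max_tokens", 4096),
        "messages": [m for m in openai_req["messages"]
                     if m["role"] in ("user", "assistant")],
    }
    if system_parts:
        payload["system"] = "\n".join(system_parts)
    return payload
```

The real gateway also has to translate tool definitions, streaming events, and the response shape back into OpenAI's format, but the message restructuring above is the core of the request side.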
## Notes
- Anthropic models support tool use (function calling) natively. All Sinaptic® DROID+ tools (built-in, REST API, MCP) work with Claude models.
- Claude models have different token pricing than OpenAI. Check Anthropic's pricing for details.
- The `temperature` parameter works the same way, though Anthropic models may behave slightly differently at the same temperature value.