# OpenAI

OpenAI is the default LLM provider in Sinaptic® DROID+. It works out of the box with any OpenAI model, including GPT-4o, GPT-4o mini, and o-series reasoning models.
## Setup

- Get an API key from [platform.openai.com](https://platform.openai.com).
- Add it to your environment or `.env` file:

  ```shell
  export OPENAI_API_KEY=sk-...
  ```

- Configure the provider in `droid.yaml` (this is the default, so you may not need to change anything):

  ```yaml
  llm:
    provider: "openai"
    base_url: "https://api.openai.com/v1"
    api_key: "${OPENAI_API_KEY}"
    default_model: "gpt-4o-mini"
  ```
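Before starting DROID+, it can be worth a quick sanity check that the key is actually visible to the process. A minimal sketch (the helper name is ours, not part of DROID+; OpenAI keys conventionally start with `sk-`):

```python
import os

def openai_key_present() -> bool:
    """Return True if OPENAI_API_KEY is set and has the expected 'sk-' prefix."""
    key = os.environ.get("OPENAI_API_KEY", "")
    return key.startswith("sk-")
```

If this returns `False`, check that the `export` ran in the same shell that launches DROID+, or that your `.env` file is being loaded.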
## Agent Config

Use any OpenAI model in your agent config:

```yaml
name: "my-agent"
model:
  name: "gpt-4o-mini"   # or gpt-4o, o3-mini, etc.
  max_tokens: 1024
  temperature: 0.7
```
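These fields map directly onto the OpenAI Chat Completions request body. A sketch of that mapping, assuming the agent config has already been parsed into a dict (the function name and fallback defaults here are illustrative, not part of DROID+):

```python
def build_chat_request(agent_config: dict, messages: list) -> dict:
    """Translate an agent's model settings into a chat.completions request body."""
    model = agent_config["model"]
    return {
        "model": model["name"],
        "max_tokens": model.get("max_tokens", 1024),
        "temperature": model.get("temperature", 0.7),
        "messages": messages,
    }

agent = {
    "name": "my-agent",
    "model": {"name": "gpt-4o-mini", "max_tokens": 1024, "temperature": 0.7},
}
body = build_chat_request(agent, [{"role": "user", "content": "Hello"}])
```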
## Available Models

| Model | Best for | Context |
|---|---|---|
| `gpt-4o` | Complex reasoning, multimodal | 128K |
| `gpt-4o-mini` | Fast, cost-effective (recommended default) | 128K |
| `o3-mini` | Reasoning tasks | 200K |
| `gpt-4-turbo` | Legacy, high quality | 128K |
Check OpenAI's model list for the latest available models.
## Overriding Per Agent

Each agent can use a different model by setting `model.name` in its config. The global `default_model` in `droid.yaml` is used when an agent doesn't specify one.
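The fallback rule above can be sketched as a small resolver, assuming both configs have been parsed into dicts (function name ours, not DROID+ API):

```python
def resolve_model(agent_config: dict, droid_config: dict) -> str:
    """Agent-level model.name takes precedence; otherwise fall back to llm.default_model."""
    agent_model = (agent_config.get("model") or {}).get("name")
    return agent_model or droid_config["llm"]["default_model"]
```

An agent that sets `model.name: "gpt-4o"` gets `gpt-4o`; an agent with no `model` block inherits the global `default_model`.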
## Azure OpenAI

Azure OpenAI uses a different base URL format. Configure it as a custom provider:

```yaml
llm:
  provider: "openai"
  base_url: "https://YOUR-RESOURCE.openai.azure.com/openai/deployments/YOUR-DEPLOYMENT/v1"
  api_key: "${AZURE_OPENAI_API_KEY}"
  default_model: "gpt-4o"
```
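The base URL follows a fixed shape: resource name, then deployment name. A small helper that reproduces the URL format shown above can keep the two pieces from getting mixed up (the helper itself is an illustration, not part of DROID+):

```python
def azure_base_url(resource: str, deployment: str) -> str:
    """Build an Azure OpenAI base URL: https://<resource>.openai.azure.com/openai/deployments/<deployment>/v1"""
    return (
        f"https://{resource}.openai.azure.com"
        f"/openai/deployments/{deployment}/v1"
    )
```

Note that `YOUR-RESOURCE` is the name of your Azure OpenAI resource, while `YOUR-DEPLOYMENT` is the deployment name you chose when deploying a model in Azure; the deployment name, not the model name, is what the URL must contain.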
Azure OpenAI is fully supported in the Pro and Enterprise editions. In the Community Edition, you can still reach it through the standard OpenAI provider configuration shown above.