
OpenAI

OpenAI is the default LLM provider in Sinaptic® DROID+. It works out of the box with any OpenAI model including GPT-4o, GPT-4o mini, and o-series reasoning models.

Setup

  1. Get an API key from platform.openai.com
  2. Add it to your environment or `.env` file:

     ```shell
     export OPENAI_API_KEY=sk-...
     ```

  3. Configure in `droid.yaml` (this is the default, so you may not need to change anything):

     ```yaml
     llm:
       provider: "openai"
       base_url: "https://api.openai.com/v1"
       api_key: "${OPENAI_API_KEY}"
       default_model: "gpt-4o-mini"
     ```
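The `${OPENAI_API_KEY}` syntax suggests the config loader expands environment-variable placeholders at read time. As a rough illustration of that behavior (the function name `expand_env` is hypothetical, not part of the DROID+ API), such expansion can be sketched in a few lines of Python:

```python
import os
import re

def expand_env(value: str) -> str:
    """Replace ${VAR} placeholders with values from the environment.

    Unset variables expand to an empty string in this sketch; a real
    loader might raise an error instead.
    """
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)

os.environ["OPENAI_API_KEY"] = "sk-demo"
print(expand_env("${OPENAI_API_KEY}"))  # -> sk-demo
```

Keeping the key in the environment rather than in `droid.yaml` itself means the config file can be committed to version control safely.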

Agent Config

Use any OpenAI model in your agent config:

```yaml
name: "my-agent"
model:
  name: "gpt-4o-mini" # or gpt-4o, o3-mini, etc.
  max_tokens: 1024
  temperature: 0.7
```
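Conceptually, the `model` block maps directly onto the parameters of an OpenAI chat-completions request. The sketch below shows one plausible mapping (the `to_request` helper and its defaults are assumptions for illustration, not the DROID+ internals):

```python
def to_request(model_cfg: dict, messages: list) -> dict:
    """Build an OpenAI chat-completions payload from an agent's model block.

    Defaults here are illustrative; the actual framework may choose others.
    """
    return {
        "model": model_cfg["name"],
        "max_tokens": model_cfg.get("max_tokens", 1024),
        "temperature": model_cfg.get("temperature", 1.0),
        "messages": messages,
    }

agent_model = {"name": "gpt-4o-mini", "max_tokens": 1024, "temperature": 0.7}
payload = to_request(agent_model, [{"role": "user", "content": "Hello"}])
```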

Available Models

| Model | Best for | Context |
|---|---|---|
| gpt-4o | Complex reasoning, multimodal | 128K |
| gpt-4o-mini | Fast, cost-effective (recommended default) | 128K |
| o3-mini | Reasoning tasks | 128K |
| gpt-4-turbo | Legacy, high quality | 128K |

Check OpenAI's model list for the latest available models.

Overriding Per Agent

Each agent can use a different model by setting model.name in its config. The global default_model in droid.yaml is used when an agent doesn't specify one.
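For example, two agents could be configured like this (assuming agent configs follow the same YAML shape shown above; the agent names are illustrative):

```yaml
# Uses the global default_model from droid.yaml (no model.name set)
name: "triage-agent"
model:
  temperature: 0.2
---
# Overrides the default with a stronger model
name: "research-agent"
model:
  name: "gpt-4o"
  max_tokens: 4096
```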

Azure OpenAI

Azure OpenAI uses a different base URL format. Configure it as a custom provider:

```yaml
llm:
  provider: "openai"
  base_url: "https://YOUR-RESOURCE.openai.azure.com/openai/deployments/YOUR-DEPLOYMENT/v1"
  api_key: "${AZURE_OPENAI_API_KEY}"
  default_model: "gpt-4o"
```

Azure OpenAI has first-class support in the Pro and Enterprise editions. In Community Edition, you can still reach it through the standard OpenAI provider configuration shown above.