OpenAI
OpenAI’s chat completions API (api.openai.com) is the de facto industry standard. If you have a credit card and an OpenAI account, this is the simplest vendor to set up. The baseURL override also makes this entry usable for any OpenAI-compatible server — LM Studio, Ollama, vLLM, custom proxies — without a separate vendor.
Get credentials
- Sign in at https://platform.openai.com.
- Open API keys in the left sidebar (or go directly to https://platform.openai.com/api-keys).
- Click Create new secret key, give it a name (e.g. jambonz), and copy the sk-... string.
OpenAI only shows the key once at creation time. Store it somewhere durable — you can’t retrieve it later, only revoke and create a new one.
Configure in jambonz
In the portal: Account → LLM Services → + Add LLM Service → OpenAI.
- API key: the sk-... secret you copied from the OpenAI dashboard.
- Base URL: defaults to https://api.openai.com/v1. Override only if you’re routing through an OpenAI-compatible server (LM Studio, Ollama, vLLM, a corporate proxy, etc.).
Click Test to verify. A green result means the key authenticates against /v1/models.
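You can run the same check from a terminal. This sketch assumes your key is exported as OPENAI_API_KEY; /v1/models is the endpoint the portal’s Test button queries:

```shell
# A 200 response with a JSON "data" array of model ids
# confirms the key authenticates.
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```

When testing a self-hosted server, swap the host for your Base URL override (e.g. http://localhost:11434/v1).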
Use in an agent verb
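A minimal sketch of referencing the credential from an application (property names follow the general shape of the jambonz llm verb; check the current verb reference for the exact schema, and note the hook URLs and model choice below are placeholders):

```json
{
  "verb": "llm",
  "vendor": "openai",
  "model": "gpt-4o",
  "auth": {
    "apiKey": "sk-..."
  },
  "actionHook": "/llm-complete",
  "eventHook": "/llm-event",
  "llmOptions": {
    "maxTokens": 512
  }
}
```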
Available Models
See OpenAI’s model catalog for the full list and current pricing; for voice agents, favor low-latency chat models.
Quirks & errors
gpt-5 / o-series models use max_completion_tokens. OpenAI’s reasoning models (the o1, o3, and o4 families) and gpt-5 require max_completion_tokens in place of the legacy max_tokens parameter. jambonz handles this automatically: a maxTokens value in llmOptions is forwarded under the correct name based on the model id, so no action is needed on your end.
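The mapping can be pictured like this (a simplified sketch, not jambonz’s actual code; token_param is a hypothetical helper, and matching on model-id prefixes is an assumption based on the families listed above):

```python
def token_param(model: str, max_tokens: int) -> dict:
    """Pick the token-limit parameter for a chat completions request.

    Reasoning models (o1/o3/o4 families) and gpt-5 reject the legacy
    max_tokens field and expect max_completion_tokens instead.
    """
    if model.startswith(("o1", "o3", "o4", "gpt-5")):
        return {"max_completion_tokens": max_tokens}
    return {"max_tokens": max_tokens}

print(token_param("gpt-4o", 512))   # {'max_tokens': 512}
print(token_param("o3-mini", 512))  # {'max_completion_tokens': 512}
```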
401 Unauthorized typically means the key was revoked or copied with whitespace. Regenerate at platform.openai.com/api-keys and update the credential in the portal.
429 Too Many Requests means you’ve hit a rate limit or your account’s monthly quota. Add a payment method or upgrade tier at platform.openai.com/account/billing.
Pointing jambonz at a self-hosted OpenAI-compatible server? Set the Base URL field to your endpoint (e.g. http://localhost:11434/v1 for Ollama). The API key field still has to be non-empty even if your server doesn’t enforce auth — pass any string.